Monday, April 1, 2013

HB4369 - Expansion of EAA Schools, Testimonial Against

TESTIMONY TO THE HOUSE EDUCATION COMMITTEE ON HB 4369

By Dr. Thomas C. Pedroni on March 13, 2013

Good Afternoon House Education Committee:


My name is Tom Pedroni, and I am an Associate Professor of Curriculum Studies at Wayne State University.  In her testimony last week, Mary Esselman of the EAA reported what appeared to be substantial student growth. Specifically, Esselman shared data with this committee indicating that a substantial number of EAA students appeared to be on target for one or more years of growth over the course of the EAA's first year.  Both Esselman and Chancellor Covington pointed to this growth as evidence that the EAA model is a good one, and that this committee would therefore be acting in Michigan students' best interests by expanding the EAA model to more schools in more regions of the state.

The purpose of my testimony today is to show, using Esselman's own numbers, that student growth in the EAA, as represented by the first two administrations of Scantron's Performance Series tests, is not nearly as remarkable as Esselman claimed it to be.  In fact, as I will show, student growth by Esselman's own numbers is quite the opposite of remarkable. Furthermore, using teacher testimony, I will show that the chaotic conditions under which the baseline tests were administered in some schools last fall likely depressed student scores below students' actual abilities. If the baseline scores are in fact partly lower because of testing conditions, the apparent growth from the baseline to the most recent administration is artificially inflated.

First, the numbers that Esselman shared in her testimony before this committee last week.

The EAA school year started September 4. Assuming the most recent tests were given around February 1 (the EAA says they were given in January and February), they measure growth from September 4 to roughly February 1, a span of roughly five months. For that span, Esselman presented the proportion of students on track to make one year or more of growth during the school year. The EAA administers four such tests across the school year, the first being the baseline. If the three tests after the baseline are spaced evenly through the year, then the most recent test, administered around February 1, measures students against what they are expected to learn in the first third of the school year.  In most schools, the school year is nine months. Thus, EAA students on or around February 1 were being compared to what most students in most schools should have learned in the first three months of the school year. In other words, Esselman's charts show how students fared after five months of instruction against a benchmark of only three months of expected learning. And for most schools, five months is not a third of the school year; it is more than half of it.
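As a quick check of the timeline arithmetic above, here is a short calculation using the dates stated in the testimony (the February 1 test date is the same approximation the testimony itself makes, and a 30.4-day average month is an assumption of mine):

```python
from datetime import date

# Dates as stated in the testimony; Feb 1 is an approximation.
school_start = date(2012, 9, 4)
test_date = date(2013, 2, 1)

days_elapsed = (test_date - school_start).days          # 150 days
elapsed_months = days_elapsed / 30.4                    # approx. months of instruction

typical_year_months = 9.0
# If four tests are spaced evenly, the second marks one third of the year:
benchmark_months = typical_year_months / 3              # 3 months of expected learning

print(round(elapsed_months, 1))                         # about 4.9 months
print(elapsed_months / typical_year_months > 0.5)       # True: more than half the year
```

The point of the calculation is simply that roughly 4.9 months of instruction is well over half of a nine-month year, not the one third that an evenly spaced testing schedule would imply.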

So, taking more than half the time of a typical school year (and this sets aside the fact that the EAA school day is also longer than the typical school day), how did EAA schools compare against what students should be learning within three months? To find out, all we need to do is flip the charts that Esselman presented during her testimony last week. In reading, Esselman says that 48 percent of students are on track, if we treat five months as a third of the school year, to make the progress they should make over the entire year. That leaves 52% of students who are NOT on track, even after five months, to make three months of progress toward a full year's expected growth. More than half of EAA students did not make, in five months, the progress that most schools should make in three. Even if the EAA had accomplished this growth in only a third of the school year, more than half of its students still did not make the minimal expected growth within that span. Is this a laudable model that should be expanded before taking a closer look?

Now let's look at math.  In math, Esselman reports that 43% of EAA students are on track to make a year's worth of growth, again after five months, compared to what most students are expected to show in only three months.  That means 57% of EAA students are NOT on track to make one year's worth of growth, even though they had five months to demonstrate what most schools have only three months to demonstrate.  Even if EAA students had shown this growth in just three months, 57% still did not make the minimal expected growth. Again, is this a laudable model?
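The "flipping the charts" step in the two paragraphs above is just complementary percentages, using the on-track figures Esselman reported (48% reading, 43% math):

```python
# Share of students NOT on track for a year's growth, given Esselman's
# reported on-track percentages.
on_track = {"reading": 48, "math": 43}
not_on_track = {subject: 100 - pct for subject, pct in on_track.items()}

print(not_on_track)  # {'reading': 52, 'math': 57}
```

In both subjects the majority of students fall on the not-on-track side of the chart.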

So the claims of growth made by Esselman, when considered in this light, appear instead to represent a distinct lack of success in attaining student growth in the EAA.

But now I will use teacher testimony to show that even that unremarkable growth probably exaggerates how much students actually grew. We already know that beginning-of-year baselines are always lower because of the decay of student learning over the summer months. Students entering the EAA in September had been out of school since the previous June, which means their fall scores reflect less than what they knew in June. That dynamic affects all students who are off during the summer months, and therefore all students' scores in any district with a summer break.  That is simply a well-known fact in the testing world.

But in addition to that, conditions in EAA schools this fall drove the baseline even lower.  As we have now heard from many teachers at many EAA schools, a chaotic environment surrounded the administration of the baseline test. The baseline results from last fall therefore likely underrepresent students' real abilities. Listen to the following teacher's testimony, and ask yourself whether students were likely to perform at their best under such conditions. If not, then the second test would show more growth than actually occurred:

"In my building (Burns), the internet was out constantly during the tests. Students would be in the middle of a test and the whole thing would shut off, limiting their ability to take the test as well as restricting how many students we could test at once. Because of this, we had to test within a very extended window of time. Students sometimes did not have headphones with which to test, even though some portions required them. Other students had log-ins that did not work, or worked every other time they sat down to test. At times, students tested in the cafeteria because there was nowhere else to test them, so conditions weren't great." 
"That being said, Burns is said to have had the MOST growth out of any EAA school in the district re: Performance Series. Considering the middle school students haven't had a teacher for 3 out of the 4 core subjects for over 3 months now, I seriously doubt this growth is valid. Unless, of course, our daily substitutes were just pumping them full of information, which I know for a fact they weren't because they weren't provided any behavior systems to manage the classrooms. Most substitutes didn't last the full class day, let alone months at a time."
--- Teacher, Burns Elementary-Middle School.

I ask this committee to reconsider the growth that Mary Esselman claimed last week as a reason why the EAA legislation should be approved.  Was student growth as shown on the tests remarkable? No; on the contrary, it was quite unremarkable, to put it mildly. Were even those claims of growth exaggerated by the chaotic conditions under which the baseline test was administered? Based on several teachers' reports, including the one I shared with you, the answer is likely yes. The EAA may be a model of something, but it is not a positive model that should be replicated, codified, or expanded. I urge you to vote no on HB 4369.

Thank you.
Thomas C. Pedroni
Associate Professor, Curriculum Studies
