(Guest Post by Matthew Ladner)
(Guest Post by Matthew Ladner)
So just for fun I decided to calculate the cohort math gains for states and state charter sectors in NAEP. Note that on the charter side there are considerable sampling issues in deriving an estimate for a relatively small group of students, making it a really great idea to check a secondary source of data rather than accepting an 8th grade NAEP score for charter students as written on a stone tablet by a higher power. Various other caveats also apply: high gain scores are not the same as high scores, for instance. The number one gainer below (MN charters) does not have an especially high average 8th grade math score, and in fact lags about ten points below the statewide average for Minnesota. The opposite is true of the number two gainer (AZ charters), which has significantly higher overall scores than the statewide average and high scores overall. Charter sectors dominated by lots of new schools still getting their sea legs, full of students taking an academic hit while adjusting to a new school, can create an optical illusion in a snapshot such as those provided by NAEP. In other words, many of these sectors may be on their way to improving as ineffective or undesirable charters close and new ones open and survive.
While NAEP has sampled the same cohort of students as both 4th and 8th graders, it is not, of course, testing the same students. Students move around, both between states and between district and charter schools. I don’t expect that many states are losing their high-performing math students at high rates and having them replaced by low-performing math students. In other words, at the state level students moving around probably does not amount to much, because things average out in the aggregate. I’m less confident of this being the case at the charter sector level. There are other caveats that could be dwelt upon, but that would start to violate the Prime Directive.
So okay, you’ve been warned: each of these gain scores needs to be viewed in a broader context, far more context than I am going to be able to provide here. Having said all of that:
Charter sectors cover both the top and the bottom of the chart- 7 out of the top 10, and the top four overall gains.
Down at the bottom of the chart, alas, we see the bottom five spots covered by charter sectors. So, Pennsylvania charters, we need to talk. NAEP listed your 2011 average 4th grade math score as 241, and your 2015 average 8th grade score as 249, a cohort gain of just 8 points. I **ahem** double checked the numbers just to be sure I hadn’t made some mistake. The district numbers for Philadelphia in the TUDA: 225 for 2011 4th graders, 267 for 2015 8th graders, a gain of 42 points. Both of those scores are catastrophically terrible, but the second one is at least meaningfully higher than the first. Something goofy with these NAEP numbers? PA charters dominated by dropout recovery programs? Who let the dogs out?
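For readers who want to check the arithmetic, the cohort gain is just the 2015 8th grade average minus the 2011 4th grade average for the same cohort. A minimal sketch, using only the NAEP averages quoted in this post:

```python
# Cohort gain = same cohort's 8th grade (2015) average minus its 4th grade (2011) average.
# Scores below are the NAEP averages quoted in this post.
scores = {
    # sector: (2011 4th grade math avg, 2015 8th grade math avg)
    "PA charters": (241, 249),
    "Philadelphia (TUDA)": (225, 267),
}

for sector, (grade4_2011, grade8_2015) in scores.items():
    gain = grade8_2015 - grade4_2011
    print(f"{sector}: cohort gain = {gain} points")
```

Running this shows the 8-point charter gain against Philadelphia's 42-point gain, which is the gap driving the "we need to talk" above.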
As stated above, no hard and fast conclusions should be drawn from this little insomnia-driven exercise, but PA charters might want to turn up the water pressure:
Keep in mind that MN has kept its own math standards. Its legislature listened to its U of MN mathematicians. MN uses its own math tests, too.
Good for them. AZ is in the process of revising its standards but topped the nation in statewide math gains for both the 2009 to 2013 period and again for the 2011 to 2015 period, after having adopted the STANDARDS WHICH MUST NOT BE NAMED in 2010. AZ also did well before 2010. Loveless’s analysis indicates standards = snore.
What is known about the math tests used in AZ (not NAEP’s)? An open question.
Thanks. I looked through and couldn’t find information on who set cut scores and who vetted the math test items. Format for many test items is like PARCC’s. Maybe AZMerit is a PARCC test? New state tests in AZ as of 2015.
It is the AIR test for PARCC standards. Given the lack of an apparent relationship between the standards and NAEP trends (2015 math scores dropped both in most states that did not adopt, like Texas, and in most states that did), one is left with the impression that either the standards don’t make much of a difference OR for some reason they made an unusually large impact in Arizona. The latter explanation gets strained further in that the standards would have to have had a super-duper positive impact in AZ charter schools but less of one in the districts.
I am going with explanation one: they don’t make much of a difference, which is the position of both Hanushek and Loveless.