(Guest Post by Matthew Ladner)
NAEP is going to release the 2011 Reading and Mathematics results on November 1st. I thought it would be interesting to boldly make some predictions in advance. Here’s my first one: the 2011 results won’t be all that different from the 2009 results.
I know, I’m going waaaaay out on a limb here, but that’s my prediction and I am sticking to it.
While a number of states have engaged in far-reaching reforms, the vast majority of these efforts still lie in the implementation stage. Possible exceptions in my mind include Washington D.C., Louisiana and Florida.
For DC, the 2011 NAEP will constitute the first plausible check on the tenure of Michelle Rhee. DCPS began making substantial math and reading progress in the mid-2000s, with sizable gains but with scores still low. Assuming normal lags between policy changes and impacts, I believe that the 2009 NAEP arrived a bit early. I’ll be very interested to see what happens with the 2011 scores. Washington DC is also experiencing gentrification, so I will look at the free and reduced-price lunch numbers.
Louisiana will be a very interesting case, as some important statewide reforms still remain in the implementation phase, but where New Orleans has been in serious reform mode since 2005. I’ll take a look at the trend in urban numbers.
Florida, of course, has enjoyed a steady increase in NAEP scores since 1998. Florida lawmakers also instituted a fresh set of far-reaching reforms in 2011, but the verdict on those will come years down the road. Governor Crist failed to pursue far-reaching reforms of his own and vetoed some of those that reached his desk. Florida’s scores may rise again, but I won’t be surprised if they hit a plateau.
The Great Recession may also make this NAEP a little less incremental than usual. It will be interesting to see what happens to scores in the “Sand States” with the greatest property crashes (Arizona, California, Florida and Nevada), in addition to other states with acute economic distress like Michigan.
I will look with some interest at Arizona’s scores. Not only is the state face down on the economic canvas, with house building and flipping having been signature industries before the pop, but the infamous SB 1070 may also lead to the illusion of progress in Hispanic scores. To the extent that the already partially overturned SB 1070 convinced undocumented families to leave Arizona, it may create the appearance of academic improvement.
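The mechanism here is a simple composition effect, which can be sketched with invented numbers: if lower-scoring students leave the tested population, the group average rises even though no remaining student scored any higher. Every figure below is hypothetical, not actual NAEP data.

```python
# Hypothetical illustration of a composition effect: the group average
# rises when lower-scoring students leave the tested population, even
# though no individual student improved. All scores are invented.

def mean(scores):
    return sum(scores) / len(scores)

# Invented scale scores for a cohort; the last three entries represent
# students from families who leave the state before the 2011 test.
before = [210, 215, 220, 225, 230, 195, 198, 200]
after = before[:5]  # the same cohort, minus the three who left

print(round(mean(before), 1))  # average with everyone tested -> 211.6
print(round(mean(after), 1))   # average after out-migration  -> 220.0
```

The 8-point jump is pure selection: comparing the two averages says nothing about whether any student learned more.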
Outside of that, I’ll be looking for pleasant surprises. Tell me what you are interested in seeing from the 2011 NAEP in the comments.
I’ll be looking for further hard proof that lumping together reduced-price and free-lunch eligible children as a single entity is essentially a scam.
A scam? It is easy enough to separate them in the NAEP data, and 80% of FRL kids nationally are free-lunch eligible kids.
Sure, it’s simple for a battle-tested education policy wonk to tease out the data. Even a borderline moron civilian like me eventually figured out how to do it.
But the shorthand is usually free + reduced when comparisons are being made between states, districts, neighborhood schools, charters, etc. That becomes problematic in the many jurisdictions where it would be far more accurate to lump together documented non-eligible and reduced-price than reduced-price and free.
A broader example of what I’m talking about can be found in the Yahoo coverage of the Waconda, KS, district that Jay linked to recently. The author twice (almost approvingly!) cites a district poverty rate of 65% (we’ll put aside the fact that the district itself lists a rate of 50%), and wonders, gee, if this district experiencing crushingly high poverty can get such great results, then why can’t others?
The most recent seasonally adjusted unemployment rate in Mitchell County, KS, is 4.4%. The 2010 census lists a poverty rate of 9.5%, a homeownership rate of 80%, and a median household income of $41,000; something’s not adding up here. Median household incomes in the South Bronx, e.g., are under $20,000, with localized unemployment rates around 25%.
Of course, Waconda is an anomaly: the cost of living, particularly housing, is insanely low, and $40,000 travels further here than almost anywhere else in the country. The children in this district certainly aren’t well off, and I don’t want to minimize the great work being done by the schools. But it’s ludicrous to compare this kind of “poverty” with the poverty found in many of our cities or even other rural areas like Appalachia or south Texas. And it’s precisely the inaccurate “reduced + free” lumping that enables the comparison in the first place.
Yes, this is a bugaboo of mine, brought on by years’ worth of local charters touting their ability to better serve children living in poverty when they almost always serve fewer free-lunch eligible kids and more reduced-price than their district counterparts. But I would hope that we could all find common ground on having the data presented in a clear and simple way that lends itself to accurate interpretation.
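The lumping problem can be made concrete with a toy example (all enrollment percentages invented): two schools can report the identical combined FRL rate while serving very different shares of free-lunch-eligible children.

```python
# Hypothetical sketch: two schools with the same combined "free + reduced"
# rate can serve very different populations once the two eligibility
# categories are reported separately. All percentages are invented.

schools = {
    "District School": {"free": 55, "reduced": 10},
    "Charter School":  {"free": 30, "reduced": 35},
}

for name, pct in schools.items():
    frl = pct["free"] + pct["reduced"]
    print(f"{name}: FRL = {frl}%, free-only = {pct['free']}%")
```

Both schools report "65% FRL," yet the district school serves nearly twice the share of free-lunch-eligible (i.e., lowest-income) children; that distinction disappears entirely under the combined shorthand.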
Reporters should ask their sources to comment on test scores AHEAD of time. Why?
1. How it works now: easy for source to cover themselves.
Let’s say scores in X state go up. Reporter calls education commissioner. “Yes, we did Policy 1, 2, 3 and it is working.”
Scores are down. Commissioner: “Policies 1, 2, 3 haven’t had time to work.”
2. How it should work: sources asked in advance.
If reporters ask before the data is released, it’s harder to make that claim. Either the policies have had enough time to work or they haven’t. Pick a side.
If not, then the policymaker is arguing he shouldn’t be blamed for declines, but then he also can’t claim credit for gains released three days later.
If yes, the policymaker can claim credit for a test score increase, but should also take the blame for any decline.
The problem is that some people hear in advance, “unofficially,” how the scores came out, so they could game your system.
I’m interested to see what Oklahoma’s reading scores look like. After 11 years of universal preschool — and with Oklahoma leading the nation in the percentage of kids in preschool — isn’t it reasonable to expect some improvement in NAEP scores by now?
Here’s some information to help you analyze the DC scores. I’ll start with the “good news” – math – where, unlike reading, the scores went up. However, the 2011 math score increases are not impressive when compared to greater increases in the past. Fourth grade math scores increased three percentage points (219 to 222) from ’09 to ’11. This is down from a five point increase between ’07 and ’09 (214 to 219) and is equal to or lower than increases in previous years (three points between ’05 and ’07 and four points between ’03 and ’05 – way before reform).
Check it out in the upper right-hand column of the NAEP DC snapshot page: http://nces.ed.gov/nationsreportcard/pdf/stt2011/2012451DC4.pdf
Eighth grade scores increased by the same number of percentage points (6) between ’09 and ’11 as they did between ’07 and ’09. Where are the effects of reform here? http://nces.ed.gov/nationsreportcard/pdf/stt2011/2012451DC8.pdf
Eighth grade reading between ’09 and ’11 is completely flat at 242. http://nces.ed.gov/nationsreportcard/pdf/stt2011/2012454DC8.pdf There was a one point increase between ’07 and ’09 (from 241 to 242). Between ’05 and ’07, before reform came to DC, there was a three point increase (from 238 to 241). While reading scores have been creeping along at a dismal pace for years, reform has been no help at all.
The situation is a bit worse in the 4th grade.
http://nces.ed.gov/nationsreportcard/pdf/stt2011/2012454DC4.pdf
What officials are calling “flat” for the 4th grade reading scores is actually a one point decline, from 202 in ’09 to 201 in ’11. This is pitiful compared to the five point increase (197 to 202) between ’07 and ’09 and the six point increase (191 to 197) between ’05 and ’07, prior to reform.
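The per-cycle changes above are simple subtraction on the NAEP scale; here is the arithmetic for the 4th grade reading scores this comment cites, taking the 2007 score as 197 so that it is consistent with the six-point and five-point gains described.

```python
# Per-cycle changes in DC 4th-grade NAEP reading scale scores, as cited
# in the comment above (2007 taken as 197, consistent with the stated
# six-point and five-point gains). These are scale points on NAEP's
# 0-500 scale, not percentage points.
reading_g4 = {2005: 191, 2007: 197, 2009: 202, 2011: 201}

years = sorted(reading_g4)
for prev, cur in zip(years, years[1:]):
    change = reading_g4[cur] - reading_g4[prev]
    print(f"{prev}-{cur}: {change:+d} points")
```

This prints +6 for ’05–’07, +5 for ’07–’09, and -1 for ’09–’11, which is the commenter’s point: the only decline in the series falls in the most recent cycle.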
I look forward to your additional analysis.
Correction — I should have said “points” instead of percentage points. The NAEP uses a 0 to 500 point scale.
Please check out this site for excellent and thorough analysis of the DC scores.