Beware of Mis-NAEPery but also NAEPilism

April 3, 2018

(Guest Post by Matthew Ladner)

The 2017 NAEP will be released next week, and a few notes seem in order. Over time, the term “mis-NAEPery” has slowly morphed into a catchall phrase meaning “I don’t like your conclusions.” Mis-NAEPery, however, has an actual meaning, or at least it should, which ought to be something along the lines of “confidently attributing NAEP trends to a particular policy.”

Arne Duncan, for instance, took to the pages of the Washington Post recently to claim all positive NAEP trends since 1990 for his own tribe of reformers (center-left):

Lately, a lot of people in Washington are saying that education reform hasn’t worked very well. Don’t believe it.

Since 1971, fourth-grade reading and math scores are up 13 points and 25 points, respectively. Eighth-grade reading and math scores are up eight points and 19 points, respectively. Every 10 points equates to about a year of learning, and much of the gains have been driven by students of color.

Duncan then proceeds to dismiss the possibility that student demographics had anything to do with this improvement, noting that the American student body has grown poorer and more diverse: “It should be noted that the student population is relatively poorer and considerably more diverse than in 1971.” This contention, however, deserves dispute, given that the inflation-adjusted (constant 2011 dollars) income of the poorest fifth of Americans almost doubled between 1964 and 2011 once various transfers (food stamps, the EITC, etc.) are taken into account. Any number of other things, both policy and non-policy related, could also explain the positive trend, but never mind any of that: Mr. Duncan lays claim to all that is positive.

Duncan was not finished yet, however, as he was at pains to triangulate himself away from those nasty people who support more choice than just charter schools:

Some have taken the original idea of school choice — as laboratories of innovation that would help all schools improve — and used it to defund education, weaken unions and allow public dollars to fund private schools without accountability.

Well that sounds a bit like how a committed leftist would (unfairly) describe my pleasant patch of cactus. Arizona NAEP scores, could you please stand to acknowledge the cheers of the audience:

So the big problem in that chart is the blue columns. These charts stretch from the advent of the Obama years to the most recently available data (most recent until Tuesday, that is). We won’t be getting new science data this year, so ignore the last two blue columns on the right. What we are looking at is changes of 1 point in 4th grade math, -1 point in 8th grade math, 1 point in 4th grade reading and 2 points in 8th grade reading. Only one state made statistically significant academic gains on all six NAEP tests during the Obama era, and it just so happens to be one of the ones adopting the policies uncharitably characterized by Duncan’s effort at triangulation.
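To put those blue columns in perspective, here is a minimal sketch that applies Duncan’s own rule of thumb from the quote above, roughly 10 scale points per year of learning, to the national changes just listed. The conversion is his heuristic, not an official NAEP equivalence:

```python
# A minimal sketch applying Duncan's "10 points equates to about a year
# of learning" heuristic to the Obama-era national changes cited above.
changes = {
    "4th grade math": 1,
    "8th grade math": -1,
    "4th grade reading": 1,
    "8th grade reading": 2,
}

POINTS_PER_YEAR = 10  # Duncan's rule of thumb, not an official conversion

for test, pts in changes.items():
    print(f"{test}: {pts:+d} points = {pts / POINTS_PER_YEAR:+.1f} years of learning")

net = sum(changes.values())  # +3 points across four tests
print(f"Net: {net:+d} points = {net / POINTS_PER_YEAR:+.1f} years of learning")
```

By Duncan’s own yardstick, that nets out to roughly three tenths of a year of learning spread across four tests over an eight-year span.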

There were some very large initiatives during these years (Common Core standards, teacher evaluation, etc.) and we can’t be sure why the national numbers have been so flat, but let’s just say that a net gain of three scale points across four tests with 500-point scales fails to make much of an impression. Supporters of the Common Core project, for instance, performed a bit of a Jedi mind trick around the 2015 NAEP by noting that scores were also meh in states that chose not to adopt, and that 2015 was early yet. Fair enough on the early bit, but the promise behind an enormous investment of political capital in the project was not that adopting states would be equally meh, but rather that things would get better.

Where’s the BETTER?!?

Duncan’s mis-NAEPery, however, is of the garden variety; there has been far worse. Massachusetts, for instance, instituted a multi-faceted suite of policy reforms in 1993, and its NAEP scores increased from a bit better than nearby New Hampshire to two bits better than New Hampshire and tops in the country. So far as I can tell, there was approximately zero effort to establish micro-level evidence on any of the multiple reform efforts, or to disentangle, to the extent the policies were having a positive impact, which policies were doing what. That would be silly: everyone knows that standards and testing propelled MA to the top NAEP scores, and once everyone else does it we will surge towards education Nirvana and Canadian PISA scores. Well, I refer the honourable gentleman to the tiny blue columns in the chart I referenced some moments ago.

This is not to say that I am confident that testing and standards had nothing to do with MA’s high NAEP scores. I’m inclined to think they probably did, but some actual evidence would be nice before imposing this strategy on everyone. In Campbell and Stanley terms, “Great Caesar’s Ghost! Look at those Massachusetts NAEP scores!” lacks evidence of both internal and external validity. In other words, we don’t know what caused MA’s NAEP scores, nor do we know who else, if anyone, might be able to pull it off, assuming policy had something to do with it.

So beware of mis-NAEPery, my son: the jaws that bite, the claws that catch! But also beware of NAEP nihilism. Taking off my social science cap, I will note that NAEP is an enormous and highly respected project, and it is done expressly for the purpose of making comparisons. Yes, we should exercise a high level of caution in doing so, and should check any preliminary conclusions against other sources of available evidence. The world is a complicated place with an almost infinite number of factors pushing achievement up or down at any point. There is a great deal of noise, and finding the signal is difficult. NAEP alone cannot establish a signal.

The fact that the premature conclusions drawn from the Massachusetts experience lacked evidence of internal and external validity did not mean that those conclusions were wrong, but it did make them dangerous. Alas, the world does not operate as a random-assignment study. Policymakers must make decisions based upon the evidence at hand, NAEP and (hopefully) better than NAEP. The figure at the top of this post makes use of NAEP, and there is a whole lot of top-map green (early goodness) turning into bottom-map purple (later badness) going on. This is a bad look, assuming part of what you want out of your support of K-12 education is kids learning math and reading in elementary and middle school. Let’s be careful, but let’s also see what happens next.

 


District/Charter Combos that Outperform RSD

February 19, 2018

(Guest Post by Matthew Ladner)

Just out of curiosity, I decided to look into the Reardon data to see how many Arizona district/charter combos outperformed or tied the Recovery School District in academic growth. The above list is by no means exhaustive; it is more the product of throwing some district names into the database before my morning caffeine. The list is not short, and it includes several very poor and isolated school districts.
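For anyone who wants to make the exercise systematic rather than pre-caffeine, a sketch of the lookup follows. The file name and column names (district_growth.csv, state, district, growth_percentile) are placeholder assumptions for whatever extract of the Reardon data you have on hand, not the archive’s actual schema:

```python
import pandas as pd

# Hypothetical extract of the Reardon growth data; the file and column
# names here are assumptions, not the actual schema of the archive.
df = pd.read_csv("district_growth.csv")

# RSD's growth percentile (92nd, per the discussion below).
rsd = df.loc[df["district"] == "Recovery School District",
             "growth_percentile"].iloc[0]

# Every Arizona district/charter combo that tied or beat RSD on growth.
matches = df[(df["state"] == "AZ") & (df["growth_percentile"] >= rsd)]
print(matches.sort_values("growth_percentile", ascending=False)
             [["district", "growth_percentile"]])
```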

Mind you, no one bombed any of these districts with millions in philanthropy. You can’t go to a bar in Snowflake, Arizona and meet a group of six Teach for America teachers the way you can in the French Quarter. They also managed this somehow without a “harbor-master.” Finally, the Reardon data ends in 2015, and since then the trend in statewide data in Arizona has been positive, but in the RSD not so much. Hopefully Reardon will update his study so we can track this over time.

I want to be careful to note that I regard the Recovery School District as having been a very clever innovation for a district that had almost nothing to leverage but empty school buildings after a hurricane. If that hurricane had leveled Houston or Dade County, however, I’m afraid that the limited supply of philanthropic dollars and TFA teachers would have been unequal to the (much larger) task. In order to reach scale, we are going to need solutions that do without substantial infusions of outside money, as that is likely to be in increasingly short supply.

Having said that, RSD landing in the 92nd percentile for growth in the Reardon data was truly a magnificent accomplishment. The leap, however, from “well done” to “everyone needs to do this now!!!!” looks very dangerous, imo.

 


Tucson versus Columbus: Round Two

February 16, 2018

(Guest Post by Matthew Ladner)

Yesterday I presented statewide NAEP information contrasting urban schooling achievement trends in Arizona and Ohio, and specifically in Tucson and Columbus. Columbus is surrounded by suburban districts choosing not to participate in open enrollment (typical, I fear), while Tucson is surrounded by suburban districts that do participate in open enrollment, and actively so.

Today I remembered the cool data tool that the NYT developed using Sean F. Reardon’s data.

Let me start by saying that if I had to pick a district to showcase Arizona, it would not be Tucson. While I am fully aware of some outstanding schools in TUSD, the district’s reputation (fairly or not; I am no authority on the subject) usually involves enrollment decline, empty school buildings, union sway in school board elections and controversy over some sort of voluntary “La Raza” curriculum in the high schools. A decade ago you could peer into the state’s AIMS data and watch student cohorts fall further behind as they “progressed” through the system.

Arizona, however, has been leading the nation in academic gains, and Tucson continues to face steady and considerable competition for students not only from charter schools and private choice programs, but also from nearby suburban districts. It is my contention that this broad competition enables the bottom-up accountability that results in Arizona’s average charter school closing after only four years despite receiving 15-year charters from the state. Reardon’s data includes both district and charter school trends, but how did Tucson fare between 2010 (3rd grade scores) and 2015 (8th grade scores) in terms of academic growth?

Tucson Unified (and charters operating within district boundaries) scored at the 64th percentile for growth during this period. Columbus, Ohio, meanwhile, also had an active charter school law, but no suburban districts willing to allow transfers, per the Fordham map:

How did Columbus fare in the Reardon data?

Columbus scored in the 22nd percentile for academic growth during this period. The news is also grim in Cleveland, Toledo and Dayton, although Cincinnati stands out as the Ohio urban progress champion during this period. Overall, however, things look much as they do in NAEP for the two states.

Now if you want to see something really cool:

The east-west position of these columns indicates the relative wealth of each district, and Phoenix Elementary and its charters sit at the tip of the gains spear.


New Stanford Study of Academic Gains in School Districts

December 6, 2017

(Guest Post by Matthew Ladner)

Fascinating new study from Stanford University’s Center for Education Policy Analysis using standardized test scores from 45 million students to track academic growth in over 11,000 school districts. The study tracks progress from grades 3 through 8. The money graph comes on page 33 and is included above. Just to save your eyes from squinting, let me provide a play-by-play: the top map shows average 3rd grade scores by district. Purple is low, green is high.

The second map shows academic progress between grades 3 and 8 from 2009 to 2015. Again, purple is low, green is high.
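For readers who want the mechanics, the growth map is essentially a cohort comparison: how far the students who were 3rd graders in 2009 had traveled by the time they were 8th graders in 2015. A minimal sketch under that assumption (the function and the example numbers are illustrative, not the study’s actual code):

```python
def growth_per_grade(grade3_2009: float, grade8_2015: float) -> float:
    """Grade equivalents gained per grade for the cohort that was in 3rd
    grade in 2009 and 8th grade in 2015 (an illustrative simplification
    of the study's cohort-growth measure)."""
    return (grade8_2015 - grade3_2009) / 5  # five grades elapsed

# A district can start purple on the top map (a year behind in 3rd grade)
# and still show green on the bottom map if its cohorts grow faster than
# the benchmark of 1.0 grade equivalent per grade.
print(growth_per_grade(2.0, 7.5))  # 1.1
```

That distinction, level versus growth, is what makes the two maps worth reading together.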

Ok, so just to (once again) brag on the Cactus Patch: you’ll notice that on the top map anything and everything bordering Canada looks green, while anything and everything bordering Mexico from the Rio Grande Valley to So-Cal looks purple. Many decades after Senator Moynihan noted that average performance on state tests is highly correlated with proximity to Canada, it remains the case today.

Cast your gaze down to the second map and you’ll see some signs of hope, most prominently (in my book) Arizona flipping from almost entirely purple to mostly green in growth. BOOOM!

Now, before I get comment-section bricks thrown my way by the Dr. Eponymous, let me hasten to add that these results, while relying upon state test scores, are entirely consistent with NAEP in Arizona’s case. Given that there is no ability or incentive to teach to the NAEP, I feel reasonably confident that either the academic knowledge or the testing “give a darn” of Arizona students (or some combination thereof) is on the rise. I interpret either of these things as a very welcome development, and I’m not overly concerned about the mixture.

It is also worth noting that since these results focus only on school districts, in the state with the highest percentage of students attending charter schools, and Arizona charter schools exceed districts in academic growth on NAEP (see below), the above charts underestimate Arizona’s total progress between 2009 and 2015. Arizona does the purple-to-green flip with only the yellow columns in the NAEP graph below (same period as the Stanford study). * see correction below

Tennessee also does the purple-to-green flip, so bully for them. Notice that most of the Northeast starts out very green and ends up pretty purple, but, er, they aren’t alone in this. There are cool graphic features in this NYT write-up, where you can plug in a school district and watch it move between 3rd and 8th grade here.

It’s too much…it’s too much winning! No, Arizona, we’ve got to win MOARRRRRRRRR!!!!!!!

 

CORRECTION: The author included charter school scores in the districts in which they operate.