(Guest Post by Matthew Ladner)
When the National Center for Education Statistics first released the 2013 NAEP data, the website refused to cooperate with requests to give charter/district comparisons for the District of Columbia. This is of especially strong interest given that 43% of DC public school children attend charter schools.
Well, lo and behold, the NAEP website has decided to start cooperating, and the data tell a pretty amazing story: district schools in DC are improving over time, but charters show even stronger growth.
NAEP takes new random samples of students in each testing year, but judges performance consistently across time. Making comparisons between district and charter students isn’t easy. The percentages of students in special programs for children with disabilities and English Language Learners can potentially impact average scores. So, for instance, if DC charter schools enroll fewer children with disabilities, fewer ELL students, or fewer low-income children, they could appear to be doing a better job educating students when the truth could be quite different.
Fortunately NAEP allows us to take these factors into account. The charts below show NAEP data that gets as close to an “apples to apples” comparison as possible, comparing only the scores of Free and Reduced-price Lunch-eligible students in the general education program. Two other sources of bias that could be expected to work against charter schools involve new schools and newly transferred students. Organizations tend not to be at their best during their “shakedown cruise,” and schools are no exception. Students also tend to take a temporary academic hit as they adjust to a new school after transferring. Charter schools tend to have lots of new schools full of kids who just transferred in, providing a double whammy when looking at any snapshot of performance.
Unfortunately, NAEP does not contain any tools for taking the age of the school or the length of enrollment into account. Thus DC charter schools are fighting at a bit of a disadvantage, and a very substantial funding disadvantage, in the charts below.
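The subgroup restriction described above can be pictured as a simple filter over student-level records. NAEP does not actually expose microdata this way — this is purely an illustrative sketch, and every column name and score below is invented:

```python
# Illustrative sketch (NOT actual NAEP data or its API): restricting a
# hypothetical student-level table to the comparison group used in the post --
# free/reduced-price-lunch-eligible students in the general education program.
# Column names (frl_eligible, iep, ell, sector, score) are assumptions.
import pandas as pd

students = pd.DataFrame({
    "sector":       ["district", "district", "charter", "charter", "charter"],
    "frl_eligible": [True, True, True, False, True],
    "iep":          [False, True, False, False, False],   # special education
    "ell":          [False, False, False, False, True],   # English learners
    "score":        [215, 198, 224, 240, 205],
})

# Keep only FRL-eligible, general-education (non-IEP, non-ELL) students.
comparable = students[
    students["frl_eligible"] & ~students["iep"] & ~students["ell"]
]

# Average scale score by sector for the matched subgroup.
by_sector = comparable.groupby("sector")["score"].mean()
print(by_sector)
```

The point of the filter is that any remaining charter/district score difference cannot be explained by differing shares of special-program or low-income students — though, as noted above, school age and enrollment length remain uncontrolled.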
DC charters may be fighting with one hand tied behind their back, but it did not stop them from scoring a knockout on NAEP. DC charters widened their advantage in the percentage of children scoring “Basic or Better” from 4 points in 2011 to 9 points in 2013.
DC district students saw a large improvement in 8th grade reading between the 2011 and 2013 NAEP, but still found themselves trailing DC charter students by 5 points. In 4th grade math, district students posted a very large gain, but charter students achieved an even larger improvement.
In 8th grade math, district students demonstrated impressive gains, but DC charter students were still 19 percent more likely to score “Basic or Better.”
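The “advantage” figures above are just differences in the percent of students scoring Basic or Better. A minimal sketch of that arithmetic — the sector-level percentages here are hypothetical placeholders chosen only so the gaps match the 4- and 9-point figures reported in the post:

```python
# Hedged sketch of how the charter "advantage" is computed:
# the gap is charter minus district percent scoring Basic or Better.
# The post reports this gap widening from 4 points (2011) to 9 points (2013);
# the individual sector percentages below are invented for illustration.

def gap(charter_pct, district_pct):
    """Charter advantage in percentage points."""
    return charter_pct - district_pct

gap_2011 = gap(60, 56)   # hypothetical levels yielding the reported 4-point gap
gap_2013 = gap(68, 59)   # hypothetical levels yielding the reported 9-point gap

print(gap_2013 - gap_2011)  # widening of the charter advantage, in points
```

Note that both sectors can improve while the gap still widens — which is exactly the story the charts tell.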
Hopefully the race to excellence will continue and even accelerate. Meep! Meep!
This is unalloyed good news. Nonetheless, comparisons of this sort are not genuine apples-to-apples. You get closer to that when you compare charter lottery winners with charter lottery losers, because that removes self-selection from the two student population pools, and that usually reduces or eliminates apparent charter superiority. (I have no information about whether that is true in this case in D.C., where all students appear to be doing better.)
I agree that random assignment would be better, but I lack access to such data if it exists. Even random-assignment data might not tell the full story if not all charters hold lotteries. This is simply as close to apples-to-apples as I can get with NAEP data.
This almost certainly is a district-by-district issue: some, not all, apparently record everyone who signs up for such lotteries as take place, record who wins the lotteries, and have the data capacity to maintain subsequent records of those students’ achievements, usually on state tests. If you had the lists of the students in these natural random assignment experiments, you could probably connect them to NAEP results, as well; but it would take a lot of work, and might not result in much to justify the effort — except, perhaps, to debunk a common theory.
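The lottery comparison described above is a natural random-assignment experiment: because winners and losers are drawn at random from the same applicant pool, a simple difference in mean later outcomes estimates the effect of a charter offer (an intent-to-treat estimate). A sketch with entirely invented applicant records:

```python
# Sketch of a lottery winner/loser (intent-to-treat) comparison.
# Randomization means winners and losers are statistically similar at
# baseline, so the difference in mean later scores estimates the effect
# of receiving a charter offer. All numbers below are invented.
import statistics

# (won_lottery, later_test_score) -- hypothetical applicant records
applicants = [
    (True, 228), (True, 215), (True, 233), (True, 220),
    (False, 218), (False, 209), (False, 226), (False, 211),
]

winners = [score for won, score in applicants if won]
losers = [score for won, score in applicants if not won]

itt_effect = statistics.mean(winners) - statistics.mean(losers)
print(itt_effect)  # charter-offer effect in score points, winners minus losers
```

In real studies this estimate is complicated by exactly the issues raised in this thread: small numbers of denied applicants, and wait-list admits who appear in both groups.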
Hi Bruce, technical small-ball: another issue that has come up in charter lottery-loser studies is that some charters don’t have enough “denied” charter students to track. Others get in off the wait list, so they are listed both as “not admitted” and “admitted” and have to be discounted.
Matt: Great work. Let me ask this: if DC charters seem only okay as measured by CREDO, and the DC district is perceived as one of the fastest improving in the USA, does that make the DC charter accomplishment even more impressive?
Thanks, Mike. I’d have to take another look at how CREDO measures things to comment.
Thanks, Mike; good point. I suppose this would likely skew these charter lottery-loser study conclusions in the direction of making the average chartered school look somewhat more effective than it truly is, assuming that the more effective chartered schools would be the ones likeliest to have lotteries and therefore lottery losers.
Bruce, actually the Boston study (Kane/Angrist) deals skillfully with the issue you raise, if you feel like taking a look. Cheers.
Do you have a link? Thanks for the tip (it’s nice to be involved in a non-acrimonious discussion of education, for a change).
Hi Bruce — Here is a link to the study that I think Mike G. had in mind: http://www.gse.harvard.edu/~pfpie/pdf/InformingTheDebate_Final.pdf
It was updated here: http://www.tbf.org/~/media/TBFOrg/Files/Reports/Charter%20School%20Demand%20and%20EffectivenessOctober2013.pdf
There are actually now quite a few similarly rigorous studies of charter school effects on participating students. Collin Hitt has written some summaries on this blog with links to the original studies. They keep coming out, so he keeps adding posts. Here are links to some of his posts and you can then follow the links to the original studies:
Thank you, Jay!
Free &amp; reduced mixes two cohorts: free-lunch students are (obviously) poorer and tend to do worse. In NYC, anyway, charters have fewer free-lunch students than surrounding district schools, though they match them in free &amp; reduced.
You raise an interesting point, so I ran the Free vs. Free and Reduced numbers for both the country and DC in TUDA on 4th grade reading. Nationally, Free and Reduced Lunch outscores Free Lunch by a single point. In DC, the Free and Reduced Lunch score is the same as the Free Lunch score, and the Reduced-only category did not meet reporting standards for NAEP. I’m not sure whether that is because there aren’t many students who qualify (nationally 80 percent of FRL students qualify for a free lunch, but it might be higher in DC) or whether it has to do with the way DCPS runs its FRL program.