Carr Makes It 19-0

August 17, 2011

This finding’s been replicated more often than Picard’s Earl Grey.

(Guest post by Greg Forster)

Still clearing the backlog: I haven’t had a chance yet to tout this new empirical study of Ohio’s EdChoice voucher program, by my old colleague Matt Carr, finding that – guess what, you’ll never believe this – vouchers improve outcomes at public schools!

Building on a large body of previous studies, this makes it nineteen (19) high-quality empirical studies finding school choice improves public schools and zero (0) studies finding it harms public schools.

Interestingly, Carr finds the positive impact is concentrated among the highest and lowest performing students. Since EdChoice is a failing schools voucher, you might expect schools to respond by improving service to those “bubble” students who are near the state proficiency cutoff. However, Carr finds the opposite.

Matt hypothesizes – plausibly enough – that schools are responding by improving services to the students who are most likely to use the voucher to leave. Low-performing students have the most obvious motivation to seek better services, while high-performing students are the most likely to have actively involved parents.

I do have one quibble with the study. Matt writes that his study “provides an analysis of a voucher program that has not yet been rigorously studied for its competitive effects on traditional public schools.”

Oh, really?


Cincinnati Enquirer on EdChoice: Good Story, Bad Headline

January 28, 2009

(Guest post by Greg Forster)

On Saturday the Cincinnati Enquirer ran a story on how Ohio is sitting on a bunch of student outcome data for the EdChoice voucher program and neither doing anything with them nor releasing them to researchers who could do something with them. I’m told it was picked up by AP.

The story is generally good. Transparency is always preferable. Student privacy concerns do limit the extent to which the state can release data to the general public, but the state ought to be able to release a lot more than it has, and it also ought to license private researchers to use more sensitive data on a restricted basis, just as NCES does.

The story’s author, naturally enough, wanted to provide what little data are available. So she provided the number of EdChoice students who failed the state test in each subject.

Readers of JPGB probably already know this, but any outcome measure that just takes a snapshot of a student’s achievement level at a given moment, rather than tracking the change in that student’s achievement over time, is not a good way to measure the effectiveness of an education policy. A student’s achievement level at any given moment is heavily affected by demographics, family background, and so on. Growth over time removes much of the influence of these extraneous factors (though obviously it doesn’t remove all of it, and additional controls or statistical techniques that strip out more of these influences are preferable).
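For readers who like to see the point worked out, here’s a toy sketch (all the numbers are hypothetical, invented purely for illustration): a snapshot comparison ranks the high-baseline school ahead, while tracking the same students over time shows the low-baseline school is the one actually moving its students forward.

```python
# Hypothetical test scores for the same students at two points in time.
school_a_before = [400, 410, 420]   # low starting baseline
school_a_after  = [450, 465, 475]   # large gains
school_b_before = [520, 530, 540]   # high starting baseline
school_b_after  = [525, 532, 543]   # small gains

def mean(xs):
    return sum(xs) / len(xs)

# Snapshot: just compare current achievement levels.
snapshot_a = mean(school_a_after)
snapshot_b = mean(school_b_after)

# Growth: compare each student's change over time.
growth_a = mean([a2 - a1 for a1, a2 in zip(school_a_before, school_a_after)])
growth_b = mean([b2 - b1 for b1, b2 in zip(school_b_before, school_b_after)])

# School B "wins" the snapshot, but School A is producing far more growth.
print(snapshot_a < snapshot_b)   # True
print(growth_a > growth_b)       # True
```

The snapshot rewards whoever enrolled higher-scoring students to begin with; the growth measure rewards whoever actually taught them something.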

Moreover, the EdChoice program is specifically targeted to students in the very worst of the worst public schools. These are students who are starting from a very low baseline. We should expect these students’ results to remain well below those of the general student population even if vouchers are having a fantastically positive effect. So the need to track students over time rather than simply take a snapshot of their achievement levels is especially acute here. Only a rigorous scientific study can examine whether the EdChoice voucher program is improving these students’ performance – and to do that we’d need the data that the state is sitting on.

Also, a binary measurement of outcomes (pass/fail) is never as good as a scale. The state is sitting on scale measurements of the students’ performance, but from the Enquirer story it appears that it won’t release them.
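The same kind of toy sketch (hypothetical cutoff and scores, invented for illustration) shows what a pass/fail measure throws away: if every student improves substantially but nobody yet clears the cutoff, the pass rate reports no change at all.

```python
# Hypothetical proficiency cutoff and scale scores for the same four students.
cutoff = 400
scores_before = [310, 340, 360, 380]
scores_after  = [355, 375, 390, 398]   # every student improved; none passed

# Binary view: fraction of students at or above the cutoff.
pass_rate_before = sum(s >= cutoff for s in scores_before) / len(scores_before)
pass_rate_after  = sum(s >= cutoff for s in scores_after) / len(scores_after)

# Scale view: average gain per student.
avg_gain = sum(a - b for b, a in zip(scores_before, scores_after)) / len(scores_before)

print(pass_rate_before, pass_rate_after)   # 0.0 0.0 -- "no improvement"
print(avg_gain)                            # 32.0 -- substantial improvement
```

The pass/fail numbers say the program did nothing; the scale scores the state is sitting on would tell the real story.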

And the Enquirer was only able to obtain these pass/fail results for 2,911 students out of about 10,000 served by the program.

All that said, I don’t blame the Enquirer for reporting what few data were available. The story is focused on the state’s stinginess with data, not the performance of the program as such.

But what headline did the paper put on the story?

“Ed Choice Students Failing.”

Of course the story’s author doesn’t choose the headline. And the person who did choose the headline almost certainly had to do so under intense deadline pressure, without much space to work with, and with no knowledge about the issues other than what could be gleaned from a very quick and superficial reading of the story. Still, since the story clearly focuses on the issue of the state’s sitting on valuable data without using them, you would think they could come up with something like “Voucher Data Not Used.”


Yet Another Study Finds Vouchers Improve Public Schools

August 21, 2008

(Guest post by Greg Forster)

The Friedman Foundation has just released my new study showing that Ohio’s EdChoice voucher program had a positive impact on academic outcomes in public schools. I’m told that it has generated a number of news hits, though the only reporter to interview me so far was the author of this piece in the Columbus Dispatch. When she interviewed me I thought she was hostile, because her questions put me a little off balance, but the article is perfectly fair. I guess if the reporter is doing her job right, the interviewees ought to feel like they were being challenged. The final product is what counts.

The positive results that I found from the EdChoice program were substantial but not revolutionary. That’s not surprising, given that 1) failing-schools vouchers aren’t the optimum way to structure voucher programs in the first place, and 2) the data were from the program’s first year, when it was smaller and more restricted than it is now.

It’s too early to be sure, but among the large body of empirical studies consistently showing that vouchers improve public schools, a pattern seems to be emerging: voucher programs have a bigger impact on public schools when they’re larger, more universal, and put up fewer obstacles to parental participation. That’s worth watching and studying further as opportunities arise.