Usual Suspect Mark Pocan spins a Keyser Soze story on GAO Parental Choice Report

September 16, 2016

(Guest Post by Matthew Ladner)

The American film classic The Usual Suspects (spoiler alert!) features a quick-thinking unreliable narrator, Verbal (played by Kevin Spacey), who concocts a vivid tale from the material on a bulletin board sitting behind the police officer interrogating him. Representative Mark Pocan has used equivalent powers of imagination and a recent GAO report on private choice programs as his bulletin board to spin his own imaginative and deceptive tale.

First the report:

[screenshot: cover of the GAO report]

The report is a straightforward description of the nation’s voucher and ESA programs, and deals primarily with a state of confusion among school districts as to whether they are obligated to provide “equitable services” to special needs students who participate in private choice programs. It’s a fairly dry 49-page read, although it does have the occasional interesting graphic like this one:

[screenshot: graphic from the GAO report]

In any case, after a number of pages of descriptive work, the report concludes:

[screenshot: the GAO report’s concluding recommendation]

You are welcome: I waded through this report so you wouldn’t have to!

So from this bulletin board material, Rep. Mark Pocan concocts his tale of woe and destruction visited upon the states like Biblical plagues by private choice programs, in a HuffPo piece ominously titled Why You Should be Worried About the Rapid Rise of Private Voucher Schools:

[screenshot: excerpt from Rep. Pocan’s HuffPo piece]

In other words, private choice programs are the most vicious gangster in the history of Pocan’s imagination.

These claims have even less to do with the GAO report than Officer Kujan’s bulletin board had to do with the tale of Keyser Soze. The only “discovery” in the GAO report is that districts are confused about whether they are obligated to provide special education services to students participating in private choice programs in the same fashion they do for other private school students, which is to say, not much to begin with. Thus the report recommends that the USDoE issue guidance to districts to dispel the confusion, because the districts retain discretion over whom to serve.

The real discovery here is that Rep. Pocan is willing to spin long-known facts about private choice programs into a breathless but ineffectual attempt at a hit piece. In order:

  1. Teacher prep has always been different between public and private schools, and there is approximately zero evidence that traditional certification produces better learning. But hey, if you want state-certified teachers, the public school system is still there as an option.
  2. Some private choice programs require schools to change their admission policies, but many do not. Let me know when you get the GI Bill to require random admission lotteries into the Ivy League and I’ll start to take you seriously on this. No? How about random lottery admissions for open enrollment transfers between district schools, which currently get to pick and choose at will? The total number of seats available may be greater under lighter-touch programs, and heavy-handed meddling with private schools can backfire, and has, by leaving a shortage of seats in high-quality private schools.
  3. Money following the child is the lamest claim in the opponent playbook.
  4. Perceived deficiencies in the services taxpayer-subsidized public schools provide to students with disabilities are why parents choose to participate in the first place. Satisfaction surveys of special needs choice programs have been off the charts. Private choice programs expand the options for special needs parents.

Sadly, rather than engage in an intellectually honest debate, Rep. Pocan has constructed a boogeyman story and attempted to claim that the GAO told it to him before he started repeating it. They did nothing of the sort, and silly efforts like this are example number 89,623 of why choice opponents willingly surrender their credibility on a regular basis.


The Professional Judgment Un-Dead

March 25, 2009

It’s time we drive a stake through the heart of “professional judgment” methodologies in education.  Unfortunately, the method has come back from the grave in the most recent Fordham report on regulating vouchers, in which an expert panel was asked about the best regulatory framework for voucher programs.

The methodology was previously known for its use in school funding adequacy lawsuits.  In those cases a group of educators and experts was gathered to determine the amount of spending that is required to produce an adequate education.  Not surprisingly, their professional judgment was always that we need to spend billions and billions (use Carl Sagan voice) more than we spend now.  In the most famous use of the professional judgment method, an expert panel convinced the state courts to order the addition of $15 billion to the New York City school system — that’s an extra $15,000 per student.

And advocates for school construction have relied on professional judgment methodologies to argue that we need $127 billion in additional spending to get school facilities in adequate shape.  And who could forget the JPGB professional judgment study that determined that this blog needs a spaceship, pony, martinis, cigars, and junkets to Vegas to do an adequate job?

Of course, the main problem with the professional judgment method is that it more closely resembles a political process than a scientific one.  Asking involved parties to recommend solutions may inspire haggling, coalition-building, and grandstanding, but it doesn’t produce truth.  If we really wanted to know the best regulatory framework, shouldn’t we empirically examine the relationship between regulation and the outcomes we desire?

Rather than engage in the hard work of collecting or examining empirical evidence, beltway organizations seem to prefer gathering panels of experts and asking them what they think.  Even worse, the answers depend heavily on which experts are asked and what the questions are.

For example, does high-stakes testing pressure schools to sacrifice learning in certain academic subjects to improve results in the subjects that have stakes attached?  The Center on Education Policy employed a variant of the professional judgment method by surveying school district officials to ask them if this was happening.  They found that 62% of districts reported an increase in time spent on high-stakes subjects and 44% reported a decrease in time spent on other subjects, so CEP concluded that high-stakes testing was narrowing the curriculum.  But the GAO surveyed teachers and found that 90% reported no change in time spent on the low-stakes subject of art.  About 4% reported an increase in focus on art and 7% reported a decrease.  So the GAO, also employing the professional judgment method, gets a very different answer than CEP does.  Obviously, which experts you ask and what you ask them make an enormous difference.

Besides, if we really wanted to know whether high stakes narrow the curriculum, shouldn’t we try to measure the outcome directly rather than ask people what they think?  Marcus Winters and I did this by studying whether high stakes in Florida negatively affected achievement in the low-stakes subject of science.  We found no negative effect on science achievement from raising the stakes on math and reading.  Schools that were under pressure to improve math and reading results also improved their science results.

Even if you aren’t convinced by our study, it is clear that this is a better way to get at policy questions than using the professional judgment method.  Stop organizing committees of selected “experts” and start analyzing actual outcomes.