Replication, The True Test of Research Quality

When people can’t argue the facts, they argue peer review.  That’s been my experience when I’ve released non-peer-reviewed reports.  Without peer review, folks wonder, how can we know whether to trust these results?

The reality is that even with peer review, people still need to wonder whether to trust results.  Peer review is by definition irresponsible — by which I mean that the reviewers have no responsibility.  By being anonymous, reviewers offer their opinions on the merit of research without any meaningful consequence to themselves.  Many reviewers do a laudable job, but there is nothing to stop them from using their reviews to advance findings they prefer and block findings they dislike, regardless of the true merit of the work.  Peer review is often little more than the anonymous committee vote of a panel composed of some mix of competitors and allies.  It is about as reliable as the Miss Congeniality vote at a beauty contest.  Do we really think she’s the nicest contestant, or did the other contestants, voting anonymously, have ulterior motives for burying her with faint praise?

The true test of research quality is replication.  Science doesn’t determine the truth by having an anonymous committee vote on what is true.  Science identifies the truth by replicating past experiments, applying them to new situations, to see if the results continue to hold up. 

I’m pleased to say that several pieces of my work have been successfully replicated.  By successful replication I mean that the basic findings are upheld.  Replicators almost always make new and different choices about how to handle data or run an analysis.  The question is whether the same basic conclusion is found even when those different choices are made.

The evaluation I did with Paul Peterson and Jiangtao Du of the Milwaukee voucher experiment was successfully replicated by Cecilia Rouse.  The evaluation I did of the Charlotte voucher program was successfully replicated by Josh Cowen.  My study of Florida’s A+ voucher and accountability program was successfully replicated three times — by Raj Chakrabarti; Rouse, et al; and West and Peterson.  And my graduation rate work has been successfully replicated by Rob Warren and Chris Swanson.

The interesting thing is that every one of my studies above was initially released without peer review.  And every one of them was attacked for being unreliable because it was not peer reviewed.  When they were all later published in peer-reviewed journals (except the grad rate work) and successfully replicated, I don’t remember ever hearing anyone retract their accusations of unreliability.

(edited for typos)

4 Responses to Replication, The True Test of Research Quality

  1. Greg Forster says:

    Of course, if the accusations were motivated by genuine concern about scientific reliability, they would have been retracted – or at the very least the same people wouldn’t continue to make such reckless accusations about new work after having seen that the old work was vindicated in spite of their reckless accusations.

    By amazing coincidence, my own contribution to the debate over peer review has just been published on Mark Steyn’s site this morning (backstory here and here). I’m honored and humbled to have my comments featured on Mr. Steyn’s site.

  2. You might have also mentioned two other problems with peer review as the gold standard.  First, most reviewers *try* to be objective, but research by my colleague at UVa, Tim Wilson, and others shows that people are more critical of work that conflicts with their own and more lenient on work that agrees with theirs.  Still, there are some checks in the system to try to solve that problem: (1) authors can request that known “enemies” not serve as reviewers, and (2) a decent editor will not use a reviewer again who is obviously unfair.  The second problem is not so easily solved: some journals are just lousy, and even though they are peer reviewed, they will publish almost anything.
     I agree with your basic point — replicability is crucial.

  3. Patrick says:

    Well said.

  4. Thanks, Dan. I agree that this is not a comprehensive list of the limitations of peer review. Thanks for adding these points.
