(Guest Post by Jonathan Butcher)
Last week, a South Carolina education blog called "The Voice for School Choice" posted links to an article on the worst schools in the U.S. South Carolinians should be particularly irked, because 11 SC schools made the list of 25. All is not what it seems, though; the results of the article ("25 Worst Performing Public Schools in the U.S.") deserve a healthy dose of salt. At issue is not the intelligence or ability of the authors; rather, they make very strong claims about the significance of their findings, and readers should be aware of the foundation on which those claims about student achievement rest.
"Worst Schools" was composed by a website called "Neighborhood Scout" and published on a financial blog operated by AOL called "WalletPop." Neighborhood Scout specializes in "nationwide relocation software, retail site selection, and real estate investment advertising." It is not an academic department at a university or a policy research institution, and its founders do not have backgrounds in education or education policy research. The founders' specialties are geography, computer mapping, and web design (there is no evidence that the article's authors are different from the people described on Neighborhood Scout's web page).
Neighborhood Scout created its own methodology for the "Worst Schools" article. The authors subtracted the percentage of students who "passed" NAEP in a given state (presumably students scoring at proficient or above, though it could mean basic or above) from the "average percentage" of students in the same state who scored at the proficient or advanced level on the state's mandatory test. Their objective was to find schools in states with a large gap between the percent proficient on the state test and the percent proficient on NAEP, and to use that gap to judge the difficulty (or lack thereof) of the state test. The article does not compare similar student populations, as NAEP does, or at least the methodology section gives no indication of such disaggregation.
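To make the mechanics of this concrete, here is a minimal sketch of what the article's calculation appears to be. This is not Neighborhood Scout's actual code, and all figures below are invented for illustration; it simply computes the state-test-minus-NAEP gap the methodology describes, under the assumption that "passed" means proficient or above.

```python
# Sketch of the apparent "Worst Schools" methodology (not the authors' code).
# Assumption: "passed" on NAEP means scoring at proficient or above.

def state_test_gap(state_proficient_pct, naep_proficient_pct):
    """Percentage-point gap between the share proficient on the state's
    own test and the share proficient on NAEP for the same state."""
    return state_proficient_pct - naep_proficient_pct

# Invented illustrative figures, NOT real state data:
states = {
    "State A": {"state_test": 75.0, "naep": 30.0},
    "State B": {"state_test": 40.0, "naep": 33.0},
}

for name, pct in states.items():
    gap = state_test_gap(pct["state_test"], pct["naep"])
    print(f"{name}: gap = {gap:.1f} points")
```

A large gap (State A) is read by the article as evidence of an easy state test; note that nothing in this calculation says anything about any individual school's student population or trajectory.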
Of note, the study gives no indication of having been peer-reviewed, and peer review is a standard robustness check even for research reports not submitted to journals. In addition, the study is a snapshot of test scores: it does not account for improvement over time or for changes in the student population, nor does it compare scores to any baseline indicator. For example, over the past three years, 6th graders at W.A. Perry (one of the SC schools in the bottom 25) have gone from 48% meeting or exceeding state standards in math to 66%. They are still below the state average, but more students are meeting or exceeding state standards now than three years ago. Similar results can be found in English/Language Arts.
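The snapshot-versus-trend point can be put in a few lines, using the W.A. Perry math figures cited in the text (48% three years ago, 66% now). The state-average figure here is a hypothetical placeholder for illustration, not the actual SC average.

```python
# W.A. Perry 6th-grade math, from the text: 48% -> 66% over three years.
earlier, latest = 48.0, 66.0
state_average = 70.0  # hypothetical placeholder, NOT the real SC average

# The snapshot lens (the article's): only the latest number vs. the average.
snapshot_verdict = "below state average" if latest < state_average else "at or above average"

# The trend lens (which the article omits): the change over time.
trend = latest - earlier

print(snapshot_verdict)                 # prints "below state average"
print(f"+{trend:.0f} points in 3 years")  # prints "+18 points in 3 years"
```

Both statements are true at once, which is exactly why a one-point-in-time ranking can mislabel an improving school.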
Admittedly, W.A. Perry's 6th graders' scores are below the state average; however, they are making progress. My aim is not to defend schools that may be low-performing, but a snapshot of a school's test scores at one point in time does not a failing school make. NCLB agrees with me: under the law, a school must be designated in need of improvement for three years before significant intervention takes place.
Additionally, the article gives no indication of the student populations these schools serve. For example, Milwaukee Spectrum School (#25) serves a total population of 90 at-risk students who had records of truancy at other schools. The school is often a last stop for students ready to drop out of high school altogether. Of course the school is struggling; it is intended to serve struggling students.
In the article, schools serving different grade levels are ranked against one another: high schools are compared not just to other high schools but to elementary and middle schools as well. This is a problem because the general trend on NAEP is that a larger share of elementary students score proficient than middle school students, and a larger share of middle school students than high school students (and this holds across subjects).
Further, scores are not reported for every grade in every subject. So a high school with low-scoring 11th graders may land on the "Worst Schools" list just ahead of a middle school that has low-scoring 8th graders but a class of 6th graders scoring much closer to the state average.
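A tiny sketch shows how this kind of ranking goes wrong. This is not the article's actual procedure, and every figure is invented; it just ranks schools by whatever single grade each one happens to report, which is what the mixed-grade comparison amounts to.

```python
# Hypothetical sketch of a naive mixed-grade ranking. All figures invented.
# Each school is judged only by the one grade it reports, even though
# typical NAEP proficiency rates differ by grade level.
schools = [
    {"name": "High School X",   "reported_grade": 11, "proficient_pct": 25.0},
    {"name": "Middle School Y", "reported_grade": 8,  "proficient_pct": 28.0},
]

# Lowest percent proficient ranks "worst", regardless of grade level.
ranked = sorted(schools, key=lambda s: s["proficient_pct"])
for s in ranked:
    print(f'{s["name"]} (grade {s["reported_grade"]}): {s["proficient_pct"]}%')
```

Here the high school ranks "worse" even though 11th graders typically post lower proficiency rates than 8th graders on NAEP, so the comparison penalizes it for the grade it serves, not for its performance relative to peers.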