(Guest post by Brian Kisida)
Last week the Feds released the latest NAEP assessment of students' understanding of U.S. history. It contained a mostly negative assessment of history knowledge, including tidbits like this: only nine percent of fourth-graders could identify a photograph of Abraham Lincoln and give two reasons why he is an important historical figure. You know the drill: First, act shocked that our students did so poorly; wring your hands a bit; blame your favorite thing/organization/political movement for creating this travesty; and then finish by lamenting the eventual end of democracy and civilized society as we know it (plenty of people will also tell you the end of civilization can be avoided, of course, if we give schools additional resources or adopt national standards). Everyone's doing it, from the folks over at Fordham to Diane Ravitch. Diane says she's worried because when it comes to our high school seniors, "all of these students will be voters in a year." Well, not if 200+ years of voter-turnout data have anything to say about it.
Another annoying thing about all of the hand-wringing coverage generated by these types of reports is the way people discuss NAEP's outcome measures, such as "Basic," "Proficient," and "Advanced," as if they're entirely objective. Here's an excerpt from Ravitch's statement on the issue:
“It’s worth noting that of the seven school subjects tested by NAEP, history has the smallest proportion of students who score Proficient or above in the most recent results available. Among twelfth graders, for example, only 12 percent reach Proficient in U.S. history, compared to 21 percent in science, 24 percent in both civics and writing, 25 percent in geography, 26 percent in mathematics, and 38 percent in reading.”
Or take, for example, the Boston Globe, which concluded from the same data that:
“In fact, American kids are weaker in history than in any of the other subjects tested by the NAEP — math, reading, science, writing, civics, geography, and economics.”
It’s as if they think NAEP’s outcome categories were set by the International Committee on Weights and Measures using specific gravity and atomic clocks. They weren’t. They are arbitrary categories determined by “experts,” and they certainly aren’t comparable across subjects. We can’t conclude that students are doing worse in history than they are in math or English simply by looking at proficiency rates.
The results are, however, comparable across time. When viewed longitudinally, there are a few positives in this latest report. Scores for eighth-graders were up across the board, and scores for Black and Hispanic eighth-graders were especially positive, significantly narrowing the White-Black test score gap.
However, as we’ve seen time and time again with NAEP results, twelfth-graders aren’t budging. And at the end of the day, if twelfth-graders are stagnating, then gains for eighth-graders are largely irrelevant.
To be honest, I think it’s difficult to gauge the state of history education based upon NAEP’s measures, or based upon shoddy attempts by others to interpret them. I really don’t know, for example, exactly how many fourth-graders should be able to tell me the importance of Abraham Lincoln. What I do know, and what I find disturbing, is that we have continued to allocate more resources to high school history over the same period in which high school scores have remained flat. As the NAEP report points out, the share of schools offering A.P. U.S. history courses has risen from 51 percent in 1990 to 80 percent in 2009. And the percentage of students who have taken an A.P. history class has more than doubled since 1990. You would think that would lead to some observable gains for high-schoolers.