Research Roundup

Two new studies deserve your attention.  The first is a follow-up on the random-assignment evaluation of charter schools in Boston led by Josh Angrist at MIT.  Here’s a summary from the press release:

“This study builds on earlier work using admissions lotteries which showed impressive short-run achievement gains for those randomly offered seats at Boston’s charter schools.  As the applicants have grown older, it’s now become possible to measure longer-term outcomes like SATs and college going,” said [MIT professor, Parag] Pathak.

The study shows that there are large positive effects of charter high schools on grade 10 MCAS for both English and Math.  Students are more likely to meet MCAS-based graduation requirements, earn eligibility for Adams scholarships, and score in the Proficient or Advanced categories.  Charter students are also more likely to take AP exams, though the score gains on the tests they take are modest.  SAT composite scores increase by 100 points and SAT Math scores increase by 50 points.

Early evidence on college outcomes shows no overall effect on college enrollment, but a marked shift from two-year colleges to four-year colleges.  Charter high schools cause students to enroll in four-year public colleges, with many applicants enrolling at public institutions within Massachusetts.

And the other is a quick analysis by Marty West on the Education Next blog that shows families are not fooled by inflated scores on state tests with weak performance standards.  Parental assessment of school quality tracks NAEP results.  This undermines the case for Common Core assessments as an antidote to the misinformation produced by lousy state tests.  I’ll let Marty explain it:

There’s no doubt that the definition of proficiency in many states provides a misleading view of the extent to which students are prepared for success in college or careers.  Yet whether the way in which states define proficiency matters for student achievement is far from clear.  As Tom Loveless demonstrated in the 2012 Brown Center Report on American Education, the rigor of state proficiency definitions is largely unrelated to the level of student achievement on the NAEP across states.  Similarly, Russ Whitehurst and Michelle Croft have shown that the quality of state standards (as assessed by third-party organizations) is unrelated to NAEP scores, a finding confirmed by the Harvard Kennedy School’s Josh Goodman in an analysis that examined the effects of changes in the quality of standards within states over time.  The lack of a systematic relationship between either the rigor or the quality of state standards and student achievement casts doubt on claims that higher and better standards under the Common Core will, in and of themselves, spur higher student achievement.

Less attention has been paid to whether the rigor of state standards matters for public perceptions of the quality of the schools in their states and local communities.  If using a more lenient definition of proficiency leads citizens to evaluate their schools more favorably, then the advent of common expectations under the Common Core could alter public perceptions quite dramatically – perhaps increasing pressure for reform in regions of the country in which state proficiency definitions have provided an inflated view of student accomplishment.  Is such an outcome likely?

To shed light on this question, I use data from two surveys conducted in 2011 and 2012 under the auspices of Education Next and the Program on Education Policy and Governance at Harvard University.  In each year, my colleagues and I asked a nationally representative sample of roughly 2,500 Americans to grade the public schools in their local community on a standard A-F scale.  In the figures below, I examine whether the average grade the residents of each state assigned to their local schools is associated with the share of 2011 8th graders deemed proficient by the state’s own test and by the NAEP.  To the extent that differences in the definition of proficiency from one state to the next interfere with citizens’ ability to discern the performance of their local schools, we should see that the average grades citizens assign their schools hew more closely to proficiency rates as determined by state tests than by the NAEP.

The figures demonstrate the opposite….

A simple regression of the average grades citizens assign to local schools in each state on NAEP and state proficiency rates simultaneously confirms that average grades (1) are strongly correlated with NAEP proficiency rates and (2) after controlling for NAEP proficiency rates, have no relationship whatsoever with proficiency rates on state tests.  An increase in NAEP proficiency rates of 32 percentage points – the difference between Washington DC and Massachusetts – is associated with an increase in citizen ratings of more than half of a letter grade.  Holding NAEP scores constant, a difference in state test proficiency rates matters not at all.

In short, this evidence suggests that Americans have been wise enough to ignore the woefully misleading information about student proficiency rates generated by state testing systems when forming judgments about the quality of their state’s schools.  This does not mean that they ignore state testing data altogether.  Indeed, Matthew Chingos, Michael Henderson and I have shown that, within a given state, the grades citizens assign to specific elementary and middle schools are highly correlated with state proficiency rates in those schools.  Nor does it necessarily imply that information from the NAEP has a causal effect on perceptions of school quality.  The relationship between NAEP performance and the grades citizens assign their schools could easily be driven by other variables, such as the prosperity level of the state, that influence student achievement levels and could also influence school grades.  Yet it does suggest that the implementation of the Common Core, by providing information about performance against a common standard, may have less of an impact on public perceptions of school quality than many have projected.
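
To make the mechanics concrete, here is a minimal sketch of the two-regressor setup West describes, written in Python with synthetic placeholder data.  His actual Education Next/PEPG survey numbers are not reproduced in this post, so the variable names, the grade coding, and the simulated relationship are illustrative assumptions rather than the study's data.

```python
# A minimal sketch of the kind of regression West describes: state-level
# average citizen grades regressed on NAEP and state-test proficiency rates
# simultaneously.  All numbers below are synthetic placeholders -- the
# actual Education Next / PEPG survey data are not reproduced here.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_states = 51  # 50 states plus DC

# Hypothetical inputs: percent of 8th graders deemed proficient (0-100).
naep_prof = rng.uniform(20, 55, n_states)
state_prof = naep_prof + rng.uniform(5, 45, n_states)  # inflated state rates

# Code letter grades numerically (A=4 ... F=0), so "half a letter grade"
# is a 0.5 change in the outcome.  The simulated grades depend on NAEP
# proficiency only, mimicking the reported finding.
avg_grade = 1.5 + 0.02 * naep_prof + rng.normal(0, 0.15, n_states)

df = pd.DataFrame({"avg_grade": avg_grade,
                   "naep_prof": naep_prof,
                   "state_prof": state_prof})

# Under West's finding, naep_prof gets a positive, significant coefficient
# while state_prof's coefficient is indistinguishable from zero.
model = smf.ols("avg_grade ~ naep_prof + state_prof", data=df).fit()
print(model.summary().tables[1])
```

Coding grades on a four-point scale is what lets an effect be read as "half a letter grade": in a regression like this, West's result corresponds to a positive coefficient on the NAEP rate (about 0.5 over a 32-point NAEP gap) and a near-zero coefficient on the state-test rate.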

And that’s all we have for this roundup.  Yeeehaw!


One Response to Research Roundup

  1. matthewladner says:

    Those charts are very interesting. Over-confidence rather than NAEP scores seems to drive evaluations in Alaska, and note that people in Mississippi and Delaware have similar views of their schools.

    I’d like to see Marty continue to investigate this.
