Administrative Bloat in the WSJ

December 30, 2012

This weekend the Wall Street Journal had a front page piece detailing how administrative bloat in higher education is causing costs to spiral higher.  The piece carefully dissects hiring patterns at the University of Minnesota to illustrate their general conclusion that:

Across U.S. higher education, nonclassroom costs have ballooned, administrative payrolls being a prime example. The number of employees hired by colleges and universities to manage or administer people, programs and regulations increased 50% faster than the number of instructors between 2001 and 2011, the U.S. Department of Education says. It’s part of the reason that tuition, according to the Bureau of Labor Statistics, has risen even faster than health-care costs.

This conclusion is not new, but it is great to see it getting the attention it deserves.  As regular readers of JPGB will remember, Brian Kisida, Jonathan Mills, and I released a report on administrative bloat at the nation’s 200 leading research universities.  That report elicited a tizzy fit from Arizona State University President Michael Crow, including a letter to the chancellor of my university accusing me of academic fraud.  University leaders who should be the guardians of academic freedom are too often its greatest oppressors.

But the obvious facts about administrative bloat cannot be suppressed.  Johns Hopkins professor Benjamin Ginsberg recently published an excellent book on the topic.  And despite a lousy attempt by the professional association of State Higher Education Officers to spin the data, a subsequent analysis by the Pope Center successfully replicated our results and confirmed our conclusions.

And now we have a front page article in the Wall Street Journal reporting the same thing.  Michael Crow might try writing an angry letter to the editor, but university leaders won’t be able to shut this story down.  The good university leaders are already taking steps to rein in runaway non-instructional, non-research costs.  See, for example, Erskine Bowles’ efforts at the University of North Carolina or the new leadership at the University of Minnesota.  The bad university leaders will bluster, browbeat, and continue to expand universities beyond their core missions of teaching and research.


Head Start Manipulating Scumbags

December 20, 2012

I’ve heard that the latest round of results from the federal evaluation of Head Start is due to be released tomorrow afternoon.  And my psychic powers tell me that the results will show no lasting benefit from Head Start, just like the two previous rounds of results.

You heard that right — the federal government is releasing results that the administration dislikes on a Friday afternoon just before Christmas.  They might as well put the results on display in a locked filing cabinet in a disused lavatory behind the sign that says “beware of the leopard.”

Why is the Department of Health and Human Services burying this study just like they delayed, buried, or distorted the previous ones?  Well, because the study is an extremely rigorous and comprehensive evaluation, involving random assignment of a representative sample of all Head Start students nationwide, that I expect will find no enduring benefits from this program that politicians, pundits, and other dimwits constantly want to expand and fund.  Anyone who casts doubt on think tank research should cast a critical eye toward gross manipulations and abuse of research that are perpetrated by the federal government.

I should repeat that the researchers have done an excellent job evaluating Head Start in this case.  It is the bureaucratic class at the Department of Health and Human Services who have cynically manipulated, delayed, and misreported this research.  The pending report has already been delayed for several years.  The decision to release it on the Friday afternoon before Christmas is completely calculated.

I don’t know your names, but I’m going to invest a little energy in tracking down who is responsible for this cynical abuse of research.  If there were any reporters worth their salt left out there, they would bother to expose you, but I guess that job has now been passed to bloggers and enterprising individuals.  When I do find your names I will post them so folks can know who the scumbags are who think they can manipulate the policy community by delaying, burying, or misreporting research.  And then when you get hired by that DC think tank, advocacy organization, or other waste of space, we’ll be able to remember who you are and assign no credibility to what you have to say.  These kinds of dastardly acts by public servants should not be cost free, and if I have any say in the matter they will not be in this case.


A Modest Proposal on State Standards

December 19, 2012

You can check in any time you like, and you can always leave.

(Guest Post by Matthew Ladner)

A few years ago, while serving as a VP at the Goldwater Institute, I received a request to come out hard against the adoption of Common Core standards in Arizona. I didn’t know whether it would have mattered or not, but the request originated from people whom I continue to hold in great respect. I considered the matter very carefully.  I had deep misgivings regarding Common Core at the time, the most serious of which concerned the governance of the standards over time. I was of the opinion that, unless Ben Bernanke took up the task of governing the standards, Common Core would inevitably result in the Great American Dummy Down.

Nevertheless, in the end I decided not to oppose Arizona’s adoption of Common Core standards.  Regardless of how bad Common Core started out or later became, Arizona simply had nothing to lose.  Arizona had just about every testing problem you could imagine: dummied-down cut scores, massive teaching to the test items, and something at least in the direct vicinity of outright fraud by state officials regarding the state’s testing system. Our state scores had “improved” substantially through a combination of lowered cut scores and teaching to the test items, but NAEP showed Arizona scoring below the national average on every single test and making precious little progress. The status quo was worse than a waste of time.

I spent some years repeatedly pointing out these enormous flaws in the Arizona testing system. I was not willing to turn around and wrap myself in the Arizona flag to pretend these tests and standards were somehow sacred because they were developed out here in our humble patch of cactus. Now, if I were living in one of the states with high and rising NAEP scores and cut scores near NAEP proficiency, my calculus would have been quite different. I would have died on a hill fighting the adoption of Common Core.

Very few states, however, qualify for this lofty status. Most state standards and tests qualified as meh or worse than meh. I decided that if I were to draw up a list of the top 10 education problems facing Arizona, Common Core adoption wouldn’t make the list.

Arizona adopted Common Core as a direct response to the prospect of getting Race to the Top money, which we ultimately did not win. Common Core, however, remains the default, and quite frankly, the main arguments being made against it these days are not compelling enough to make many reasonable people want to reject it. To briefly summarize:

1. The United States Supreme Court decision on Obamacare fundamentally altered the odds of a “lock in.” A few years ago, murmuring in Washington raised the eerie prospect of making major federal education spending programs like Title I contingent on Common Core adoption. Not only did this not happen, but the Supreme Court enormously complicated the already dim prospects for such a move. My reading of the Obamacare decision is that it would in fact be unconstitutional to deny Title I funds to a state choosing not to participate in Common Core.

Congress could in theory come up with a new funding stream for purposes of bribing/incentivizing state action, or could even perhaps pass a tax upon the citizens of states not adopting Common Core a la the individual mandate. Let’s face it, though: one can only describe the prospects of either of these things happening as quite dim, somewhere in the vicinity of an extinction-inducing asteroid strike in the short to medium term.

States therefore remain free to drop Common Core at their leisure. The dozen or so states that have won RTTT money might face some delays in doing so, but Common Core is hardly an issue that any President is likely to call out the National Guard over.  States voluntarily joined (albeit with many seeking RTTT money), but they also remain free to withdraw. This is fundamentally different from the old “Fiscal Blackmail” scenarios of 55-mile-per-hour speed limits and 21-year-old drinking ages. States can leave Common Core without federal penalty.

The Obamacare decision also largely addresses the chief concern that I have expressed: a great national dummy down of the Common Core. If it happens, states can leave. Whether the threat of exit will itself lean against a dummy down is less clear.

2. The latest fad to sweep the Common Core debate involves horrified concerns that Common Core is going to drive literature out of schools. I don’t, however, presume to know the “right” balance of fictional and informational texts, and like most scare stories, there is less to this one than meets the jaundiced eye that sees everything as yellow.

People do have varying preferences over such things, though, making these sorts of disagreements inevitable. Still, this is nothing close to compelling enough to make me want to switch Arizona back to the failed AIMS regime.

Common Core opponents therefore have a fundamental problem: Common Core is now the default in 45 states, and superficial scare stories may be jolly good fun to spread but aren’t likely to prove to be of much utility. Opponents should consider a new strategy. I suggest a Constructive Vote of No Confidence.

Common Core opponents have painted themselves into a corner of being de facto in favor of preserving joke standards and tests, including some that you can pass by signing your name while blindfolded. The way to escape this trap is not just to be against Common Core but in fact to be in favor of something else. Something better.

In short, if I were sitting on the State Board of Education in Arizona and someone brought a motion to pull Arizona out of the Common Core effort and revert to our bad-joke status quo, I would vote no. If, however, the suggestion were that we pull out of Common Core and instead adopt the Massachusetts standards, I could very comfortably vote yes.

Mind you, it would be a struggle to adopt the MA standards in AZ, and we might not prove up to the task. The same is true of Common Core. Plus, the MA standards are battle-tested, and I would prefer to have a group of people running the show whom I can actually talk to, beat up in the press, and vote against. Democracy has its faults, but I’ll take my chances with it.

Regardless of which side of the Common Core debate you stand on, you should not labor in defense of the indefensible status quo of many state testing regimes. Last year, for instance, the Mississippi legislature debated charter school legislation. Suburban superintendents were able to exclude their districts and then ultimately kill the legislation based upon the rather incredible notion that their fantastic districts did not need charter schools. Suburban Mississippi imagines itself to be in possession of “good schools” which would be threatened by charters, you see.

Examination of the studies comparing NAEP and state tests, however, shows that you can pass the Mississippi 4th grade reading test as “proficient” with a score equivalent to 163 on the NAEP. That is far lower than the lowest score in the recorded history of the troubled Washington, DC district (179), which is itself unbelievably pathetic.  The Mississippi testing system is not only failing to produce improvement; it is best understood as a gigantic fraud in which taxpayer dollars are actively used to lull Mississippians into a false sense of security.

Common Core is hardly an ideal strategy to deal with this problem, and there are any number of ways that it could fail. Opponents should not, however, lose sight of the fact that horrible state tests and standards represent a very real problem. A constructive vote of no confidence has the potential to create a respectable alternative to Common Core, one which would in fact fulfill the main purpose of Common Core.

A Guide for the Perplexed — A Review of Rigorous Charter Research

December 17, 2012

(Guest Post by Collin Hitt)

So you say charter schools don’t work. That’s an empirical claim. It needs to be backed up by evidence. Here’s a helpful guide to the most rigorous research available. Once you’ve tackled this material, you’ll be in position to prove your point.

As you probably know, the gold standard method of research in social science is called random assignment. Charter schools are particularly well-suited for random assignment evaluations, since they’re usually required by law to admit students by lottery. The lotteries are fair to families – that’s why they’re put in place. But they also allow researchers to make fair comparisons between students who win or lose lotteries to attend charter schools.
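
To make the logic concrete, here is a minimal, purely illustrative sketch of why lotteries enable fair comparisons. It is not code or data from any of the studies reviewed below; the numbers are invented. Because winners and losers differ only by chance, a simple difference in mean outcomes, expressed in standard deviation (sd) units of the control group, estimates the effect of being offered a charter seat.

```python
# Illustrative only: made-up outcomes for lottery winners and losers.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test scores for lottery winners (offered a charter seat)
# and lottery losers (not offered a seat).
winners = rng.normal(loc=0.15, scale=1.0, size=500)
losers = rng.normal(loc=0.00, scale=1.0, size=500)

# "Intent-to-treat" effect: difference in means, expressed in sd units
# of the control (loser) group -- the metric the studies report.
effect_sd = (winners.mean() - losers.mean()) / losers.std(ddof=1)
print(f"Estimated effect of winning the lottery: {effect_sd:.2f} sd")
```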

To date, nine lottery-based evaluations of charter schools have been released. Let’s go through them, starting with the earliest work.

The first random assignment study of charter schools was released in 2004 by Caroline Hoxby and Jonah Rockoff. It focused on Chicago International Charter School. After three years, charter students had significantly higher reading scores, equal to 3.3 to 4.2 points on 100-point rankings.  Gains were even stronger for younger students.

That same year, the University of California San Diego released a study of the Preuss charter school located on the university’s campus. Test scores for charter students appeared unchanged, but the school improved college-going rates by 23 percent: 90 percent of Preuss juniors were headed to four-year colleges.

So the first two random-assignment studies of charter schools won’t help your point. They find gains for charter schools. But those studies are becoming dated; most of the national charter boom has occurred since they were published. Also, the San Diego study employs few statistical controls. So these studies don’t disprove your point either. Let’s review the newer stuff.

In 2010, Harvard’s Will Dobbie and Roland Fryer released a study of the Harlem Promise Academy. Entering kindergartners experienced large gains by the third grade, sufficient to eliminate the black-white achievement gap, equal to 0.58 standard deviations (sd) in reading and 0.49 sd in math. Students who entered Harlem Promise Academy in early middle school saw smaller gains that nevertheless by the eighth grade closed the achievement gap in math and reduced it by half in reading.

Later in 2010, researchers from MIT, Harvard and Michigan released a study of KIPP Academy in Lynn, Massachusetts. The charter school is part of the national charter network, the Knowledge is Power Program (KIPP). After a single year in the school, students saw achievement gains of 0.12 sd in English and 0.35 sd in math.

And earlier this year, researchers from Yale and Brown released a study of an unnamed charter network in an anonymous school district. There were no visible math gains for charter students, but they did see awfully big reading gains of 0.35 sd and writing gains of 0.79 sd.

Charter advocates will point to these studies to try to prove you wrong, since these charter schools are definitely working. In turn, you could attempt to discredit the statistical math of the authors above. (Good luck.) Or you could make a more obvious point: these studies together look only at five charter operators. There are hundreds of charter operators across the country. The researchers could be cherry-picking – studying schools that they suspected beforehand were high-performing.

Larger random-assignment studies could address these issues, if they looked at a wider number of charter schools. Luckily, we’ve got four studies that do just that, all of them fairly recent.

The first is a 2009 study led by Caroline Hoxby. It examines practically every charter school in New York City. For every year students were enrolled in a charter school, they saw 0.04 sd gains in reading and 0.09 sd gains in math. The findings here are similar to the middle school gains that Fryer found at Harlem Promise Academy, though the citywide charter gains are clearly smaller than the Promise Academy’s extraordinary gains for kindergarteners.

Later that year, a citywide study of Boston charter middle and high schools found that charters produced “extraordinarily large” gains, according to the authors, who were based at Duke, Harvard, MIT and Michigan. After only one year, Boston’s charter high schools produced gains of 0.16 sd in reading and 0.19 sd in math. Charter middle schools in the city produced similar reading gains of 0.17 sd and a remarkable 0.54 sd in math.

In 2010, the US Department of Education released the first nationwide random-assignment study of charter middle schools. It contained two useful findings. Charter schools in affluent areas produced lower results than neighboring schools, which makes some sense: charter schools in the suburbs are competing with higher-quality schools than are found in the inner cities. Charter schools in urban areas, enrolling a large percentage of poor students, posted significant gains in math equal to 0.18 sd over two years.

In 2011, the team behind the 2009 study of Boston charter schools presented findings from a statewide evaluation of Massachusetts charter middle and high schools. Overall, results were positive. As with the USDOE study of middle schools, they found that charter schools in non-urban areas produced no positive gains. On the other hand, schools located in urban areas produced middle school gains of 0.12 sd in English and 0.33 sd in math and high school gains of 0.33 sd in English and 0.39 sd in math. These gains almost perfectly mirror the findings at KIPP Lynn, which is one of many schools included in the statewide sample.

Harlem Promise Academy and KIPP, then, produced results fairly similar to those of other charter schools nearby. So any allegation of cherry-picking in the studies of those two schools will need to be dropped.

Altogether, these studies reach remarkably similar findings: urban charter schools are producing significant gains in reading or math, or both. Suburban charter schools perform less well – you could cite this fact, but frankly this is a minor concern in the battle to close the racial achievement gap in American education.

You could make a methodological point: lottery studies don’t tell us about students who never participated in lotteries. In other words, what about students who never signed up for charter schools, who don’t have charter schools in the area, or who signed up for a charter school that didn’t need to run a lottery? Some researchers use less-rigorous “observational” methods to answer these questions.

Indeed, many of the studies above include secondary observational studies to test the validity of this very argument. They look at similar but artificial comparison groups of non-charter students who for unknown reasons didn’t enroll in lotteries. Those secondary analyses broadly confirm the main random-assignment findings.

Altogether, the best research tells a consistent story: charter schools are working. In order to find much evidence to the contrary, you’ll need to dig into third- or fourth-tier research. And you’ll need to invent a justification to ignore the random assignment literature, though you probably shouldn’t bother. Relying solely on third-rate research simply says that you were never interested in evidence in the first place.


Refuting Rauch and EPI on the Economics of Productivity

December 12, 2012

[Image: Hard Work U]

(Guest post by Greg Forster)

Many readers of JPGB will be familiar with the hard-left, union-friendly Economic Policy Institute. A recent article by Jonathan Rauch uses some EPI graphs to argue that the U.S. economy no longer rewards working-class employees for productivity. Over on Hang Together, I say the graphs are deceptive. The problem is a decline in the productivity of the workers, caused by – JPGB readers will be shocked – lousy K-12 schools (and also a loss of the older religious work ethic).

If you’re familiar with EPI’s work, you won’t be surprised – Jay, Marcus and I took on some very shoddy work they did on teacher pay back in Education Myths.


Florida Crushes the Ball on Progress in International Literacy Study

December 11, 2012

(Guest Post by Matthew Ladner)

The 2011 TIMSS and PIRLS results were released today, covering a variety of subjects. This time a handful of states were brave enough to volunteer for a pullout of their results. Here are the results on 4th grade reading:

[Chart: PIRLS 2011 4th grade reading results]

Here are the pullouts:

[Chart: PIRLS 2011 4th grade reading results with state pullouts]

You got it: Florida students notched the second-highest score in the world. Even above (gasp!) Finland.

Late for a meeting. More later, but for now:

BOOOOOOOOOOOOOOOM!!!!!!!!!


Weingarten Has a Great Idea!

December 10, 2012

[Image: Lisa Simpson “keep out” sign]

(Guest post by Greg Forster)

What a shock – Randi Weingarten wants to solve the teacher quality crisis with higher barriers to entry. Because unions never erect barriers to entry for a profession in order to fatten themselves by exploiting the weak and vulnerable.

Weingarten’s article opens with yet another sign that we’re winning: “Every profession worth its salt goes through such periods of self-examination. That time has come for the teaching profession.” Yes, it sure has!

But you know, maybe this is a good idea. Hey, Randi, how about this: we institute a bar exam for teachers and then anyone who passes the exam is allowed to teach. What do you say to that?


Global Report Card 2.0

December 10, 2012

With help from my colleagues Josh McGee and Jonathan Mills, I’ve produced for the George W. Bush Institute an updated version of the Global Report Card.  The Atlantic is hosting the Global Report Card 2.0 on its web site and has a nice piece about its release today.

And click here to see coverage of last year’s Global Report Card 1.0.  And here is a video of Bob Costrell and me discussing the GRC.


The Moynihan Corollary to Baumol’s Cost Disease

December 10, 2012

(Guest Post by Matthew Ladner)

Over at the Ed Fly Blog I discuss the Moynihan Corollary to Baumol’s Cost Disease, my theory that Moynihan intended to leverage Hillarycare for welfare reform before killing it, and more on the failure of the staffing-bloat-as-ed-improvement strategy.


Rigorously Studying Cultural Education

December 6, 2012

In my last post I mentioned a large-scale random assignment study of the effects of school tours of an art museum that I am conducting with my colleagues, Brian Kisida and Dan Bowen.  Some people have asked for more information about that project.  So, here is a brief summary of what we are doing in that study as well as some related projects examining cultural education.

The random assignment study of field trips was made possible by the fact that the Crystal Bridges Museum of American Art opened in Northwest Arkansas, an area that had never before had a major art museum.  Because there was intense interest from schools in the area in having school tours, there were many more applicants for field trips than the museum could accommodate right away.  We worked with the museum to randomly assign tours to applicants.

Specifically, the museum received more than 300 applications for tours during the first semester.  We organized those applicants into matched pairs, which were often adjacent grades in the same school or the same grade in different schools with similar demographic characteristics.  We then randomly assigned one school in each matched pair to be the treatment group and one to be the control group.  We randomly ordered the matched pairs and the museum scheduled the first 55 treatment groups for school tours last spring.  The 55 matched control groups were guaranteed a tour during the next semester for participating in the study.
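
For readers curious about the mechanics, here is a rough sketch of how this kind of matched-pair random assignment can work. It is only an illustration: the school groups, pairing rule, and seed below are hypothetical, not our actual applicant data or code.

```python
# Purely illustrative sketch of matched-pair random assignment; the applicant
# groups and pairing below are made up for demonstration.
import random

random.seed(42)

# Applicant groups, assumed already ordered so that similar groups
# (adjacent grades in a school, or similar schools) sit next to each other.
applicants = [
    {"school": "School A", "grade": 4},
    {"school": "School A", "grade": 5},
    {"school": "School B", "grade": 3},
    {"school": "School C", "grade": 3},
]

# Form matched pairs from adjacent applicant groups.
pairs = [applicants[i:i + 2] for i in range(0, len(applicants), 2)]

# Flip a coin within each pair: one group tours now (treatment),
# the other is guaranteed a tour next semester (control).
assignments = []
for pair in pairs:
    treated_index = random.choice([0, 1])
    for j, group in enumerate(pair):
        condition = "treatment" if j == treated_index else "control"
        assignments.append({**group, "condition": condition})

for a in assignments:
    print(a)
```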

We then administered surveys to the randomly assigned treatment and control group students and teachers a few weeks after the treatment group visited the museum.  Those surveys were designed to measure five types of outcomes: 1) whether the school tour helped create cultural consumers (students who want to return to museums and engage in other cultural activities), 2) whether the school tour helped create cultural producers (students who want to make art), 3) whether the school tour increased student knowledge about art and history, 4) whether the school tour improved student critical thinking about works of art, and 5) whether the school tour altered student values, like empathy and tolerance.

We have already collected results from almost 6,000 K-12 students and teachers from 80 different schools during last spring’s research.  This fall we are adding another 4,000 students and teachers to the study from another 60 or so schools.  When it is all done and analyzed, it will probably be the biggest, most comprehensive, and most rigorous examination of the effects of school tours of an art museum.

As part of the study we are also asking students in grades 3-12 to write short essays in response to paintings that they have probably never seen before to assess how they critically analyze a new work of art after they’ve had a school tour of an art museum.  Last semester we coded almost 4,000 essays in response to Bo Bartlett’s painting, The Box, which was pictured in my previous post.  This semester we wanted to try something a little more abstract, so we will be coding another 2,500 or so essays in response to Marsden Hartley’s painting, Eight Bells Folly, which is pictured above.  Dan Bowen has taken the lead in the coding and analysis of these essays and will soon be on the job market in case anyone is looking for a great and innovative researcher to hire.

There are obvious limitations to our study.  We can only measure short term effects since the control group receives the treatment the following semester.  And we can only measure a limited set of outcomes from an art experience.  But we will know a whole lot more and with higher confidence than we do now.

We are also conducting two studies with the Walton Arts Center, which is a performing arts theater in Fayetteville, Arkansas.  In one study we are working with our colleague in the music department, Lisa Margulis, to learn about the effects of information in program notes on students’ experiences during school field trips to see performances.  We are randomly assigning students to receive program notes with information about the show they are seeing or “placebo” program notes that do not tell them about the show they are seeing.  The question is whether information alters the experience.

And in the other study with the Walton Arts Center we are surveying more than 2,000 7th grade students in area schools to link the past performances they have seen on school field trips to their current behaviors as cultural consumers and producers as well as some empathy and tolerance outcomes.  We are also going to use attendance zone boundaries as an exogenous source of variation to make stronger causal claims about how past school field trips may have contributed to current behaviors and attitudes.

We are also in talks with various folks about additional studies, all of which will use random assignment or similarly rigorous methods.  This line of work is particularly exciting because there is a limited amount of rigorous research out there on how school cultural activities affect students.
