Rankings Revised

Rick Hess and Daniel Lautzenheiser have devised a ranking of the “public presence” of education academics.  They developed a “7 item scoring rubric [that] reflects a given scholar’s body of academic work—encompassing books, articles, and the degree to which these are cited—as well as their footprint on the public discourse in 2010.”

There is always something arbitrary and crappy about these rankings, but Rick is right when he argues, “For all their imperfections, I think these [ranking] systems convey real information—and do an effective job of sparking discussion (about questions that are variously trivial and substantial).”  Recognizing that these kinds of rankings are part recreation and part reality, I’ve made a slightly revised ranking presented below (with help from Misty Newcomb).

One of the problems with the ranking Daniel and Rick developed is that it combines some measures that accumulate over one’s career with other measures that only count accomplishments in the last year.  The career measures, Google Scholar and books published, will tend to be higher for people who have had longer careers.  Given that the ranking is meant to capture the current influence of education academics, these career items are biased in favor of senior scholars whose work may have been influential in the past, but less so in the present.

A more junior colleague pointed out this distortion to me, so I have tried to standardize the Google Scholar and book measures so that those with longer careers would have no particular advantage.  In particular, I calculated the sum of the two “career measures” — Google Scholar and books published.  Then I divided that sum by the years since the scholar received his or her terminal degree.  And to ensure that books and articles would still have the same weight in the overall score, I multiplied by the mean number of years since degrees were earned, about 23.2.

In making this adjustment I am assuming that every scholar would maintain the same rate of book and article productivity over his or her entire career.  So, the book and article “public presence” in the past year would be in proportion to the total book and article production per year over an entire career.

I make no changes to the 5 other measures in Daniel and Rick’s ranking: current Amazon sales as well as mentions in the education press, blogs, newspapers, and Congressional Record.  All of those measures reflect current “public presence.”  Adding the adjusted two career measures to these annual measures we get an adjusted total score.
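The adjustment described above can be sketched in a few lines of code. This is just an illustration of the arithmetic, using made-up scores; the function name and the sample values are hypothetical, not Hess and Lautzenheiser’s actual data.

```python
# Mean years since terminal degree across the ranked scholars, per the post.
MEAN_YEARS_SINCE_DEGREE = 23.2

def adjusted_score(google_scholar, books, years_since_degree, annual_measures):
    """Return a career-adjusted total 'public presence' score.

    google_scholar, books -- the two career-accumulating measures
    years_since_degree    -- years since the scholar's terminal degree
    annual_measures       -- measures that already cover only the past year
                             (Amazon sales plus mentions in the education
                             press, blogs, newspapers, Congressional Record)
    """
    career = google_scholar + books
    # Per-year productivity, rescaled by the sample mean so the career
    # measures keep the same weight they had in the original total.
    adjusted_career = career / years_since_degree * MEAN_YEARS_SINCE_DEGREE
    return adjusted_career + sum(annual_measures)

# A hypothetical junior and senior scholar with identical career totals:
junior = adjusted_score(30, 2, 8, [5, 4, 3, 2, 1])
senior = adjusted_score(30, 2, 40, [5, 4, 3, 2, 1])
print(junior > senior)  # True: the junior scholar now scores higher
```

Note that a scholar whose career length equals the sample mean is unaffected by the adjustment, which is the intended behavior.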

Making the adjustment for length of career does not alter who is at the very top of the rankings.  As you can see below, Diane Ravitch and Linda Darling-Hammond still rule the roost.  But there are some significant changes below that, where more junior scholars jump in the rankings and more senior scholars drop.  For example, Martin West leaps to 10th place from his previous ranking of 69th, surpassing his mentor, Paul Peterson, who drops from 5th to 11th.  Roland Fryer moves up to 3rd from 11th.  Jacob Vigdor rises to 16th from 43rd.  Susanna Loeb goes to 18th from 49th.  Matthew Springer rises to 29th from 74th.  And Brian Jacob, Jonah Rockoff, and Sara Goldrick-Rab all jump almost 30 places.

On the other hand, some more senior scholars decline significantly in their public presence ranking once we make this adjustment.  Gene Glass sinks from 20th to 50th.  Henry Levin falls from 17th to 52nd.  David Berliner drops from 19th to 57th.  Kenneth Zeichner moves from 30th to 62nd.

These changes make sense and I think improve Rick and Daniel’s ranking.  Hotshot researchers like Roland Fryer, Jacob Vigdor, Susanna Loeb, Matthew Springer, Brian Jacob, Jonah Rockoff, and Sara Goldrick-Rab are having a large impact on current education policy discussions even though their careers have not been long enough to accumulate a longer list of books and articles.  The original ranking shortchanged these scholars in measuring their current “public presence.”

At the same time, more senior scholars, like Gene Glass, Hank Levin, David Berliner, and Kenneth Zeichner may have been given too much credit by the old ranking system for books and articles that were influential in the past but do not give them as much of a public presence in recent policy debates.

Of course, of greatest interest to me was what happened to my ranking.  I moved up to 21st from 39th.  This must be a better ranking.

Click on the images below to see the original and adjusted results for all 89 education academics that Rick and Daniel included in their “super-sized” ranking.  Have fun and, as David Letterman would say, please… no wagering.


9 Responses to Rankings Revised

  1. Are my eyes failing? Where’s Myron Lieberman?

  2. Hi Malcolm,

    Rick and Daniel restricted their list to scholars at universities. They didn’t include any think tankers, such as Checker Finn or Rick himself. I’m just using their list.

  3. […] This post was mentioned on Twitter by Scott McLeod and Liam Goldrick, Jay P. Greene. Jay P. Greene said: Rankings Revised: http://wp.me/peH0y-1Ry […]

  4. Greg Forster says:

    Jay, I heard somewhere that education books are dead. Why include number of books as a measure? 🙂

  5. In the piece I moderated my claim to say that they are only mostly dead. : )

  6. oo, I moved up. These rankings *are* better. Nice job, Jay!

  7. A colleague just made an excellent critique of my adjustment:

    “A rating system is odd if it treats as equivalent a professor with one year of experience who has a google rating of 1 (1 article once cited) as equivalent to a 40 year professor with a google rating of 40, which requires a minimum of 1600 citations (and probably hundreds or thousands more).”

    Hmmmm, I think he has a good point. Fixing this would involve changing the Google Scholar measure and not just applying an adjustment. Maybe the lesson is that there is no particularly good way to rank folks.

  8. Greg Forster says:

    Uh, or you could accept that it’s not, in fact, unfair to acknowledge the reality that the hot young kid who’s all the rage this week does not have quite the same kind of importance as a thirty-year scholar with a long list of books and citations.

    What you want here is two rankings – a “this year’s media hotness” ranking and a “long term scholarly snoozefest” ranking. Then people can combine the two directly in whatever way they wish, based on how much relative importance they give to each one; stuck-up snobs can ignore the media-hotness rankings and snot-nosed kids can ignore the scholarly rankings, and everyone else can give them relative weights in between.

    Jay, this is the part where you say “control-G.”

  9. phillyclick says:


    […]Rankings Revised « Jay P. Greene's Blog[…]…
