Moneyball Comes to the Ivory Tower

University professors have had the luxury of thinking they are influential without having any way to prove or disprove it — which means that everyone could be a big fish in their small pond.  But suppose you were a university president, dean, or simply a rich donor.  Could you deploy resources to attract undervalued professors and build a faculty that would punch above its salary?  In other words, if Harvard is the Yankees, could you build a faculty like the Oakland A’s?

In our recent paper, a Ph.D. student at Berkeley and I propose a way to measure faculty quality by counting up how many times professors are cited by other professors.  This has been a controversial way to measure quality, but it is more objective than impressionistic opinions about who is smart and productive.  In just a weekend, it has been downloaded at a faster rate than almost any other paper I’ve written — which may also tell you a lot about professors.
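The measure described above can be sketched in a few lines. This is a toy illustration, not the paper's actual methodology; the names and citation pairs are made up, and the exclusion of self-citations is an assumption about how such a tally would sensibly work:

```python
from collections import Counter

# Hypothetical (citing professor, cited professor) pairs,
# as might be extracted from article footnotes.
citations = [
    ("Smith", "Jones"), ("Lee", "Jones"),
    ("Jones", "Jones"),  # self-citation, excluded below
    ("Smith", "Lee"),
]

# Tally how often each professor is cited by *other* professors.
counts = Counter(cited for citing, cited in citations if citing != cited)
print(counts.most_common())  # [('Jones', 2), ('Lee', 1)]
```

The single filter `citing != cited` is doing real work here: as several commenters note below, the interesting design questions are about which citations should count at all.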

  1. mask

    It seems like the value of this measure is damaged by its being public.  If professors know that citations are the coin of the realm, then it will change how they cite.  (This assumes the measure were in actual use to determine administrative decisions.)

  2. Mendel

    Two questions:

    1) Are legal journals not already included in citation services such as Scopus or Thomson ISI Web of Knowledge?

    2) Is this new ranking scheme Dr. Yoo’s counterpunch to Richard Epstein’s “surface area / perimeter” redistricting supercomputer?

  3. Larry3435

    I would rather grade the faculty by how many times their students are cited.  I don’t really know how these two incompatible disciplines – research and teaching – managed to get conflated, but the unfortunate students get the short end by being forced to pay for research when they thought they were supposed to be getting an education. 

    So far as I can tell, about one academic paper in a thousand has any actual impact on anyone (don’t even get me started on Ph.D. theses).  I hope the success ratio of educating students is better than that.

  4. EJHill
    John Yoo:  In just a weekend, it has been downloaded at a faster rate than almost any other paper I’ve written — which may also tell you a lot about professors. 

    They just downloaded the PDF to search for their own names, not actually read it.

  5. Brian Clendinen

    It is a fundamentally flawed metric if your goal is to improve an institution’s teaching and not its research prestige.  It would be a good metric if your goal is to improve research.  Research and teaching are two separate skills, yet universities promote based on research, not teaching ability.  Some of the worst professors I had were big into research, and all the best teachers I had did not do research.  One first needs a decent measurement of teaching outcomes to rate the quality of teachers, which we don’t have in higher education at the individual-teacher level.


    Baseball has plenty of them.

  6. Guruforhire

    Sounds like one more way to reinforce and reward group think.

  7. Eeyore

    Once any structure is created, the imaginative will find a way to maximize its benefit as needed. “You scratch my back…er, cite my paper…I’ll cite yours.”

    And couldn’t a highly competent Administrator *coughkagancough* “persuade” the system’s participants to create best advantage?

    [Eeyore is not a lawyer, nor has he played one on TV]

  8. Whiskey Sam

    Sticking with the baseball motif, is there anything preventing professors from colluding to cite each other specifically to drive up their citation counts?

    As an aside, the real lesson of Moneyball was the A’s never actually won anything, but it did generate a lot of self-promotion for Beane.

  9. Mendel
    Brian Clendinen: It is a fundamentally flawed metric if your goal is to improve an institution’s teaching and not its research prestige.  It would be a good metric if your goal is to improve research.

    From skimming the article and from several LawTalk podcasts, I think the main goal is actually a market-oriented one: to improve a law school’s US News ranking, and thus increase demand for the school among potential law students.

    Which could be even more perverse, if law school applicants are unwittingly favoring schools which put more emphasis on research and less on teaching.

  10. Stephen Dawson

    If such a metric becomes accepted, its originator will necessarily score well on it.

    I bow down before your cleverness Professor Yoo!

  11. hazel krabinski

    Your post made me laugh.  Thanks!

  12. Trace

    John — Your model is superior to that of measuring books and articles that no one reads or cites, but it still places a premium on research over teaching. In terms of the real problems facing higher education it represents a baby step in the right direction, but still perpetuates a model where undergraduate tuition subsidizes everything but undergraduate instruction.

  13. Valiuth

    Speaking from the sciences, I know that the main way they judge your productivity is your ability to bring in grants to the institution.  I have heard of professors here at the U of C being denied tenure and pushed out because of lack of funding.  I don’t think I really like this, though.  In the sciences, I think this focus on grants and publishing (publishing in high-impact journals is how you get grants) leads to professors focusing less on students and teaching.  In turn, university labs become science sweatshops crammed with Chinese and Indian postdocs, and the science departments become just machines for the acquisition of federal grants.

  14. Amy Schley

    I find it interesting that, using that kind of a metric, if the bloggers of the Volokh Conspiracy were a law school faculty, they’d be one of the best-ranked law schools in the country.

  15. John H.
    Brian Clendinen: Research and teaching are two separate skills 

    No. They’re related. Both require that one be a good explainer.

    But this metric: is it really new? In biochemistry 30 years ago, we all knew, and very reasonably respected, what Science Citation Index did. In biochemistry at least, it really is significant if other folks aren’t citing you. It means your research has failed to spawn more research. If you had the truly last word, the issue you were exploring must not have been very lively.

  16. Joseph Stanko

    Isn’t this essentially the same concept as PageRank, the algorithm that powers Google?  The basic idea there is that Google crawls every page on the web and counts the links between pages, and the more pages that link to yours, the more important yours is, and so the higher it appears in Google search results.
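The comparison is apt. A minimal power-iteration sketch of the PageRank idea (the graph, function name, and damping value are illustrative; this is not Google's production algorithm, which also handles dangling pages and far larger graphs):

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}
    for _ in range(iterations):
        # Every page keeps a small baseline score (the "teleport" term)...
        new_rank = {page: (1.0 - damping) / n for page in pages}
        # ...and passes the rest of its current score to the pages it links to.
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy citation network: A and B both "cite" C, so C ranks highest.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # C
```

The analogy to faculty citations is direct: a citation from a highly cited professor would count for more than one from an obscure journal, which is exactly the refinement a raw citation count lacks.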

  17. Arjay

    This has the problem that all faux objective measures have: the targets can move.  If you create incentives for lots of mutual cites, you’ll get them.  Whole journals or sub-genres will grow that will provide the cites that faculty need to pad out their vita.  In fact they will be forced to seek out these cites because they are competing against others doing the same thing.

  18. 10 cents

    The question left unanswered is who came out ahead, Yoo or Epstein.

    Another question is whether citing an infamous professor counts.

  19. Fredösphere
    Brian Clendinen: Some of the worst professors I had were big into research, and all the best teachers I had did not do research.

    One time I was congratulating a professor friend of mine from a major research university who had won a teaching award, and he told me the first piece of advice he received when he arrived there was, “don’t win any teaching awards.” He said it is simply assumed that a good teacher is not sufficiently focused on his research.

  20. Pat in Obamaland

    Productive professors? Isn’t that a contradiction in terms?

    I kid, of course.