Moneyball Comes to the Ivory Tower
University professors have had the luxury of thinking they are influential without having any way to prove or disprove it — which means that everyone could be a big fish in their small pond. But suppose you were a university president, dean, or simply a rich donor. Could you deploy resources to attract undervalued professors and build a faculty that would punch above its salary? In other words, if Harvard is the Yankees, could you build a faculty like the Oakland A’s?
In our recent paper, a Ph.D. student at Berkeley and I propose a way to measure faculty quality by counting up how many times professors are cited by other professors. This has been a controversial way to measure quality, but it is more objective than impressionistic opinions about who is smart and productive. In just a weekend, it has been downloaded at a faster rate than almost any other paper I’ve written — which may also tell you a lot about professors.
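For readers who want the mechanics, here is a minimal sketch of the counting idea in Python; the professors and citation pairs are invented for illustration, and the paper’s actual methodology is more refined:

```python
from collections import Counter

# Hypothetical citation records: (citing professor, cited professor),
# as one might extract them from law-review footnotes.
citations = [
    ("Prof. A", "Prof. B"),
    ("Prof. C", "Prof. B"),
    ("Prof. B", "Prof. A"),
    ("Prof. A", "Prof. A"),  # a self-citation
]

# Count citations received, skipping self-citations so a professor
# is credited only when *other* professors cite her.
scores = Counter(cited for citing, cited in citations if citing != cited)

for prof, n in scores.most_common():
    print(f"{prof}: cited {n} time(s)")
```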
It seems like the value of this measure would be damaged by its being public. If professors know that citation counts are the coin of the realm, it will change how they cite. (This assumes the measure were actually being used to determine administrative decisions.)
Two questions:
1) Are legal journals not already included in citation services such as Scopus or Thomson ISI Web of Knowledge?
2) Is this new ranking scheme Dr. Yoo’s counterpunch to Richard Epstein’s “surface area / perimeter” redistricting supercomputer?
I would rather grade the faculty by how many times their students are cited. I don’t really know how these two incompatible disciplines – research and teaching – managed to get conflated, but the unfortunate students get the short end by being forced to pay for research when they thought they were supposed to be getting an education.
So far as I can tell, about one academic paper in a thousand has any actual impact on anyone (don’t even get me started on Ph.D. theses). I hope the success ratio of educating students is better than that.
They just downloaded the PDF to search for their own names, not to actually read it.
It is a fundamentally flawed metric if your goal is to improve an institution’s teaching rather than its research prestige. It would be a good metric if your goal is to improve research. Research and teaching are two separate skills, yet universities promote based on research, not teaching ability. Some of the worst professors I had were big into research, and all the best teachers I had did not do research. One first needs a decent measurement of teaching outcomes to rate the quality of teachers, which we don’t have in higher education at the individual-teacher level.
Baseball has plenty of them.
Sounds like one more way to reinforce and reward groupthink.
Once any structure is created, the imaginative will find a way to maximize its benefit as needed. “You scratch my back…er, cite my paper…I’ll cite yours.”
And couldn’t a highly competent Administrator *coughkagancough* “persuade” the system’s participants to create best advantage?
[Eeyore is not a lawyer, nor has he played one on TV]
Sticking with the baseball motif, is there anything preventing professors from colluding to cite each other specifically to drive up their citation counts?
As an aside, the real lesson of Moneyball was that the A’s never actually won anything, but it did generate a lot of self-promotion for Beane.
From skimming the article and from several LawTalk podcasts, I think the main goal is actually a market-oriented one: to improve a law school’s US News ranking, and thus increase demand for the school among potential law students.
Which could be even more perverse, if law school applicants are unwittingly favoring schools which put more emphasis on research and less on teaching.
If such a metric becomes accepted its originator shall necessarily score well on it.
I bow down before your cleverness, Professor Yoo!
Your post made me laugh. Thanks!
John — Your model is superior to that of measuring books and articles that no one reads or cites, but it still places a premium on research over teaching. In terms of the real problems facing higher education it represents a baby step in the right direction, but still perpetuates a model where undergraduate tuition subsidizes everything but undergraduate instruction.
Speaking from the sciences, I know that the main way they judge your productivity is your ability to bring in grants to the institution. I have heard of professors here at the U of C being denied tenure and pushed out because of lack of funding. I don’t think I really like this, though. In the sciences, this focus on grants and publishing (publishing in high-impact journals is how you get grants) leads, I think, to professors focusing less on students and teaching. In turn, university labs become science sweatshops crammed with Chinese and Indian postdocs, and the science departments become mere machines for the acquisition of federal grants.
I find it interesting that, using that kind of metric, if the bloggers of the Volokh Conspiracy were a law school faculty, they’d be one of the best-ranked law schools in the country.
No. They’re related. Both require that one be a good explainer.
But this metric: is it really new? In biochemistry 30 years ago, we all knew, and very reasonably respected, what the Science Citation Index did. In biochemistry, at least, it really is significant if other folks aren’t citing you. It means your research has failed to spawn more research. If you truly had the last word, the issue you were exploring must not have been very lively.
Isn’t this essentially the same concept as PageRank, the algorithm that powers Google? The basic idea there is that Google crawls the web and counts the links between pages: the more pages that link to yours (and the more important those linking pages themselves are), the more important yours is judged to be, and the higher it appears in search results.
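For anyone curious, here is a toy power-iteration sketch of that idea in Python; the link graph is made up, and Google’s production algorithm is of course far more elaborate. Substitute professors for pages and citations for links and you have the faculty version:

```python
# Adjacency list: page -> pages it links to (hypothetical toy graph).
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

damping = 0.85  # probability of following a link vs. jumping to a random page
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):  # iterate until the ranks stabilize
    new_rank = {}
    for p in pages:
        # Rank flows in from every page q that links to p, with each q
        # contributing its own rank divided by its number of outgoing links.
        inbound = sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        new_rank[p] = (1 - damping) / len(pages) + damping * inbound
    rank = new_rank

for p, r in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{p}: {r:.3f}")
```

Note that, unlike a raw citation count, this weights each incoming link by the importance of its source, so a citation from a heavily cited scholar counts for more than one from an obscure one.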
This has the problem that all faux objective measures have: the targets can move. If you create incentives for lots of mutual cites, you’ll get them. Whole journals or sub-genres will grow that will provide the cites that faculty need to pad out their vita. In fact they will be forced to seek out these cites because they are competing against others doing the same thing.
The question left unanswered is who came out ahead, Yoo or Epstein.
Another question: does citing an infamous professor count?
One time I was congratulating a professor friend of mine from a major research university who had won a teaching award, and he told me the first piece of advice he received when he arrived there was, “don’t win any teaching awards.” He said it is simply assumed that a good teacher is not sufficiently focused on his research.
Productive professors? Isn’t that a contradiction in terms?
I kid, of course.
No. They’re related. Both require that one be a good explainer.
I don’t agree. In the humanities, research-based scholarship is defined as trying desperately to say something new about something inevitably old. It leads to silliness, all in the name of trying to secure a career. And just because teaching and writing share one skill doesn’t mean they require all of the same skills. The current model of undergraduate education needs to be thoroughly rethought, and while John’s suggestion is cogent, it is not nearly enough.