The Problem of Rankings

Matthew Yglesias has a couple of posts on opposition to the US News college rankings, the first noting the phenomenon and the second pointing to Kevin Carey's work on better ranking methods. The problem is that I think he somewhat misses the point of the objections.

Matt writes:

All that said, the very best way to deal a death-blow to this scheme would be for America's colleges and universities to work together and with third parties to try to come up with some meaningful metrics for higher education performance. All magazines make lists, but the reason the college rankings are such a hit is that there's nothing out there. Ordinal rankings are inherently kind of dumb, but higher education leaders both can and should come up with some kind of theory about what service they're providing to students and some method of measuring how well they're doing it. Since the schools don't do anything like this themselves, and since their lobbyists are wildly opposed to having the government do it, the upshot has been to outsource the function to a struggling newsmagazine that deploys screwy formulae to boost sales.

I think this involves a misreading of the objection to the US News rankings. The problem is not just that the formula is screwy, the problem is that some people reject the entire premise of the process: that there is some globally meaningful ranking of colleges.

I think the problem many academics have with the US News rankings is not so much that they dislike the weights of the individual components, or the specific ordering of the results, but that they dislike the entire idea. Education is a highly individual thing, and a college that is the "best" by some set of pseudo-objective standards may not turn out to be a good fit for any particular student. By creating rank-ordered lists of schools, we end up driving some students to seek out places that aren't actually good for them, just for the "prestige" associated with attending a higher-ranked school.

I think this is probably somewhat overblown-- students make college decisions on the basis of factors that are much sillier than the US News rankings, and most of them turn out all right-- but it's not just a matter of jealousy on the part of lower-ranked schools. Some of the schools protesting the rankings-- Dickinson and Reed colleges, for example-- do just fine in the rankings.

Personally, I think the sort of things Kevin Carey talks about in the Washington Monthly article Matt cites are perfectly reasonable, and would have no real problem with those data being available. But it's important to note that Matt's suggestion that colleges come up with their own rankings isn't likely to carry much weight with the people objecting.

To put it in political terms, telling these people "Make up your own rankings, then" is a little like telling anti-war Democrats "Fine, if you don't like the surge plan, come up with your own plan to put more troops in Baghdad." The problem isn't that they dislike the details of the implementation, so much as that they reject the whole concept.


I agree completely.

I would say, though, that the problem is not entirely overblown. Yes, students are smarter about selecting schools than simply looking at US News rankings, but those rankings do have an effect. More important, though, is how seriously administrations take those rankings, and the secondary effects that has on how colleges and universities are run. Chasing the rankings can be very destructive. I know that six or seven years ago it was a destructive force in my department vis-à-vis perceptions about the NRC rankings.

-Rob

There was a fake news article in the paper here for April Fools that said we fell something like ten spots in the US News rankings. I ended up getting into a big argument with a friend about how much it mattered: he said we'd lose funding and applicants; I said there was no way in hell it would change anything.

I can only speak for myself, but I didn't even look at those things. Everyone already knows that Harvard and Yale and Princeton and Duke and Stanford are more or less the top 5, and that's not going to change much.

Besides, the most important things to kids are intangibles like location, or practical things like cost. Those rankings are essentially meaningless, and everyone should know it. Did I come here because we're tied for number three in US News? Hell no, and no one should.

I think that some colleges, those that are decent but overlooked and not elite, will start publishing their own outcomes measures, and whatever other metrics they find suitable.

When (say) College of St. Rose can say "our average graduate makes $x in their field after graduation" and ask why (say) Union College isn't telling you how its graduates fare, you'll feel the pressure. The 'elite' schools will have to quantify what it is they are doing that is worth the extra money.

By Upstate NY (not verified) on 23 May 2007

I went to a little liberal arts college that was 2nd tier in the rankings. The place had mostly excellent faculty and students who ranged from very impressive to pretty crappy. But I think what hurt us most in those rankings was the fact that endowment size is incredibly (and oddly) important. My school was pretty broke.

I think college rankings can be distracting in an entirely different way than the Carey article suggests. A primary factor determining the usefulness of your college education is the amount of hard work you invest. Carey's mutual fund analogy illuminates the danger of this distraction: in a mutual fund, the money doesn't care whether it produces good returns or not. The money will only be as good as its managers. A student who invests minimally in a school ranked well by any metric will have worse returns than a student who fully embraces their chosen college.

By Jason Slaunwhite (not verified) on 23 May 2007