Ranking the Unmeasurable

Today's Inside Higher Ed has a story about growing resistance to the US News rankings:

In the wake of meetings this week of the Annapolis Group -- an organization of liberal arts colleges -- critics of the U.S. News & World Report college rankings are expecting a significant increase in the number of institutions where presidents pledge not to participate in the "reputational" portion of the rankings or to use scores in their own promotional materials.

A majority of the approximately 80 presidents at the meeting said that they did not intend to participate in the U.S. News reputational rankings in the future. Some of those presidents may have previously endorsed the movement, so the exact increase is uncertain; Annapolis Group leaders said that they expected individual presidents to announce their own decisions.

Interestingly, yesterday's big story was about how we don't actually know anything about the students we admit:

A major study released Monday by the University of California suggests that high school grades may be good at predicting not only first-year college performance, as commonly believed, but performance throughout four undergraduate years. The same study suggests that the SAT adds little predictive value to admissions decisions and is hindered by a high link between SAT scores and socioeconomic status -- a link not present for high school grades.

And further, the study finds that all of the information admissions officers currently have is of limited value, and accounts for only 30 percent of the grade variance in colleges -- leaving 70 percent of the variance unexplained.

It's an interesting combination, because the admissions story is a nice demonstration of the problem the anti-US News people are talking about. Our proxy measurements of student quality all suck, and they end up being a big chunk of the rankings. You're not going to be able to make fine distinctions between institutions using measurements that leave 70% of the variance unexplained.
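To get a feel for what that means, here's a quick back-of-the-envelope sketch (my own toy numbers, not the UC study's actual data) of what explaining 30% of the variance actually buys you:

```python
# Back-of-the-envelope sketch with made-up numbers (NOT the UC study's data):
# a predictor that explains 30% of the variance in college grades barely
# narrows the spread of outcomes.
import random
import statistics

random.seed(1)
n = 10_000

signal = [random.gauss(0, 1) for _ in range(n)]  # the part admissions can see
noise = [random.gauss(0, 1) for _ in range(n)]   # the 70% nobody can explain

# Construct "college GPA" so the visible signal carries exactly 30% of the variance
gpa = [(0.30 ** 0.5) * s + (0.70 ** 0.5) * e for s, e in zip(signal, noise)]
prediction = [(0.30 ** 0.5) * s for s in signal]  # best guess from the predictor
residuals = [g - p for g, p in zip(gpa, prediction)]

print(statistics.pstdev(gpa))        # spread knowing nothing: ~1.0
print(statistics.pstdev(residuals))  # spread after predicting: ~0.84
```

Explaining 30% of the variance shrinks the spread of any individual student's outcome by only about 16% (since the square root of 0.70 is roughly 0.84), which is why fine-grained distinctions built on these proxies are mostly noise.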

Of course, the SAT and graduation rate data are like laser spectroscopy compared to the "reputational" portion of the rankings, which is the thing that the Annapolis Group presidents are talking about boycotting. This is a completely ridiculous process, in which the top two or three academic officers of every college are sent a survey and asked to rank other colleges. This accounts for something like 25% of the ranking all by itself, which more or less guarantees that the strongest correlation between any objective measure and a given school's ranking will be with the previous year's ranking.
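To see how strong that feedback loop can be, here's a toy simulation. This is entirely my own construction -- the weights, noise levels, and "quality" scores are made-up assumptions, not the actual U.S. News methodology -- but it shows what happens when a reputation term simply echoes last year's published rank:

```python
# Toy simulation of ranking inertia -- my own made-up model, NOT the actual
# US News methodology. Schools have a fixed hypothetical "quality"; each
# year's objective measures are quality plus noise; an optional reputation
# term (25% of the score) simply echoes last year's published rank.
# Requires Python 3.10+ for statistics.correlation.
import random
import statistics

def ranks(scores):
    """Each school's rank under `scores` (0 = best)."""
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    out = [0] * len(scores)
    for rank, i in enumerate(order):
        out[i] = rank
    return out

def year_to_year_corr(rep_weight, n=200, years=20, seed=2):
    random.seed(seed)
    quality = [random.random() for _ in range(n)]     # hypothetical true quality
    reputation = [random.random() for _ in range(n)]  # arbitrary starting values
    prev, corrs = None, []
    for _ in range(years):
        objective = [q + random.gauss(0, 0.3) for q in quality]  # noisy measures
        score = [(1 - rep_weight) * o + rep_weight * r
                 for o, r in zip(objective, reputation)]
        current = ranks(score)
        if prev is not None:
            corrs.append(statistics.correlation(current, prev))
        reputation = [1 - r / n for r in current]  # echo the published rank
        prev = current
    return statistics.mean(corrs)

print(year_to_year_corr(0.25))  # with the reputational echo: stickier
print(year_to_year_corr(0.0))   # objective measures only: less sticky
```

In runs of this sketch, the reputational echo noticeably inflates the year-to-year persistence of the rankings beyond what the noisy objective measures alone produce -- exactly the self-fulfilling loop the boycotting presidents are objecting to.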

Elsewhere in blogdom, Mark Kleiman snarks at the Annapolis Group:

Boycotting the U.S. News college rankings is a fine idea. But that survey filled a void. Either the colleges themselves or some friendly foundation needs to write a reasonable ranking system, collect the data, and publish the results.

I was going to be snarky in return, because the next paragraph of the Inside Higher Ed piece I linked above is:

At the same time, the Annapolis Group formally endorsed the idea of working with the National Association of Independent Colleges and Universities and the Council of Independent Colleges to create "an alternative common format that presents information about their colleges for students and their families to use in the college search process." The idea is to create online information with "easily accessible, comprehensive and quantifiable data."

It's not Mark's fault, though-- the New York Times piece he used as a source doesn't bother to mention the endorsement of alternate rankings. So, boo for the Paper of Record.


It is sad that the only information on colleges that many prospective students and their families see is a ranking from some magazine.

It doesn't help matters that so many colleges are being run more as businesses than as institutions of learning with some level of academic standards. Knowing how so many colleges are run, is it any surprise that we see five-year graduation rates of 30% at some schools? While there is nothing inherently wrong in developing a curriculum that is incredibly demanding, there is something extremely wrong in the current admissions process and organization of many schools.

Perhaps, rather than pointing a finger at a magazine that actually has a responsibility to its shareholders to make a profit, college administrations should look at their own practices first. Be they Ivy League, state-run, or simply private schools, it is the practices of the "industry" that have brought this on.

Not all schools are guilty of the "enroll everyone, their brother, and their dog" approach, obviously. This can be seen in how some of the schools with the most demanding curricula have the highest graduation rates. One item of interest, though, is that these same schools tend to offer the least variation in the courses a student takes in their first year. I believe this is one of the best things that can be done, considering the process that leads up to college in the first place. A demanding course schedule that immediately shows students what can honestly be expected of them means the few chaff that begin with the wheat either drop out immediately or knuckle down and fly right.

As a very interesting idea, I liked how MIT offers its entire 1,800-course curriculum online free of charge. Want to know what your work will be like? Want to see what topics you will be introduced to? Here it is. That, in my humble opinion, was a very innovative idea.