NRC: the future of rankings

Two takeaways from the NRC rankings are that a) it took so long that the results are dated, and people will selectively use or ignore them as suits them best (falling back on the 1995 rankings instead, I gather),
and b) the process was so hard and unpleasant it will never be done again...

Hmm, that sounds familiar.
We can fix that.

See, the arduous part of the NRC was the data mining - gathering the metrics after they'd been defined.
It took a long time and required iterations and debate.

But, this is precisely the sort of thing that can be automated.
At least in large part.
e.g. publication rates are public - in astronomy they are trivially accessible through ADS; in other fields the data can be, and routinely is, extracted for all sorts of reasons.
Same with citations.
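As a sketch of the automated piece: the ADS search API really does expose affiliation and refereed-paper fields, though treat the exact query syntax below as an assumption, and the `pubs_per_faculty` helper is purely illustrative (the program names and counts are made up).

```python
import urllib.parse

# Public ADS search endpoint; real queries need a Bearer API token.
ADS_API = "https://api.adsabs.harvard.edu/v1/search/query"

def ads_query_url(institution, year_range=(2005, 2010)):
    """Build an ADS search URL counting refereed papers affiliated with an
    institution. Field names (aff, property, year) follow ADS search syntax,
    but are assumptions here - check the ADS docs before relying on them."""
    q = 'aff:"{0}" property:refereed year:{1}-{2}'.format(
        institution, year_range[0], year_range[1])
    return ADS_API + "?" + urllib.parse.urlencode({"q": q, "rows": 0})

def pubs_per_faculty(pub_counts, faculty_counts):
    """Publication rate per program: papers divided by faculty headcount.
    Programs with unknown or zero faculty counts are skipped."""
    return {prog: pub_counts[prog] / faculty_counts[prog]
            for prog in pub_counts if faculty_counts.get(prog)}
```

Citations would work the same way - swap the query field, keep the plumbing.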

Federal grants are mostly public, browse Fastlane or NSPIRES and count 'em.

Some things are harder - like faculty composition, or student gender and ethnicity.
A lot can be inferred from web pages and annual reports.

But, this is still a pretty straightforward Google project.

A 10%er at Google could probably hack something together this week, and have something robust and dynamic by Christmas.

Yes: a dynamically generated, continually updated, customizable NRC-like ranking of your own.

Start with the easy bits, like faculty count, student count, graduation rates and publications/citations.
Prioritize metrics known from the current NRC study to be heavily weighted; don't try to do all fields - start with the ones that have already-public metrics and in which those metrics change rapidly.
Then increment - add more metrics, add more fields.

With all the data there, it is then trivial to do your own weighting and generate your own correlations.
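A minimal sketch of the "your own weighting" step, with everything (metric names, programs, values) hypothetical: normalize each metric to [0, 1], take a weighted sum, and sort.

```python
def rank_programs(metrics, weights):
    """Rank programs by a weighted sum of min-max normalized metrics.
    `metrics` maps program -> {metric_name: value}; `weights` maps
    metric_name -> weight. Higher score ranks first."""
    names = list(weights)
    lo = {m: min(v[m] for v in metrics.values()) for m in names}
    hi = {m: max(v[m] for v in metrics.values()) for m in names}

    def score(vals):
        s = 0.0
        for m in names:
            span = hi[m] - lo[m]
            # Degenerate metric (all programs equal) contributes nothing.
            s += weights[m] * ((vals[m] - lo[m]) / span if span else 0.0)
        return s

    return sorted(metrics, key=lambda p: score(metrics[p]), reverse=True)
```

Change the weights dict and you get a different ranking from the same data - which is the whole point.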

Best of all, with the NRC data public, it can be used to initialize the database - so you don't have to do it ab initio, just do a rolling update of the NRC numbers (retaining the old numbers, of course, for sanity checks and to measure gradients).
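The rolling update with retained history might look like this - a sketch only, with hypothetical program names and values (the seed snapshot stands in for the public NRC numbers):

```python
def rolling_update(history, new_values, year):
    """Append a new snapshot of a metric for each program, keeping the old
    numbers so year-over-year gradients can be measured. `history` maps
    program -> list of (year, value), seeded from the public NRC data."""
    for prog, value in new_values.items():
        history.setdefault(prog, []).append((year, value))
    return history

def gradient(history, prog):
    """Change per year between the last two snapshots; None if there
    aren't yet two data points for the program."""
    pts = history.get(prog, [])
    if len(pts) < 2:
        return None
    (y0, v0), (y1, v1) = pts[-2], pts[-1]
    return (v1 - v0) / (y1 - y0)
```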

This would be totally awesome and absolutely terrifying - a dynamic public database of all graduate programs in the US, exposing all strengths and weaknesses.

Now, if I only knew someone at Google who might think this was a good laugh...


I am biased because my alma mater (Caltech) and my father's (Harvard) always rank very well. However, I agree that the methodology, and the foundational issues hiding behind the methodology, are seriously flawed.

At the primary and secondary school level, there is an intense debate in the nation's #2 (by size) public school system, Los Angeles Unified School District (LAUSD). Schools have been rated for a long time, and internal data rated teachers as well. The Los Angeles Times printed the previously hidden ratings for primary school teachers. Chaos ensued. Late last week, a primary school teacher allegedly committed suicide over an unfavorable rating. What is the equivalent of a university committing suicide over a poor NRC ranking?

Program closure, with possible termination of tenure for affiliated faculty.

This is happening, and barring a miracle, will happen more.
NRC rankings will weigh, possibly heavily, on decisions about closure and cuts.

Google offers small research awards. Sounds perfect for one of those.

Though I'll bet it's really not so easy: for example, faculty counts aren't really obtainable in an automated fashion (I mean, even a graduate school can't accurately report them :)). Were you thinking of starting with non-automated data for this at the start?

But there are some simple tools that could be made today that would be a step in this direction. For example, it would be pretty easy to whip up an NSF/NIH grant or institute ranking in real time. And yeah, in physics/astro, since the publications aren't too pay-walled, there could be some easy incorporation of such data.

Quick, to the Ruby on Rails!

Yeah, the more I think about it, the easier it'd be to do something dynamic, at least for a lot of the metrics in the sciences.

Ruby? Don't you think it'd be easier to hack something up in F77?
;-)