Not just grades but:
- Grade distribution for the class. At a minimum: class average, standard deviation, median. Even better: a breakdown by grade (a rough sketch of computing these follows this list).
- Scores of students in the class on standardized exit exams. For example, I'd like to see how students who took the class scored on the physics GRE.
- Surveys of the students' perception of the difficulty of the class, and a comparison of this ranking for the same students across their other classes.
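To make the first bullet concrete, here is a minimal Python sketch of the kind of distribution summary a richer transcript could carry. The scores and the letter-grade cutoffs are invented for illustration, not taken from any real class.

```python
# Hypothetical final scores for one class; the cutoffs below are assumptions.
from statistics import mean, median, stdev
from collections import Counter

scores = [93.5, 88.0, 76.5, 91.0, 84.5, 67.0, 95.5, 72.0, 81.5, 89.0]

def letter(score, cutoffs=((90, "A"), (80, "B"), (70, "C"), (60, "D"))):
    """Map a numeric score to a letter using the assumed cutoffs."""
    for cutoff, grade in cutoffs:
        if score >= cutoff:
            return grade
    return "F"

print(f"average: {mean(scores):.1f}")
print(f"std dev: {stdev(scores):.1f}")
print(f"median:  {median(scores):.1f}")
print("breakdown:", dict(sorted(Counter(letter(s) for s in scores).items())))
```

That whole summary is a few lines of output, which is the point: it costs almost nothing to report alongside the single letter.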
Today, when information storage is cheap, why do we have a grading standard consisting of a few lousy letters (fewer for some schools... you know who I'm talking about!)? Should we shoot for a standardized grading system with more information content? Or is the current system fine for the purpose it serves (which varies widely among the users of the grades)?
The key word in your post is "standardized." The need for standards is what so many problems in the information age boil down to. If each college and university adopts its own standard, then there is no real advantage, particularly since there is no standard for distributing enhanced transcripts.
Well, there is an advantage, I think, in knowing that all of the students got A's in a class, or that the students considered the class "easy." But yes, connecting the students to some sort of standardization seems to be useful. Truthfully, we already do this in an ad hoc fashion: by referring to the reputation of the school.
Two things:
One is that the current system works well with our 60 second average national attention span - that's about as long as it takes to scan even the longest grade sheet.
Two is a rhetorical question. Why not simply switch to the numerical equivalents? So, for instance, in one of my classes, 95+ is an A, but you'd be able to see the difference between someone who got a 95.6 and someone who got a 98 (not that the latter has ever happened).
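A tiny sketch of that point, assuming the "95+ is an A" cutoff from this comment (the lower cutoffs are made up): two different numeric scores collapse into the same letter, so the letter alone discards information.

```python
def letter_grade(score):
    """Assumed cutoffs; only the 95+ = A line comes from the comment above."""
    if score >= 95: return "A"
    if score >= 85: return "B"
    if score >= 75: return "C"
    return "F"

for score in (95.6, 98.0):
    print(score, "->", letter_grade(score))
# Both print "A"; the numeric transcript keeps what the letter throws away.
```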
standardization sucks.
The idea that classes with the same name should have the same content, and so be measured by a standard metric that applies across all colleges, is a toxic idea that stifles the strengths of individual teachers and forces a lowest-common-denominator metric.
I teach differential equations from a dynamical systems viewpoint, one of my colleagues emphasizes analytic techniques, and yet another focuses on numerical techniques. A standardized test for "differential equations" would force all three of us to teach the same things, from the same viewpoint: the viewpoint dictated by the test writers.
Standardized tests that measure the outcome of a program (like the GRE) could, possibly, be implemented--if the stakeholders (Grad Schools in Physics) could get together and decide what it is they'd like undergrads to have learned. Anyone who has had to sit on an assessment committee knows how contentious that could be--how much solid state physics should an undergraduate in physics be required to take?
Standardization dispels variation--it's a bad idea.
"standardization sucks."
I agree but have been called "dangerous" by other physicists for thinking this.
You can add in all the extra numbers in the world, but you aren't actually making the process more fair, rigorous, or accurate. Measuring the knowledge in other people's heads is an inherently futile gesture.
Standardization may have its place (sorry, med students bound for the USMLE!) but grades suck.
"You can add in all the extra numbers in the world, but you aren't actually making the process more fair, rigorous, or accurate."
Didn't say I was making it more fair. It is definitely more accurate though. Unless you think throwing away information is more accurate.
I think we could solve the standardization problem by having everyone who takes the SAT burn American flags and rebel against the machine.
It's hard to think of an argument against this... seemingly if grades are important at all, they should be accurate.
There is a big emphasis on grades already, though, and not ranking students or hiding information is one tactic some schools have adopted.
Are accurate grades the problem facing American education [grammar!]? If only everyone knew how bad our (future) presidents did in school...
Data can only be said to be "accurate" if the link between the data and what the data claims to measure has been thoroughly established (see Gould's critique of IQ tests, for example--all the IQ data that has ever been collected is, basically, worthless, for it was never established exactly what was being measured). Hence, including information that might not measure anything meaningful on a transcript won't make the transcript more "accurate"; you're just adding noise.
My 9th-, 10th-, and 11th-grade students are almost all upset because they say that almost all of their parents are upset about the "progress reports" snail-mailed home "to the parents of..."
The report generator from the database correctly shows, for each student, how many points they got on each homework, quiz, and exam compared (as a percentage and letter grade) to the highest possible score. Then I hand-wrote a paragraph, customized for each student, on how I generously graded "on the curve" and that the young man or woman is tentatively earning a B+.
They are freaked out over any "F" for any given homework or quiz or test. They never heard of "grading on the curve." They don't understand my explanation. But there has been a flurry of very late submissions of missing homework.
Standardized-schmanderdized. Students don't "get it" until they have their (in California) mandatory Probability and Data Analysis class. And then only the ones who understand Gaussian distributions.
Otherwise "grading on the curve" = "what curve, teacher?"
Did you hear the one about the teacher who drove off the road and smashed the car?
The teacher was "grading on the curve."
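Since "grading on the curve" is the sticking point above, here is one common reading of it, sketched in Python: letters assigned by how far a raw score sits from the class mean, in standard deviations. The raw scores and the z-score bands are made up, and this is not claimed to be the scheme used in those progress reports.

```python
from statistics import mean, stdev

raw = [42, 55, 61, 48, 70, 66, 38, 59, 64, 51]  # hypothetical raw test scores
mu, sigma = mean(raw), stdev(raw)

def curved_letter(score):
    """Assign a letter from the z-score (distance from the mean in std devs)."""
    z = (score - mu) / sigma
    if z >= 1.0:  return "A"
    if z >= 0.0:  return "B"
    if z >= -1.0: return "C"
    if z >= -2.0: return "D"
    return "F"

for score in sorted(raw, reverse=True):
    print(f"{score:3d}  z = {(score - mu) / sigma:+.2f}  {curved_letter(score)}")
```

This is also why a student can see an "F" percentage on one quiz and still come out of the semester with a B+: the curve compares them to the class, not to the maximum possible score.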
What are you trying to achieve? As far as I can tell, grading students has two main aims, and these don't vary much with the level of students:
The first is to give the teachers, parents, prospective employers, higher-level schools, etc. an idea of what the student is good at and how they compare to other students.
The second is to give the student an idea of what they are good at and how they compare to other students.
OK, it's the same thing.
The question is how well a grading system does this.
As you say, information storage is cheap. Also, the interpretation of grades and grading systems is up for dispute.
The answer may be to publish the test answer data raw and let those who need the info analyse it as they see fit.
This is already done with Ph.D. theses.
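A rough sketch, in Python, of what "publish the raw data" might look like in practice: anonymized, per-question scores dumped to a CSV that anyone can analyse however they see fit. The field names and records here are invented, and a real release would of course need a privacy review first.

```python
import csv

# Invented records: one row per (anonymized student, exam question).
records = [
    {"student": "anon-001", "question": "Q1", "points": 8,  "max_points": 10},
    {"student": "anon-001", "question": "Q2", "points": 5,  "max_points": 10},
    {"student": "anon-002", "question": "Q1", "points": 10, "max_points": 10},
    {"student": "anon-002", "question": "Q2", "points": 7,  "max_points": 10},
]

with open("exam_raw_scores.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["student", "question", "points", "max_points"])
    writer.writeheader()
    writer.writerows(records)
```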