-
""I've never had so much trouble giving away a million dollars," Chancellor Mike McKinney said, laughing.
That's because he's never spent it like this. McKinney plans to give up to $10,000 bonuses to instructors based on anonymous student evaluations" -
"With faculty (and administrative) searches being cancelled left and right, Iâm thinking that this dropoff might be the final nail in the coffin of the idea of âmeritocracy.â Simply put, the searches being cancelled now are no reflection of the quality of candidates, any more than the boom market of the sixties was a reflection of the quality of candidates then. The market is dramatically tougher now than it was just two or three years ago; to suggest that the candidate pool worsened in that time by several orders of magnitude is simply silly. The disconnect between âcandidate qualityâ and âmarket qualityâ is so dramatic, at this point, that the âmeritâ narrative is simply unsustainable. And that may actually be a good thing."
-
"Last week the American Astronomical Society had their big annual meeting, full of groundbreaking discoveries. [...] Now that the conference has finished up, I can go through and give you my five favorite discoveries. "
-
"The ancient civilization that thrived in the mountain valleys of Yemen was famed for cultivating bountiful crops with a system of cisterns and aqueducts, but today the country is threatened with a major food crisis that experts fear could destabilize the poor neighbor of the Middle East's oil and banking giants."
-
"Shortly after Newton proposed his new mechanics, the "shut up and calculate" approach of Newton, Halley and others produced the first astonishing results. However, it did not take long until the foundational debate about the interpretation of the new physics began. In particular, the true meaning of the position coordinates x(t) was heavily discussed."
-
"[T]he press release states that "Images taken by the Mars Exploration Rover Spirit show small rocks regularly spaced about 5 to 7 centimeters apart on the intercrater plains between Lahontan Crater and the Columbia Hills," which is where this image came from. How can you get the rocks to space themselves out evenly? "
-
More than you ever wanted to know about recessed drive screws.
-
Crimes that may not necessarily have paid, but that were amusingly clever. Vote for your favorite.
I would like to comment on the link to the story about bonuses for student evaluations. Clearly, this seems like a bad idea. Students (especially at a public institution like mine) already think evaluations are a trade-off for grades. With this, it would be doubly clear that that's the case.
I think there needs to be feedback to the administration and to the instructor, but the evaluation method here (similar to most places) tells you basically nothing. In looking at student evaluations with regard to tenure cases, I take the stance that extremely low or high evaluation scores are a sign that something is going on and that I should observe and examine that faculty member more closely.
Here is my favorite evaluation note (and I can't find the original link) - there is a correlation between good looks and evaluation scores (http://chronicle.com/jobs/news/2003/10/2003101501c.htm). But remember, correlation does not mean causation.
In the handful of discussions that I've seen about student evaluations, one thing has always been missing: What about weighting the evaluations according to the student's overall performance in all courses? I'd be much more interested in the opinions of a senior with a three-year record of earning As and Bs in tough courses than in the opinions of a sophomore about to flunk out and looking for somebody other than himself to blame. So, why not use a (heavily?) GPA-weighted average of the evaluation scores instead of just an unweighted average?
I don't claim this suggestion is perfect. In particular, I'd _really_ like to hear from an honestly struggling student who did not get proper help. Alas, just looking at GPA doesn't distinguish well between the lazy-and-immature and the struggling-and-unsupported, so that's a problem with my suggestion. How do you fix this? How about attaching extra weight to evaluations from students who did worse in a particular class than their GPA in other classes would predict? If an A-B student suddenly gets an odd C-, something may be up with the instructor or curriculum design. Weight it.
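Very roughly, the arithmetic might look like the sketch below (made-up numbers throughout; the record format, the hypothetical weighted_eval_average helper, and the one-letter-grade "did worse than expected" threshold are just assumptions to make the idea concrete):

```python
# Hypothetical sketch: weight each evaluation by the student's overall GPA,
# and bump the weight where the grade in this class fell well below what
# that GPA would predict (the "A-B student suddenly gets a C-" case).

def weighted_eval_average(records):
    """records: list of (eval_score, overall_gpa, grade_in_this_class) tuples,
    with eval_score on a 1-5 scale and GPA/grade on a 4.0 scale."""
    total, total_weight = 0.0, 0.0
    for score, gpa, grade in records:
        weight = gpa                  # base weight: overall performance
        if gpa - grade > 1.0:         # assumed threshold: did much worse here than elsewhere
            weight *= 1.5             # extra weight: something may be up with the course
        total += score * weight
        total_weight += weight
    return total / total_weight if total_weight else float("nan")

# Made-up example: three students rating the same course
print(weighted_eval_average([(4.5, 3.8, 3.7), (2.0, 1.9, 1.7), (3.0, 3.6, 2.3)]))
```

The particular threshold and multiplier are arbitrary; the point is only that "weight it" is a one-line change to how the average is computed, not a new survey.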
Here's another neglected way to evaluate: Instead of polling students at the end of a class term, ask graduating seniors which instructors they learned the most from, or received the most help from, over their four years. Might lead to more mature and reflective evaluations. "When I was a freshman, I thought he was too demanding about esoteric stuff nobody really needs to know, but I later came to appreciate what this prepared me to do..." Comparing graduating-senior evaluations against immediate-feedback evaluations should be quite revealing, and might wash out some of the popularity-contest and weed-out-course problems. But do any schools do this systematically?
Also, I'd like to echo the first commenter: The standard deviation in an instructor's evaluations might say more about the teaching quality than the evaluation average itself.
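To make that concrete with entirely made-up scores: two instructors can have the same mean evaluation but very different spreads, and it's the high-variance one who probably warrants a closer look.

```python
import statistics

# Made-up evaluation scores (1-5 scale): identical averages, very different spread.
steady = [3.0, 3.1, 2.9, 3.0, 3.0]
polarizing = [5.0, 1.0, 5.0, 1.0, 3.0]

for name, scores in (("steady", steady), ("polarizing", polarizing)):
    print(f"{name}: mean={statistics.mean(scores):.2f}, stdev={statistics.stdev(scores):.2f}")
```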
"In the handful of discussions that I've seen about student evaluations, one thing has always been missing: What about weighting the evaluations according to the student's overall performance in all courses? I'd be much more interested in the opinions of a senior with a three-year record of earning As and Bs in tough courses than in the opinions of a sophomore about to flunk out and looking for somebody other than himself to blame. So, why not use a (heavily?) GPA-weighted average of the evaluation scores instead of just an unweighted average?"
Confidentiality.
In order to calculate the average for the weighting, somebody would need to know which students made which comments. Even if it's only the department secretary who knows, that knowledge might very well have a chilling effect on students, and discourage them from answering honestly.
There are things you can do to get at some of that information-- our course comment forms ask students to self-report their expected grades, for example, and also whether they're taking the class as part of their major, or as an elective. I know people who have calculated averages for different categories, and used those numbers to try to make some point or another.
But in the end, if you want students to give their real opinions, you probably need to let them comment relatively anonymously, so weighting comments by GPA is probably a non-starter.