Evaluate This!

Timothy Burke, my go-to guy for deep thoughts about academia, had a nice post about student evaluations last week. Not evaluations of students, evaluations by students-- those little forms that students fill out at many schools (not Swarthmore, though) giving their opinion of the class in a variety of areas.

(Probably not entirely coincidentally, as this is the time of year when semester-school faculty fret about evaluation scores, Inside Higher Ed offers yet another RateMyProfessors.com article, showing a positive correlation between "hotness" and positive evaluations there...)

Those evaluations are still a few weeks off for me, but they remain a constant back-of-the-mind concern, especially for the untenured. Particularly with intro classes, it's really hard to get a read on exactly what the students think of you, and more or less impossible to predict what they'll put on the evaluations. I've gotten bad evaluations from classes I thought liked me, and good ones from students I was sure hated me. It's distressingly random given the amount of weight those evaluations carry.

I do, however, have a secret trick for improving the usefulness of the evaluations, which also seems to improve the numerical scores somewhat, and which I will present below the fold:

The trick to getting useful information out of student evaluations is to ask for it. We have a form that we use that is half numerical (rate things on a scale of 1-5), and half written (respond to these general questions), and the questions on the written part are incredibly vague, because they need to be used for all the classes. The responses to these questions also tend to be awfully generic ("Fine" or "Yes" are common answers, and none of them are yes-no questions...)

So what I do is ask more specific questions. Before handing out the forms, I explain that the written comments are transcribed by the secretary and given to me, and I ask the students to use the written forms to help me improve my teaching. I make a list on the blackboard of things that I did differently that term (or that I do differently than other faculty teaching the course), and ask them to comment on those specific items somewhere within the school's generic questions.

Weirdly, this not only gets me better responses to the written questions, it also seems to improve the scores on the numerical side, I think because they have to think a little more carefully about the course and its larger context, and don't just write down random numbers. (Though, to be fair, I didn't really hit on this until after I'd been teaching a couple of years, so the improvement might just be a general improvement in my teaching, not a result of the questions.) And it does get me some actual useful information on the written side-- I've had students pick up on things that I didn't think they'd really notice, and some of them have spontaneously said some really nice things.

On the larger question of whether these evaluations actually measure anything useful, I don't really have much to say. I think the numerical scores are probably useful as a sort of threshold indicator-- if your scores are mostly above the historical course average, you're fine; if they're mostly below, there were some problems-- but beyond that, I wouldn't put much weight on them, if I were designing a college. Of course, I'm stuck with them, so I'm just happy to have an ethical way to improve the quality of the evaluations.

Credit where due: I stumbled on this idea more or less on my own, but after the first term asking additional questions, a senior faculty member in another department mentioned that she does the same thing, and confirmed my observations. So it's not just me.



Years ago I was a visiting prof at a college. At the end of the semester I was told I had to hand out a sheet for student evaluations of me. I was not to be in the room while this was being done. I had a student hand out the forms, smiled, and, pausing at the door, said: Be honest. Give your true opinion. What you say doesn't affect me personally, despite the fact that my wife is giving birth next week; I have a large debt on my home, and one of my three kids is very ill and needs many treatments... If things don't work out, I can try to get back into the army.

By fred lapides (not verified) on 08 May 2006 #permalink

I just asked for evals in my intro class last week, with a great deal of trepidation. This has been a very difficult class; students don't say a thing in the discussion sections, and I've been nagging and cajoling and standing up and singing trying to get them to do the readings, form opinions, and express them in class. It's been a constant struggle to get them motivated.

What are the students going to say on the evals? That I'm an annoying, pushy scold, or that I beat my brains out trying to help them understand? It's a complete tossup. Unless they make some specific suggestions, no matter what way the numerical scores go, I don't think I'm going to learn anything from this process.

(And no, it's not just that I'm incompetent at running discussion sections. Most semesters, they're lively and interesting...but for some reason this term, and unusually in my experience, the chemistry wasn't there with the students. )

While I was teaching a class as part of my graduate studies, we had student evaluations - and though they didn't affect our jobs, the results still felt important. And the one thing we graduate students could find that really seemed to correlate well with the evaluations was - gratuitous strictness.

And no, being stricter didn't decrease the scores; it tended to increase them. One year I was, for one reason or another, not very happy about teaching the class (we had some scheduling problems that ended up with me taking the class on short notice despite promises not to do so), and I unfortunately did take it out on the students to some degree. I would start every class by locking the door-- if you were seconds late, you'd miss the entire class. I threw out people talking in class or using their cellphones. I'd publicly refuse to hand out notes from earlier classes to people who had missed them without a real reason (doctor's note, please), and I'd refuse to answer _anything_ about the exam, saying that everything in the class is important. I never repeated what was in the literature; the assumption was that everybody had read it, so I always went straight to the discussion. In short, I was being a bit of an a**hole.

At the end, I got the best evaluations I have ever received. Of course, I haven't had the courage to repeat that performance, so I don't know if it was a fluke. :)