In the latest issue of The Scientist, there's an article (free registration required) by C. Neal Stewart, Jr., and J. Lannett Edwards, two biologists at the University of Tennessee, about how they came to teach a graduate course on research ethics and what they learned from the experience:
Both of us, independently, have been "victims" of research misconduct - plagiarism as well as fabricated data. One day, while venting about these experiences, we agreed to co-teach a very practical graduate course on research ethics: "Research Ethics for the Life Sciences." The hope was that we could ward off future problems for us, our profession, and, ultimately, society.
They go on to describe their approach to this course -- how they backed away from focusing on the topics you might find in an official "bioethics" course because neither of them was an ethicist, sticking instead to a detailed discussion of "best practices" around scientific publications and grants, scientific ideas, and working interactions with others in scientific research groups and disciplinary communities.
In other words, they designed the class to tackle some of the nuts and bolts of being a grown-up scientist.
I like very much that they trusted themselves to be able to say something useful about how to behave ethically in their fields even though neither is a card-carrying ethicist. It's important to recognize that being ethical is just a normal part of being a good scientist. (We'd be in real trouble if only ethicists behaved ethically.) And, as I've noted before, it's quite likely that students will take ethics more seriously if they're getting explicit instruction from people in their own field rather than from some random philosophy professor.
Even more, I like that they conceived of this course as a way to do something good for the profession by paying attention to the training of its new members. And, their description of how the class actually went makes it clear that they established an atmosphere where the grad students got in the habit of talking with each other about good ways to do things. To the extent that these conversations become a regular part of these scientists' professional lives, the community can only benefit.
There are a few bullet points in their rundown of what they learned where I'd like to offer some comment of my own:
- Case studies are a powerful tool. They personalized real events and problems. They helped us all empathize with wrongdoers and victims, roles we've found ourselves in from time to time.
I'm also a fan of case studies. In my experience, discussing them can be even more productive if you can also lay out something like a strategy for approaching them and making reasonably objective ethical decisions in a particular set of circumstances. I don't know from the article whether Stewart and Edwards do this, but there are good resources out there that they could draw on.
- Keep class size small, with a limit of 20 students. In larger classes, shy students might not feel comfortable with sharing.
In an ideal world, this would be the way to go. Of course, some of us teach in university systems where "resources follow enrollments" -- which means small class sizes stand a good chance of getting your budget for the next academic year cut.
Also, there's the issue of supply and demand. In programs whose graduate cohorts number 20 or fewer per year, and where there exist faculty committed to teaching a class like this every year, a small class size is feasible. In places where there's just one person who feels competent to teach such a class and many students who want or need to take it, once again the class can get big.
This term, I'm teaching two sections of my "Ethics in Science" course, each capped at 50 students (and full). It's a major requirement for many of my students, so a lot of them might end up with a longer time to degree if we set a lower cap. My department is still working on the question of who else could teach the course (a question that is pressing given that I hope to be taking a sabbatical next year).
- Don't focus on morality. Focus on ethics. One of our students thanked us for that specifically.
I kind of wish Stewart and Edwards had explained what they take to be the difference here; I'm not sure I know what precisely the difference is supposed to be. If their main concern is that they ground "best practices" in what's good for the scientific community (rather than in divine prescriptions or something), that makes sense. Of course, given that scientists live in societies full of non-scientists (and appeal to those non-scientists to fund scientific activities), a little bit of attention to the interests of these non-scientists is probably a good thing, too.
The article also mentioned that they talked about women in science as a "special topic", which I found a little worrisome, especially in biology (where there's a more even gender balance among graduate students than in most sciences). Why wouldn't this be an "everyday" topic? But, looking at their syllabus (PDF), it seems that the "special topics" indicate topics chosen by the students in the course for the last three class meetings. In other words, this was an issue the students wanted to talk about -- making it central rather than peripheral to the conversation.
I'm glad Stewart and Edwards shared their experiences around this class, and I hope more scientists will follow suit. It seems like a good strategy for creating the positive changes one would like to see in one's professional community.
Hat-tip to Zuska for pointing me to the article.
Wow, that sounds like they've worked out a good approach.
A thought based on my own experience in an ethics class: case studies are great, if they are at all realistic. When they describe cases that are 'too' clear-cut, it can be really dull to discuss them. On the other hand, some of the most interesting moments in the class were when someone thought a case was clear-cut only to have to rethink it.
I don't think this clarifies what *they* meant, but to *me* morality tends more toward appeals to right/wrong, as opposed to ethics tending more toward good/bad (useful/not useful, or positive/negative consequences). In that context, morality can be very frustrating to discuss.
As a very simplistic example, a moral argument against plagiarism is "plagiarism is a form of lying, which is Just Wrong". An ethical argument against plagiarism is "plagiarism results in distrust of our colleagues, so it creates a Less Fun and Less Functional working environment".
*note, I don't claim these are the actual definitions of "moral" and "ethical"- just connotations I associate with them which may or may not be solely part of my own idiolect.
I saw that at Writedit this morning; the post there also has a lengthy discussion and begins with a link to an earlier post on what I thought was more interesting news on this subject.
I also was unclear on the distinction between "ethics" and "morality" -- I'm glad to see a specialist is also confused. My guess is that they tried not to accuse participants of wrongdoing or of acceptance of wrongdoing, even when students' anecdotes or reactions to case studies conflicted with the "best practices" they advocate.