Where do you want scientists to learn ethics?

Because I am engaged in a struggle with mass quantities of grading, I'm reviving a post from the vault to tide you over. I have added some new details in square brackets, and as always, I welcome your insight here.

I just got back [in October of 2005] from talking with an outside evaluator about the federally funded training grant project at my university that tries to get more of our students to graduate school in science. The evaluator is here not at the behest of the funding agency, but rather at the request of the science professor here who oversees the program. Because, you know, he wants to know how good a job we're doing at what we think we're doing, so we can make improvements if needed, and he figured an outside guy who has evaluated other such programs could give us some good insight here.

Let me pause to note that the folks who are this serious about making their efforts successful are a big part of why I love working here.

Anyway, I was on the agenda because I teach the ethics course the federal funding agency requires of students supported by training grants like this one. (That is to say, they require some course on research ethics; I developed the particular course the students in the program are taking.) We had an interesting discussion, the evaluator and I, about the genesis of the course, the enrollment, the syllabus, and such. And, in the course of this discussion, we arrived at one of the nagging worries I have about courses like this:

It is possible that learning ethics (even ethics-for-scientists) from a class in a philosophy department will have less of an impact on science students than learning ethics from their science professors would have.

Part of this, I'm afraid, is the curse of the required class. Kids hate required classes, even if what they're required to take has potential value for their lives down the road. Anyone who has taught a class with prerequisites has probably had experience with students who took a class they were required to take and promptly forgot almost all of it ... because check the transcript, you did that class. That shouldn't mean you need to waste valuable brain real estate remembering anything you learned.

And seriously, a class from a philosopher? What the heck does that have to do with learning to be a good scientist? (Set aside the fact that a science professor approached me to develop this course, and that one of the science departments here decided, without consulting me about it, to make this class a requirement for their majors, with at least one other science department thinking about following suit.)

In my case, I actually have some ammunition by way of my misspent scientific youth. Y'all are looking at going to grad school in a science? Been there, done that, wrote the dissertation, got a Ph.D. But, there are other schools where the science majors take their ethics from philosophy departments and the philosophers can't necessarily throw down so effectively.

[This may actually become an issue soon for the ethics in science course in my department. Demand for the course has grown, which means it's likely we'll have to offer multiple sections, and I may not be able to teach all of them. My colleagues are great, but not all of them have a history in research labs that they can draw on -- something that, in my experience, makes my discussions of the ethical landscape of the working scientist a lot more credible to my students.

Any suggestions for how to get a philosopher without a misspent scientific youth up to speed on this?

The analogous issue of my lack of experience in the tribe of engineering is part of why I've been picking your brains about how engineering is similar to and different from science. Note, though, that my gig teaching ethics to frosh engineering students will be as part of an actual engineering course, where it's been deemed important enough to devote four whole class meetings to it. Will this kind of commitment of scarce instructional time signal serious faculty buy-in to the students? Stay tuned!]

There's still the worry that, if you put all the discussions of ethics in a one semester course over in some other department, you convey the distinct impression that: (1) thinking about these issues for a semester is sufficient, and/or (2) no one in your home department can teach you what you need to know about ethics, and/or (3) grown-up scientists don't actually need to pay attention to ethics. [We've seen some evidence of this last attitude, haven't we?] Obviously, I think all of these are misapprehensions. Indeed, the science professors I've talked to here are really good at highlighting responsible conduct of research (RCR) issues in all kinds of contexts. These professors understand that RCR is still relevant in their work, and they even seem to talk to each other about the best ways to conduct their research rather than letting things blow up and calling in the ethicists in Haz-Mat suits.

It has also been pointed out to me that a bunch of science professors would actually enjoy coming to the ethics class but they don't for fear that it would stifle the class discussions. Given how discussions in this class tend to be (brutally honest, with lots of critical examination of how things are done in real labs -- some good, and some bad), it's probably true that the presence of an authority figure from a science department would change the dynamic. So I guess this is a real advantage of offering the course in the philosophy department: students from different scientific disciplines get to discuss their experiences among different branches of the tribe of science on neutral turf.

Still, I can't help but think it would be better if there were some kind of forum for discussions of ethics back in the students' home departments -- a lunch group, a once a month seminar or group-meeting type thing, something. This would help the students understand that their professors really care about this stuff, too. And, it would give the faculty a regular channel for talking together about RCR issues -- because people seem to do better with ethical decisions when they can chew them over with a group.

I imagine I'll keep thinking of ways to optimize this. If my history here is any guide, the science professors I'll be looking to for help implementing my harebrained schemes will be receptive.

They provide a nice contrast to the chair of another department here, who showed up, frantic, at our department the other day. Their department failed to get accreditation because the course (in our department) they had been using as an ethics course didn't satisfy the accrediting agency. So, they wanted us (of course) to whip out a specialized ethics course that would satisfy the accrediting agency. Immediately. In trying to impress upon our department chair just how badly they needed us to solve this problem for them, the chair of the other department exclaimed, "We don't know anything about ethics!"

Sugar-dumpling, that's what scares me.


The messenger is irrelevant. This is not the problem. The problem is the message of the "scientific ethics course". Nobody (or at least, very few people) starts off in science because it is a great place to cheat, fake data, sit on papers to rush one's own work out, etc. So most people know, at some level, what the ethical conduct is supposed to be. Therefore the "ethics" class which repeats "don't cheat" ad nauseam loses the audience.

The real question is why do otherwise well meaning scientists start to slip down the slope that ends up with outright data faking and other bad behavior? And then continue to self-justify with all the usual garbage?

It is quite simple: cheating pays off in this biz and one's chances of getting caught are minimal. Notice how the cheaters who actually get driven out of science seem to be the ones with a huge track record of multiple fakes? Notice how when a lab is caught pretty much dead to rights with fakery, they just get away with saying it was a "mistake" or blame it on some postdoc who cannot (conveniently) be located or vigorously protests the retraction?

Is this cynical? no this is realistic. Does it mean that everyone cheats? no, probably it is still just a minority but really who knows? much of modern bioscience is essentially unreplicable, in part because novelty is so revered. until we get to the point where rigorous, meticulous, internally consistent, replicable and incrementally advancing science is respected more than the current Science/Nature type of paper, all contingencies drive towards bad behavior rather than good behavior.

when ethics classes start to deal with the realities of life and career and the motivations and contingencies at work, well, then they will be relevant. it won't matter who teaches them...

[This may actually become an issue soon for the ethics in science course in my department. Demand for the course has grown, which means it's likely we'll have to offer multiple sections, and I may not be able to teach all of them. My colleagues are great, but not all of them have a history in research labs that they can draw on -- something that, in my experience, makes my discussions of the ethical landscape of the working scientist a lot more credible to my students.

Any suggestions for how to get a philosopher without a misspent scientific youth up to speed on this?

I think that you just have to find a way to tap into the heads of the students. Find a common goal they all have (graduate school, med school, etc.) and on the first day or two of class drive home the reasons that being familiar with ethics will help them accomplish their dreams.
As to the multiple sections, would it be possible to team teach the courses along with faculty from the relevant science department? This might also help with any stigma that science students might have against taking an entire course (gasp) from a philosopher.

Drugmonkey, I think maybe you and I are seeing the question from different angles. Ethics and science is not, to me, about cheating -- it's about asking yourself whether the world needs [thing], whether [thing] could be used for ghastly misdeeds, whether [thing] is legitimately ethical against the larger human social framework.

It's an interesting question. Rather than have an "ethics of science" course per se, it might be preferable to develop science curricula that include ethical dilemmas as thought/discussion experiments posed throughout the course of their education.

Warren: Well that is an interesting thought. I've certainly never run across this in the types of mandated-by-training-grant training in "scientific ethics" that I've been around. Most of the content is of the "how to conduct your science ethically" rather than questioning whether the topics are ethical to pursue. I suppose the closest we ever come to this is in discussing the animal and human subject use issues. I can imagine that issues similar to your concern may start sneaking in with the advent of gene therapy and whole-organism cloning. This is not the primary focus in my experience, however.

I will submit to you, however, that even if such content were a focus, we would eventually come around to the points I've raised. In other words, if the ethics of, say, publishing a paper which would supply a critical advance toward whole-human cloning were put up against the inevitable fame/fortune/made scientific career/possible Nobel track... well, we're right back to discussing contingencies, aren't we?

The tale of Egas Moniz is relevant. Although I won't suggest that we know that he was/was not sincere and of good character in advance, this history lesson illustrates my point.

I think whistleblowers in science should participate (at least in sections of) in the course on ethics and science by presenting their own stories. The misconduct in science today is an occurrence that spills beyond the unethical scientist him/herself. Many times, others (scientists and administrators) join in protecting the unethical scientist for unscientific reasons (friendship, protection of the scientist's institution's reputation, etc), such that the unethical activity becomes the bag of many. Additionally, university administrators or faculty who served on investigative committees of misconduct should also participate. There is no better attraction (for students) than the real stuff.

By S. Rivlin (not verified) on 24 May 2007 #permalink

umm...i pretty much agree with drug monkey.

and, to add, i wouldn't have cared who my ethics class had been taught by if it had been taught WELL. i suspect the same is true for most folks.

some kind of forum for discussions of ethics back in the students' home departments

team teach the courses along with faculty from the relevant science department

Those look like pretty good compromises, both of which preserve the neutral territory, keep the responsibility for course development where it belongs (with the experts -- the philosophers) and add the required "hey, this stuff matters" clout.

SRivlin: and what would be the conclusion that trainees would take away from your experience? that it pays to be SuperEthicalScientist or not? how about those poor schmucks at UWmadison? what would we take from their experience?

Drugmonkey,

They are poor, but they are not schmocks. You are either ethical or you are not. There is no superethical. As to student whistleblowers, it is the responsibility of the academic institution where they blew the whistle to not only assure the continuation of their graduate work, but to reward them for their ethical behavior. Many scientists who sit quietly on their behinds, knowing of unethical behavior of peers of theirs should learn a lesson from these students in Wisconsin. Yes, I would absolutely bring these students to my class on ethics and science, let my students listen to their ordeals and then celebrate with them.

As long as we continue to pretend that misconduct in science is a minor issue that we should not be too concerned about, your attitude regarding these "schmocks" will be the prevailing one and the ethics police will not be far behind. Scientists themselves must do the policing job and each of these policing cadets in Wisconsin should get a medal.

By S. Rivlin (not verified) on 24 May 2007 #permalink

SRivlin: sorry got a little colloquial there. "poor schmuck" connotes one who is suffering the consequences of a bad situation not of one's own making. to me anyway.

The point of this example is that it is all well and good to point out what "should" be. The sad fact is, that often what "should" happen does not. People cheat in science and they reap tangible benefits. The chances of getting "caught" (i.e., to the extent of seriously affecting one's career) are low. And this all influences behavior. More strongly than all the hypothetical scenarios and finger wagging presented in the typical science-ethics training class can ever do. The evidence from the Wisc students will be a powerful anti-whistle-blower inducement, stronger than all the black/white admonishment in a lecture can ever be.

look, finger wagging, tut-tut'ing and holier-than-thou-ing feels good. I've been known to do a little myself. but i'm not sure it is very effectual because it doesn't address the motivations. We need some new approaches.

Drugmonkey, with all due respect, you've given up on educating our future scientists on the "rights" and "wrongs" in scientific research because, according to you, the low level of risk of getting caught compared to the potential gains from one's misconduct overrides any benefit of tut-tutting. Although I agree that among the students who attend ethics in science class there will always be those who would choose the "low risk of being caught" over ethics, the rest of the students should be taught the ethics of science so they will not go astray simply out of ignorance of the ethical standards.
Without the whistleblowers who continue to blow their whistles, we probably all still dream that science today is all about discovering new knowledge and its practitioners are all pure and innocent.

By S. Rivlin (not verified) on 25 May 2007 #permalink

If people don't care about discovering the truth, the only thing that will stop them from cheating is if they run a serious risk of being caught and facing harsh penalties.

So point out how effective science is at uncovering deception - not in the short term, but in the long.

By Caledonian (not verified) on 25 May 2007 #permalink

SRivlin:
I've not "given up". I think we need to do better. The current approach to "ethics training" for scientists isn't getting the job done.

I think the basic rules are obvious to most, although I could be wrong on this. I've just never run into anyone in the early stages of training that thought that data faking was acceptable. "You should only be an author on something you contributed to in a material way" and "Don't hose other labs for personal gain" aren't a surprise to people, in my experience.

I've heard people espouse nativist sounding positions that suggest that perhaps some educational systems outside the US do not generate as consistent a view on ethics. My area is not dominated by nonUS-trained postdocs and certainly not from the systems that seem to be the continued target of such nativist sounding views. So I don't really have a dog in that hunt. Could be the case, might not be. If so, it would argue the value of the basic training that I think is redundant.

By Drugmonkey (not verified) on 27 May 2007 #permalink