There's a nice commentary in the most recent issue of Cell about scientists' apparent aversion to thinking about ethics, and the reasons they give for thinking about other things instead. You may not be able to get to the full article via the link (unless, say, you're hooked up to a library with an institutional subscription to Cell), but BrightSurf has a brief description of it.
And, of course, I'm going to say a bit about it here.
The author of the commentary, University of Pennsylvania bioethicist Paul Root Wolpe, identifies eight main reasons scientists give for not thinking about ethics. These eight (deadly?) excuses are:
- "I'm not trained in ethics."
- "My scientific work has little to do with ethics."
- "Ethics is arbitrary."
- "Ethicists mostly say 'no' to new technologies."
- "Others will make the ethical decisions."
- "The public does not know what it wants."
- "Knowledge is intrinsically good."
- "If I don't do it, someone else will."
Wolpe has it right: I have heard all of these before from scientists. (Not all scientists, mind you -- I know quite a few who are very engaged in thinking about how to do science ethically, and about the larger ethical implications of the science they are doing.) But Wolpe makes the case that none of these is a very good reason not to think about ethics.
Take the first two reasons. Even scientists who have never taken an ethics class have learned something about ethics by observing the conduct of their advisors and mentors. Indeed, part of what they learn is how a certain kind of ethical conduct matters to scientific practice -- reporting results accurately rather than making them up, sharing materials, not plagiarizing, etc. To that extent, good scientific work has quite a lot to do with ethics. Moreover, I think it's a mistake to think that ethics coursework is necessary or sufficient for engagement with ethical issues. As Wolpe writes:
To remain true to the highest goals of science, scientists should periodically revisit the big questions: What is science for? What are the values I bring to my scientific work? Why did I become a scientist, and why am I one now? What are the moral motivations, inclinations, and principles at the heart of my scientific pursuits? How do I advance the cause of scientific progress? Whom does my research serve? Serious consideration of those questions qualifies a scientist for participation in the ongoing discussion of scientific values, even without a specialized training in ethics. (p. 1023)
The "ethics is arbitrary" excuse is another one I hear a lot, especially from people who have had ethics coursework. A standard feature of such coursework is usually the Battle of the Ethical Theories, where "problem cases" are trotted out to illustrated instances where, say, Kantianism and utilitarianism disagree wildly about the right thing to do. You might conclude from such examples that the existing ethical theories are all deeply flawed, no one actually knows which one is right, and at bottom ethical theories are a matter of taste -- unlike science, which is trying to latch onto Something Real.
Here, Wolpe notes that "there is widespread consensus on a host of ethical issues in science policy" (p. 1024) -- in other words, most of the time, we're not dealing with "problem cases" where different ethical theories give radically different recommendations. Moreover, he notes:
At the boundaries of the consensus are areas of ethical debate, but that is how it should be. The public discourse eventually may make its way to consensus, but in ethics, process is at least as important as product. (p. 1024)
The responses Wolpe gives to the other excuses are useful, and if you want to talk about a particular excuse from the list (or add another), give a holler in the comments.
In his conclusion, he points out that the scientist who opts out of thinking about ethics is taking a risk:
Science has become one of the most powerful and pervasive forces for change in modern society. As the professionals at its helm, scientists have a unique responsibility to shepherd that change with careful ethical scrutiny of their own behavior and thoughtful advocacy of scientific research. **If scientists find reasons not to do so, the public will find ways to do it for them, and the results may not always be in the best interests of science or society.** (p. 1025; bold emphasis added)
Have we got your attention now, scientists?
Good. Let's talk about ethics.
-------
Paul Root Wolpe, "Reasons Scientists Avoid Thinking About Ethics." Cell, 125, June 16, 2006, 1023-1025.
Props to Abel for bringing the article to my attention, and to my better half for actually getting me a copy of the article.
-------
Can we give other reasons why we don't like ethics?
Ethics is anti-individualist.
The idea of group ethics requires assent to some sort of group-think, which many ruggedly individualistic scientists dislike.
Power corrupts, and ethical power corrupts hypocritically.
Ethical guidelines, rules, and conventions make it harder for outsiders, particularly self-funded outsiders, to break into the system independently without running afoul of some ethical rule. So ethics can be used as a form of class or cultural repression.
And finally,
Denying ethics makes it easier to suppress guilt.
I know I shouldn't have seduced Stephan's fiancee to gain access to his grant proposal, but if I deny any pretense of observing ethical behavior, I can convince myself that I'm bad in a good way, instead of bad in a bad way.
-------
It's always fascinating to me when students -- engineers, mostly, for me -- use the "ethics is anti-individualist" response that Lab Lemming gives: "The idea of group ethics requires assent to some sort of group-think, which many ruggedly individualistic scientists dislike."
Yet scientists are quite willing, all the time, to assent to group-think in terms that are not generally cast as negative: the agreed-upon social value of science, the formation of labs and universities, the belief in funding requirements, the need for autonomy from political persuasion, the trust in peer review, the perceived merit of independent confirmation, and so on.
So does it all come back, basically, to the first excuse given above: "I'm not trained in ethics"? Because that then stands as an excuse not to address ethical issues, not as an excuse to do questionable or less valued things.
The key words are "willing" and "assent". Both of these involve choice. In my home, I have a choice over whether I trap rats, poison them, or subject them to predation by cats, terriers, boa constrictors, or other carnivorous pets. In the lab, I do not. The reason is that these ethical guidelines on the disposal of animals have been handed down in an undemocratic fashion by an evangelical minority who wish to impose their views on practitioners of science. If I were to doorknock my street, I'm willing to bet a slab of beer that nobody would say that they capture rats unharmed, then euthanize them. Fact is, most people don't even care how rats are killed. And that is the problem. By allowing rules to be established on subjects that are non-issues for most people, we open the door for obsessive extremists to impose inane conditions on our work. But even if their values were mainstream, that doesn't give them the right to force conformity on us, or anyone else.
-------
Thank you for this post! I've linked to it over at my blog -- here: http://scientificallyminded.typepad.com/scientificallyminded/2006/06/th…