In my last post, I mentioned Richard Gallagher's piece in The Scientist, "Fairness for Fraudsters," wherein Gallagher argues that online archived publications ought to be scrubbed of the names of scientists sanctioned by the Office of Research Integrity (ORI) for misconduct, so that they don't keep paying after they have served their sentence. There, I sketched my reasons for disagreeing with Gallagher.
But there's another piece of his article that I'd like to consider: the alternative strategies he suggests to discourage scientific fraud.
Gallagher writes:

There are much better methods of subverting fraud. There is little one can do about the motive to commit fraud, but one can have an influence on opportunity, for instance by:
- Prioritizing the teaching of a research code of conduct
- Strengthening procedures to detect and handle misconduct within academic institutions, which are quixotic at present
- Increasing the funding for ORI and similar agencies to improve levels of surveillance
- Giving formal responsibilities to journal editors for the identification of fraud
- Increasing the penalties for researchers found guilty of misconduct during the finite period of their sentence
First off, I'm not sure I buy the assertion that the motive end of the crime is an immovable object. It's true that getting something for nothing is always a tempting alternative to actually doing the work involved in getting that something honestly. But this doesn't preclude the possibility that there are ways the institutional context of science (whether academic science or science in the private sector) could be adjusted to reduce the expected payoff from cheating.
I'll grant, however, that structural changes take the coordinated efforts of a lot of people, whereas maintaining the status quo mostly requires sloth. Still, if enough members of the tribe of science decided that such changes were a good idea, it could happen.
As for Gallagher's bullet points, I'm throwing it open to you all. Do you think scientific misconduct would be reduced with the adoption of a formal code of ethics? With more ethics coursework? With more policing by journal editors? In the middle of the university budgetpocalypse, should academic institutions overhaul their mechanisms for detecting and addressing misconduct (and how, precisely, do you think they should do that)?
Knowing that you commenters have lots of strong opinions, relevant experiences, and smarts, I look forward to your response to Gallagher's suggestions, and your preferred alternatives.
Well, in computer science we have the Association for Computing Machinery, which does have a formal code of ethics. I don't know if it has reduced fraud by computer professionals. Google reveals nothing about instances of fraud by ACM members, but that doesn't necessarily answer your question.
However, an organization like the ACM, with a code of ethics and a reputation that rests on the conduct of its members, could be a good thing for research science. Kind of like a Good Housekeeping Seal of Approval for researchers.
I'm going to stop there, though, since I manifestly don't know what I'm talking about. I'm interested to see any more posts in this thread...
Maybe now that people know that what at first seems like a slap on the wrist is indeed a life sentence, they will feel discouraged...
I don't think scientific fraud is really much different from business fraud (and both are probably more rampant, and more difficult to detect, than recognized) -- the only one of the above suggestions that might have any real impact is significantly increased penalties.
I think Patchi is probably on to something. Schools and journals are likely to be swayed by funding issues - if throwing graduate students or other (smaller) professors under the bus is necessary to keep the spigot flowing, they will do it. (Example - Columbia's "investigation" of the multiple retractions and alleged falsification from the Sames group.) I don't know if there's enough peer pressure for people (at least in chemistry) to be honest. The knowledge that enough people might learn you were dishonest, and that the consequences of a conscious misstep without clear explanation will last forever, might restrain people from misbehavior. I think it's analogous to having a close circle of people who share a common belief and hold each other accountable for their actions - if you don't think anyone will see you, you might do bad things, but the knowledge that people you care about might find out what you do should make that less likely.
Formal codes of anything are useless without enforcement, and journal editors can't be expected to double as science cops (how are they supposed to investigate anything, for starters?).
Teaching ethics might help; teaching philosophy of science might help even more, by driving home the point that misconduct is bad science.
Ceteris (particularly budget concerns) paribus, it might also help to overhaul the ORI, particularly at the local level, rooting out political appointees whose primary concern is keeping the university's reputation intact by covering up everything that can be covered up.
But none of that is going to do much of anything in the long run until the incentives to cheat are reduced. The current system is winner-take-all: the only thing that counts is being First, preferably in a Prestige Journal (ugh, don't get me started). Perceptions of injustice are both widespread (I'd say almost universal except perhaps in the highest ranks) and increasingly accurate, and create a powerful backlash of resentment and sense-of-entitlement. Add to that the fact that it's incredibly easy to cheat and (unless they get careless) incredibly difficult to catch the cheats, and you have a recipe for systemic corruption, of which the high-profile FFP (fabrication, falsification, plagiarism) cases are just the iceberg's tip.
I don't have any easy fixes for the system -- even injections of money won't solve the underlying problem of overemphasis on competition, as the Clinton NIH budget increase and the recent Challenge Grant fiasco have ably demonstrated.
But I do have a way to make it much less difficult to detect fraud: no more "data not shown". Every claim in every scientific manuscript should be backed with data, supplied in the rawest practical format (as well, of course, as the authors' preferred reductions). It would still be possible to cheat in such a system, but it would be such a pain in the ass that I figure most people would just do the damn experiments!
No, No, and No. I am also far from convinced that there is any compelling reason to do anything more than we do now to prevent, detect, and punish scientific fraud. This is because I have not seen any cost/benefit analysis of proposed measures that would go beyond those currently in place.
I am convinced that education of young researchers is the most important preventive measure. My (unpublished) results from a survey of my students showed that even short-term education on science ethics (with special emphasis on plagiarism) considerably changed students' thinking and attitudes.
I wish to bring to your attention a document on ethics in science:
http://ca.geocities.com/uoftfraud/committee.htm