One of the big ideas behind this blog is that honest conduct and communication are essential to the project of building scientific knowledge. An upshot of this is that people seriously engaged in the project of building scientific knowledge ought to view those who engage in falsification, fabrication, plagiarism, and other departures from honest conduct and communication as enemies of the endeavor. In other words, to the extent that scientists are really committed to doing good science, they also have a duty to call out the bad behavior of other scientists.
Sometimes you can exert the necessary pressure (whether a firm talking-to, expression of concern, shunning, or what have you) locally in your individual interactions with other scientists whose behavior may be worrisome but hasn't crossed the line to full-blown misconduct. In cases where personal interventions are not sufficient to dissuade (or to make things whole in the aftermath of) bad behavior, it may be necessary to bring in people or institutions with more power to address the problem.
You may have to blow the whistle.
Here, I want to examine the case of a group of graduate students at the University of Wisconsin-Madison who became whistleblowers. Their story, as told in an article in Science, illustrates not only the agony of trying to act responsibly on your duties as a scientist, but also the price you might have to pay for acting on those duties rather than looking out for your self-interest.
Chantal Ly, 32, had already waded through 7 years of a Ph.D. program at the University of Wisconsin (UW), Madison. Turning in her mentor, Ly was certain, meant that "something bad was going to happen to the lab." Another of the six students felt that their adviser, geneticist Elizabeth Goodwin, deserved a second chance and wasn't certain the university would provide it. A third was unable for weeks to believe Goodwin had done anything wrong and was so distressed by the possibility that she refused to examine available evidence.
Two days before winter break, as the moral compass of all six swung in the same direction, they shared their concerns with a university administrator. In late May, a UW investigation reported data falsification in Goodwin's past grant applications and raised questions about some of her papers. The case has since been referred to the federal Office of Research Integrity (ORI) in Washington, D.C. Goodwin, maintaining her innocence, resigned from the university at the end of February. (Through her attorney, Goodwin declined to comment for this story.)
Although the university handled the case by the book, the graduate students caught in the middle have found that for all the talk about honesty's place in science, little good has come to them. Three of the students, who had invested a combined 16 years in obtaining their Ph.D.s, have quit school. Two others are starting over, one moving to a lab at the University of Colorado, extending the amount of time it will take them to get their doctorates by years. The five graduate students who spoke with Science also described discouraging encounters with other faculty members, whom they say sided with Goodwin before all the facts became available.
Fraud investigators acknowledge that outcomes like these are typical. "My feeling is it's never a good career move to become a whistleblower," says Kay Fields, a scientific investigator for ORI, who depends on precisely this occurrence for misconduct cases to come to light. ORI officials estimate that between a third and half of nonclinical misconduct cases--those involving basic scientific research--are brought by postdoctoral fellows or graduate students like those in Goodwin's lab. And the ones who come forward, admits ORI's John Dahlberg, often suffer a "loss of time, loss of prestige, [and a] loss of credibility of your publications."
What is striking to me about this case is how well Goodwin's students dealt with the situation. Yet most of them will either be seriously delayed in getting through graduate school and fully entering their scientific community, or have left that scientific community altogether.
I've had occasion before to recommend C. K. Gunsalus's excellent paper "How to Blow the Whistle and Still Have a Career Afterwards."  As I repeat that recommendation, let me enumerate her "rules for responsible whistleblowing":
- Consider alternative explanations (especially that you may be wrong).
- In light of #1, ask questions, do not make charges.
- Figure out what documentation supports your concerns and where it is.
- Separate your personal and professional concerns.
- Assess your goals.
- Seek advice and listen to it.
and her "step-by-step procedures for responsible whistleblowing":
- Review your concern with someone you trust.
- Listen to what that person tells you.
- Get a second opinion and take that seriously, too.
- If you decide to initiate formal proceedings, seek strength in numbers.
- Find the right place to file charges; study the procedures.
- Report your concerns.
- Ask questions; keep notes.
- Cultivate patience!
The story of the Goodwin lab, as laid out in the Science article, doesn't address every single one of these points, but there's enough detail that it's evident the students were tracking Gunsalus's recommendations fairly well. When one of the students, Chantal Ly, was given a few pages of one of Goodwin's grant applications to help her get started on a new research project, she recognized data identified as unpublished in that proposal as coming from a 2004 publication out of the lab. She mentioned this worry to Garrett Padilla, another student in the lab who was already working in the area described by the grant application. Looking at the grant application, Padilla saw even more to be worried about:
"There was one experiment that I had just not done," as well as several published and unpublished figures that seemed to have been manipulated, he says. Two images apparently identical to those already published were presented as unpublished and as representing proteins different from the published versions. "I remember being overwhelmed and not being able to deal with it at that moment," says Padilla.
The students didn't go off half-cocked. They didn't immediately place a phone call to the NIH (the funding agency to which the grant application in question had been submitted) or to the ORI or to the university president. Instead, they sought advice from other scientists outside the department. Considering the evidence of the grant proposal and the relevant publications, those scientists counseled the students to bring their concerns to Goodwin and ask for an explanation, which Padilla did. (They also advised the students to keep a log of the events as they transpired, advice I heartily second.) Padilla's first meeting with Goodwin didn't yield a clear explanation so much as an agitated denial and, later, a claim that a "computer file mix-up" was the source of the misattribution of a published Western blot as unpublished data in the proposal. By the time of Padilla's second meeting with Goodwin, all the other members of the lab were aware of the concerns and were trying to deal with them together. By Padilla's account:
Goodwin asked for forgiveness and praised him for, as he wrote in the log, "pushing this issue." She told him that the grant application was unlikely to be funded--an assertion that turned out to be untrue given that NIH approved it--but offered to e-mail her NIH contact citing some of the problems in the application. Goodwin subsequently sent that e-mail, on which Padilla was copied. He left the encounter relieved. "At that point, I was pretty content to leave it alone," he says. "I felt like we had compromised on a resolution."
But this apparent admission and attempt to address the mistake wasn't quite the end of the matter. When told that one of the students, Mary Allen, intended to switch to another lab, Goodwin reeled off further defenses against the students' concerns: files had become unlabeled or had been mixed up, but no data had been faked! This left the students in Goodwin's lab uneasy. They were committed during this time to keeping the matter confidential, and together they talked through whether they should bring their suspicions to the administration. Indeed, the students committed to involving the administration only by unanimous agreement. Eventually, they reached that unanimous agreement and contacted the chair of the genetics department, Michael Culbertson, who in turn alerted university deans, who began an informal inquiry into the matter. At that stage, Goodwin offered yet another explanation:
She vigorously denied the charges against her, telling Culbertson and the students in a joint meeting that the figures in question were placeholders she had forgotten to swap out. According to Padilla's log of that meeting, Goodwin explained that she "was juggling too many commitments at once" when the proposal was submitted.
The informal inquiry recommended that a formal investigation of Goodwin be launched. Concerned about the potential for fraud in Goodwin's other two grant proposals (as well as in the one that raised the students' suspicions), the university arranged to cancel all three. Within a couple of months, Goodwin resigned. A few months after that, the results of the university investigation raised questions about three of Goodwin's published papers and "described 'evidence of deliberate falsification' in the three applications for the canceled grants, totaling $1.8 million in federal funds." And the university report noted problems beyond the papers and the grant applications:
"It appears from the testimony of her graduate students that Dr. Goodwin's mentoring of her graduate students included behaviors that could be considered scientific misconduct--namely, pressuring students to conceal research results that disagreed with desired outcomes and urging them to over-interpret data that the students themselves considered to be preliminary and weak," they wrote in their report.
In short, the university found evidence that the students' concerns were well-grounded. They blew the whistle on the problematic conduct, and a bad actor was taken out of the pool (or might well have been, had she not resigned first). Budding scientists who were concerned with scientific integrity in their community collected evidence, weighed it carefully, and brought their concerns forward rather than shrugging them off. You'd think this would be a good thing. Except, of course, for the fallout these budding scientists have had to endure. The research they had done in Goodwin's lab was not deemed usable for writing doctoral dissertations. According to the Science article, the main problem was "Goodwin's relentless optimism that some now believe kept them clinging to questionable results":
Allen, for example, says she sometimes argued but gave in to Goodwin's suggestions that she stick with molecular data Allen considered of dubious quality or steer clear of performing studies that might guard against bias. Ly, on her third, floundering project, says, "I thought I was doing something wrong experimentally that I couldn't repeat these things."
If you can't draw on the work you've already done, you have to start over. That may involve coming up to speed on an entirely new research system -- and there may be no good way to trim time off this "retooling" of the researcher. As well, it means finding a new advisor in whose lab you can make a new scientific home and get the guidance and support you need.

Given the reactions of Goodwin's colleagues during the university's investigation (reflexively siding with their colleague, viewing her students as "informers" who might not be doing the right thing, murmuring that Goodwin might have been driven to fabricate data to compensate for unproductive students), the chances of Goodwin's former students relocating happily within their department did not look good. What kind of reaction might they expect from scientists in their field at other schools where they might try -- from scratch -- to earn Ph.D.s?

Goodwin's students did everything we could reasonably ask them to do -- and more -- under the circumstances. They stood up to protect the integrity of scientific communications in their community (as well as to ensure that future graduate students would not be put in the same position) and, for their troubles, they lost the years they had put into their research, lost the goodwill of many of the faculty in their department, and, perhaps, lost their grounds for believing that their scientific community valued integrity and sought truth.
In the next post, I'll examine the community-level features that put the UW whistleblowers in a position to pay such a high price. I'll also try to sketch a picture of how the institutions -- especially the department -- might have responded differently.

------

Jennifer Couzin, "Truth and Consequences," Science, Vol. 313 (1 September 2006), 1222-1226.

C. K. Gunsalus, "How to Blow the Whistle and Still Have a Career Afterwards," Science and Engineering Ethics, Vol. 4 (1998), 51-64.
What a great post!
My field (stem cell biology) is one where several instances of scientific misconduct, and even more accusations of it, have occurred. My advisor, who is known for publishing contradictory results that have dampened some of the stem cell hype, is fond of pointing out that it was the junior scientists and trainees in Hwang Woo Suk's lab who brought that scandal to light. This fact has always offered me hope that the next generation of scientists will be less tolerant of misconduct. But maybe science just forces those ideals (or idealistic people) out the door?
As I read your post, I immediately was curious about what has happened with those whistle-blowing young scientists.
One thing that strikes me about this case: Dr. Goodwin was apparently convinced enough by her own fakery that she put all the trainees to work on stuff that couldn't possibly be productive, because it was based on faked data. This is monumentally dumb if you are a deliberate faker. What it suggests is that Dr. Goodwin was convinced, for whatever reasons, that she was onto something and was willing to do a little faking because she "knew" it should come out as hypothesized. The comment about bad grad students "driving her" to fake data is informative. Nevertheless, this gives us insight into one of the slippery slopes: being convinced you are right about something in advance of the empirical evidence. Scientific Method 101 should teach us that it doesn't matter if you've spent two decades proving your theory; each experiment should be judged by the empirical result, not by whether it fits the theory. Sadly, most humans are a little less rigorous than this...
So what happened to the publications that the whistleblowers know were clean ('cause they did the work), but have the advisor's name on them?
Excellent discussion. The Policy on Academic Misconduct at my school explicitly states something like "whistleblowers in scientific fraud and misconduct cases will be protected," but how much can that really help? It would seem as though it's getting tougher and tougher to resolve these situations, especially in this competitive era and in cases involving a superior and her students, but it sounds like Ly et al. handled this quite well.
Whistleblowing is dangerous to your career, to your health, and even to your life. It is also perceived by many as an act unbecoming of a "colleague." Many whistleblowers in the scientific community are boycotted by their peers or retaliated against by administrators who see the whistleblower as a traitor. Having experienced it myself, I can confidently state that without my tenure being secured at the time of my whistleblowing, I would have lost my job in a heartbeat. I have been boycotted by most of the faculty members of the department of the chairman on whom I blew the whistle; a grievance was filed against me by these members for interfering with their departmental affairs, a move that was strongly supported by the dean (an administrator) of the school, who demanded that the grievance committee (all scientists) reconsider their decision to exonerate me of any wrongdoing and instead find me guilty. Attempts at reducing my research space were made by the administration, and my department chair was forced out because he supported my freedom to blow the whistle by not succumbing to pressure from the dean to rein me in. The whole ordeal cost me my health (high blood pressure), lost time (a gap in my list of publications), money (I had to hire an attorney to provide legal advice during the grievance proceedings against me), and lost trust in the system that is supposed to root out the bad guys. Even publishing this ordeal as a book had to be done in such a way that my identity and the identities of all those involved were changed, mainly to protect myself from frivolous lawsuits and bodily harm (seriously):
Wow, S, that sounds like an extraordinarily malignant environment to be in.
How can this possibly be avoided? Creating a system of anonymous whistleblowing would inevitably lead both to shoddy investigations and frivolous accusations.
I strongly agree with S. Rivlin: "Whistleblowing is dangerous to your career, to your health and even to your life."
It is why I no longer have my (corrected for inflation) $120,000 per year job as a top Engineer on the Space Shuttle. It is why I was summarily booted after 5 semesters of rave reviews as adjunct professor of Math, a position that only existed because the previous person in that slot had blown the same whistle on the same alcoholic sociopath Dean.
It is why, this summer, I am to teach Math in a bottom-ranked High School, instead of still teaching Astronomy, Math, Computer Science, Physics, Biology, English Composition, or any of the other subjects which I am qualified to teach at the college level.
Yet I still would blow the whistle on fraud, corruption, plagiarism, or other existential threats from "enemies of the endeavor."
Because all we have as scientists or authors is our reputation, and our intellectual property. Those are both embedded in "the endeavor." We defend it, or have no civilization. But we defend at our personal peril.
I was personally impacted by fraud in my field, though it was far from me and my institution. It derailed a lot of my research plans, though, luckily, I had plenty to publish. The disappointment and dismay in my case were not career threatening, but quite disheartening. But it set me to thinking about what the scientific endeavor really is, outside the mythology erected around it.
Scientific method: I think that the very need for such a set of procedures and ethical standards is, frankly, an admission that people are not so ethical or logical or careful when left alone. People are good at believing bullshit, spouting nonsense, and feeling self-righteous tribal rage when anyone points out any of it in ourselves or in our fellow tribespeople/colleagues. It's part of being human. Scientific thought and procedure are specifically supposed to be a check on these things; even though scientists are (hopefully) expert at using these tools, there is still no reason to think that they are immune to the frailties that come with being human.
I think that the best way to enforce integrity is to keep the penalty for deliberate cheating essentially a scientific death sentence. There ought to be procedures for protecting those who blow the whistle, but I don't think science is alone in not having them in place. It will likely always be risky, because it usually involves a power struggle of some kind.
Science is precious. As a human endeavor, it is nonetheless doomed forever to reel from scandals and betrayals perpetrated by those whom the community has given its trust. The need to be diligent and courageous will not go away.
bdf said: "Wow, S, that sounds like an extraordinarily malignant environment to be in."
Yes, it is. And it is even more dangerous because frequently it either goes undetected until it bursts or stays undetected due to cover-up. There is also the general assumption that scientific misconduct is a minor problem that only very few are involved in, and that the scientific method will eventually uncover it. This is a misconception, and we all need to understand that fraud in science today is a booming problem. Only when we scientists realize that science as an endeavor is in real danger, and thus decide to kick the administrators out of the investigative process and police ourselves with strict rules against, and harsh consequences for, scientific misconduct, will we be able to curtail this malignancy.