The Certainty Bias

Over at the academic blog Overcoming Bias, Arnold Kling makes a good point:

Before the Iraq invasion, President Bush did not say, "I think that there is a 60 percent chance that Saddam has an active WMD program."

Al Gore does not say, "I think there is a 2 percent chance that if we do nothing there will be an environmental catastrophe that will end life as we know it."

Instead, they speak in the language of certainty. I assume that as political leaders they know a lot better than I do how to speak to the general population. So I infer that, relative to me, the public has a bias toward certainty.

Another piece of evidence of that is an anecdote I cite about my (former) doctor. Several years ago, he said I needed a test. I did some research and some Bayes' Theorem calculations, and I faxed him a note saying that I did not think the test was worth it. He became irate.
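For readers who want to see what such a back-of-the-envelope calculation looks like, here is a minimal sketch in Python. The numbers (a 1 percent prevalence, 90 percent sensitivity, 9 percent false-positive rate) are purely illustrative assumptions, not Kling's actual figures:

    def posterior_given_positive(prior, sensitivity, false_positive_rate):
        """Bayes' theorem: probability of disease given a positive test result."""
        p_positive = sensitivity * prior + false_positive_rate * (1 - prior)
        return sensitivity * prior / p_positive

    # Illustrative numbers only: 1% prevalence, 90% sensitivity, 9% false positives.
    print(posterior_given_positive(0.01, 0.90, 0.09))  # ~0.09

Even with a fairly accurate test, a positive result under these assumptions would mean only about a 9 percent chance of disease. It is easy to see how a calculation like this could make a test look not worth running.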

I think that one reason that our health care system works the way it does is that it does not occur to anyone to say, "OK, I can live with that level of uncertainty." Instead, we must have the MRI, or the CT scan, or whatever. Even, as in my case, when the patient is willing to live with uncertainty, the doctor has a problem with it.

Another way that bias toward certainty shows up is in the way we handle disagreement. People don't say that there were differences within the intelligence community about the probability distribution for Saddam having WMD. They say that Bush manipulated the intelligence. And they are right, in the sense that he tried to make it sound certain.

Why are people so eager for certainty? I think part of the answer is revealed in an interesting Science paper by Colin Camerer.

Camerer's experiment revolved around a decision-making game known as the Ellsberg paradox. Camerer imaged the brains of people while they placed bets on whether the next card drawn from a deck of twenty cards would be red or black. At first, the players were told how many red cards and black cards were in the deck, so that they could calculate the probability of the next card being a certain color. The next gamble was trickier: subjects were told only the total number of cards in the deck. They had no idea how many of them were red or black.

The first gamble corresponds to the theoretical ideal of economics: investors face a set of known risks and are able to make a decision based upon a few simple mathematical calculations. We know what we don't know, and can easily compensate for our uncertainty. As expected, this wager led to the "rational" parts of the brain becoming active as subjects computed the odds. Unfortunately, this isn't how the real world works. In reality, our gambles are clouded by ignorance and ambiguity; we know something about what might happen, but not very much. (For example, it's now clear just how little we actually knew about Iraq pre-invasion.)

When Camerer played this more realistic gambling game, the subjects' brains reacted very differently. With less information to go on, the players exhibited substantially more activity in the amygdala and in the orbitofrontal cortex, which is believed to modulate activity in the amygdala. In other words, we fill in the gaps of our knowledge with fear. That fear creates our bias for certainty, since we always try to minimize our feelings of fear. As a result, we pretend that we have better intelligence about Iraqi WMD than we actually do; we selectively interpret the facts until the uncertainty is removed.
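To make the contrast concrete, here is a minimal sketch of the two gambles in Python. The $10 payoff and the maxmin (worst-case) model of ambiguity aversion are my own illustrative assumptions, not details from Camerer's paper. Textbook decision theory with a uniform prior values the ambiguous deck exactly like the known one; a worst-case evaluator does not:

    PAYOFF = 10.0  # illustrative stake, not from the actual experiment

    def expected_value(p_red, payoff=PAYOFF):
        """Expected value of a bet on red when red comes up with probability p_red."""
        return p_red * payoff

    # Known deck: 10 red and 10 black cards out of 20, so p(red) = 0.5.
    known_deck = expected_value(10 / 20)  # 5.0

    # Ambiguous deck: anywhere from 0 to 20 of the cards could be red.
    candidate_ps = [n / 20 for n in range(21)]

    # Standard decision theory: average over a uniform prior on compositions.
    uniform_prior = sum(expected_value(p) for p in candidate_ps) / len(candidate_ps)  # 5.0

    # Ambiguity aversion, modeled as maxmin: judge the bet by its worst case.
    worst_case = min(expected_value(p) for p in candidate_ps)  # 0.0

    print(known_deck, uniform_prior, worst_case)

On paper the two decks are worth the same; the Ellsberg finding is that people reliably pay a premium to avoid the ambiguous one, which is the behavior Camerer traced to the amygdala.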

Camerer also tested patients with lesioned orbitofrontal cortices. (These patients are unable to generate and detect emotions.) Sure enough, because these patients couldn't feel fear, their brains treated both decks equally. Their amygdalas weren't excited by ambiguity, and didn't lead them astray. Because of their debilitating brain injury, these patients behaved perfectly rationally. They exhibited no bias for certainty.

Obviously, it's difficult to reduce something as amorphous as "uncertainty" to a few isolated brain regions. But I think Camerer is right to argue that his "data suggests a general neural circuit responding to degrees of uncertainty, contrary to decision theory."

If we could educate our leaders about this bias for certainty, perhaps they would be less confident in their assertions, especially when it comes to matters of war. Another possibility is to outsource our war decisions to somebody with a damaged orbitofrontal cortex, although that is probably a terrible idea.

Freeman Dyson's excellent book "Imagined Worlds" covers the more social dimension of this same topic. Your introduction made me think of this.

Conviction is considered a virtue among politicians, and political decisions are *never* actually made under conditions of complete certainty to begin with. Conviction, in a sense, is all you can have.

The world of scientists and engineers is quite different: there, conviction is almost a vice, and one needs to stay open to the data. And indeed, many scientific and engineering decisions can be made under conditions of near-complete certainty.

I saw this desire for certainty in my own work (military contractor). When people asked me a question, I told them about the uncertainties and gave a rational estimate of the "right" answer. A coworker simply gave them a yes-no answer. They preferred the simple answer even if it later turned out to be incorrect.

Scientific training vs. a lack thereof can definitely cause a disconnect. My wife will sometimes take a strong stand on, say, national economic policy, then ask me what I think. I'll say that I don't really know anything about the particular issue, and she'll complain, "But you must have some kind of opinion about it!" Well, no, I don't have data, so how would I have an opinion? But then I remember how much practice & training it took to make my default position 'reserve judgment' ...

By Scott Simmons on 20 Feb 2007

I know what you mean about doctors. I once asked a doctor about the error rates on a test and she said there were no errors. I found a new doctor shortly thereafter.

Politicians have the task of persuading people to follow their lead; otherwise we wouldn't call them leaders. So they use rhetoric to convey the reasons for the specific directions they take in making decisions. Uncertainty implies lack of direction. Certainty implies direction. The latter is not merely, or mostly, an involuntary reaction to fear.

I fail to see how observations of brain activity indicate very much of anything about how people use their brains. They seem more like associations to me. Look up the Brain Ball experiment. It is an interesting example of how much culture infuses our perception of what kind of brain activity is associated with competition.

I'll buy and read your book on the strength of the title alone. Looking forward to it.

I'm not sure I agree with Camerer's conclusions. How do we know he's not influenced by certainty bias?

--
Furry cows moo and decompress.