The Certainty Bias

Over at Mind Matters, I've got an interview with Dr. Robert Burton on the danger of certainty and its relevance during a presidential election:

LEHRER: To what extent does the certainty bias come into play during a presidential election? It seems like we all turn into such partisan hacks every four years, completely certain that our side is right.

BURTON: The present presidential debates and associated media commentary feel like laboratory confirmation that the involuntary feeling of certainty plays a greater role in decision-making than conscious contemplation and reason.

I suspect that retreat into absolute ideologies is accentuated during periods of confusion, lack of governmental direction, economic chaos and information overload. At bottom, we are pattern recognizers who seek escape from ambiguity and indecision. If a major brain function is to maintain mental homeostasis, it is understandable how stances of certainty can counteract anxiety and apprehension. Even though I know better, I find myself somewhat reassured (albeit temporarily) by absolute comments such as, "the stock market always recovers," even when I realize that this may be only wishful thinking.

Sadly, my cynical side also suspects that political advisors use this knowledge of the biology of certainty to actively manipulate public opinion. Nuance is abandoned in favor of absolutes.

Why are people so eager for certainty? I think part of the answer is revealed in an interesting Science paper by Colin Camerer and colleagues. The experiment revolved around a decision-making game known as the Ellsberg paradox. Camerer imaged the brains of people while they placed bets on whether the next card drawn from a deck of twenty cards would be red or black. At first, the players were told how many red cards and black cards were in the deck, so that they could calculate the probability of the next card being a certain color. The next gamble was trickier: subjects were only told the total number of cards in the deck. They had no idea how many red or black cards the deck contained.
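The difference between the two gambles can be made concrete with a quick sketch. This is not Camerer's actual protocol, just an illustration: in the known-risk case the expected payoff is a simple calculation, while one (assumed) way to model the ambiguous deck is to treat its composition as unknown and drawn at random.

```python
import random

def expected_payoff_known(n_red, n_black, bet="red", payoff=1.0):
    """Risk: the deck's composition is known, so the odds are arithmetic."""
    total = n_red + n_black
    p_win = n_red / total if bet == "red" else n_black / total
    return p_win * payoff

def simulate_ambiguous(total_cards=20, bet="red", payoff=1.0, trials=10000):
    """Ambiguity: the red/black split is unknown. Here we model that by
    drawing the composition uniformly at random each trial -- one
    illustrative assumption, not the only way to formalize ignorance."""
    wins = 0
    for _ in range(trials):
        n_red = random.randint(0, total_cards)
        deck = ["red"] * n_red + ["black"] * (total_cards - n_red)
        if random.choice(deck) == bet:
            wins += 1
    return wins / trials * payoff

# With a known 10-red / 10-black deck, the bet is a coin flip:
print(expected_payoff_known(10, 10))  # -> 0.5
```

Notice that under this symmetric model the ambiguous bet has roughly the same long-run payoff as the known 50/50 deck; the two gambles differ in information, not in expected value, which is why the brain's very different reaction to them is so striking.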

The first gamble corresponds to the theoretical ideal of economics: investors face a set of known risks, and are able to make a decision based upon a few simple mathematical calculations. We know what we don't know, and can easily compensate for our uncertainty. As expected, this wager led to the "rational" parts of the brain becoming active, as subjects simply computed the odds. Unfortunately, this isn't how the real world works. In reality, our gambles are clouded by ignorance and ambiguity; we know something about what might happen, but not very much. When Camerer played this more realistic gambling game, the subjects' brains reacted very differently. With less information to go on, the players exhibited substantially more activity in the amygdala and in the orbitofrontal cortex, which is believed to modulate activity in the amygdala. In other words, we filled in the gaps of our knowledge with fear.

I'd argue that it's this subtle stab of fear that creates our bias for certainty. Not knowing makes us uneasy, and we always try to minimize such negative feelings. As a result, we pretend that we have better intelligence about Iraqi WMD than we actually do, or we make believe that the subprime debt being bought and sold on Wall Street is really safe. In other words, we selectively interpret the facts until the uncertainty is removed.

Camerer also tested patients with lesioned orbitofrontal cortices. (These patients are unable to generate and detect emotions.) Sure enough, because these patients couldn't feel fear, their brains treated both decks equally. Their amygdalas weren't excited by ambiguity, and didn't lead them astray. Because of their debilitating brain injury, these patients behaved perfectly rationally. They exhibited no bias for certainty.

Obviously, it's difficult to reduce something as amorphous as "uncertainty" to a few isolated brain regions. But I think Camerer is right to argue that his "data suggests a general neural circuit responding to degrees of uncertainty, contrary to decision theory."


Makes a nice contrast with the idea of a "climate of fear" and government control techniques.

By Richard Eis (not verified) on 15 Oct 2008 #permalink

This makes me wonder: what happens when our certainty is confronted with disconfirming evidence? Presumably we discount it, but what is going on neurally at that moment is an interesting question.

Thank you for this fascinating paper.

Before I was hired by a large international company, I was working on a doctoral thesis about "information theory and value theory in economics". At the time, I chose certainty, so I concentrated on my new job...

My thesis was that economic agents seek "information" which, in the sense of information theory, is equivalent to fighting uncertainty. It seems that as economic agents we value goods and services that we believe capable of reducing our own uncertainties.

Over my career I kept observing human behaviors through the filter of this hypothesis. As an internal consultant for the company that hired me I had plenty of opportunities to test it...

I won't develop this any further here, but if anyone is interested I'd be glad to correspond.