Uncertainty and Panic

Brian Knutson, a very clever neuroeconomist at Stanford, sheds light on some of the cognitive biases currently holding back the economy over at Edge.org. From the perspective of the brain, uncertainty is hell:

The brain responds to uncertain future outcomes in a specific region, and ambiguity (not knowing the probabilities of uncertain outcomes) provokes even greater activation in this same region. Further, insular activation precedes risk avoidance in investment tasks, and is even more pronounced before people "irrationally" avoid risks (i.e., violate the choices of a risk-neutral, Bayesian-updating "rational" actor). Inflict enough ambiguity on enough people and you can immediately sense that they might lean towards risk aversion.

What are some implications of these findings for the current crisis? Presently, we need to put a price on ambiguous derivatives (a job for the economists). As long as the value of these contracts remains unresolved, this could generate ultra-uncertainty, which will promote fear, which will keep money in people's mattresses and out of the market. In the future, we should regulate (or incentivize) against contracts that resist pricing.

Colin Camerer has done one of the best investigations of this uncertainty circuit. His experiment revolved around a decision-making game known as the Ellsberg paradox. Camerer imaged the brains of people while they placed bets on whether the next card drawn from a deck of twenty cards would be red or black. At first, the players were told how many red cards and black cards were in the deck, so that they could calculate the probability of the next card being a certain color. The next gamble was trickier: subjects were only told the total number of cards in the deck. They had no idea how many red or black cards the deck contained.
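The two gambles can be sketched in a few lines of Python (a toy illustration, not Camerer's actual protocol; the 10/10 split in the known deck and the payoff of 1 per win are assumptions):

```python
import random

def draw_wins(deck, guess="red"):
    """Draw one card at random; the bet pays off if it matches the guess."""
    return random.choice(deck) == guess

# Gamble 1 - known risk: the player is told the exact split.
known_deck = ["red"] * 10 + ["black"] * 10

# Gamble 2 - ambiguity: only the deck's size is known; the split is hidden.
def ambiguous_deck(n=20):
    n_red = random.randint(0, n)  # hidden from the player
    return ["red"] * n_red + ["black"] * (n - n_red)

# Averaged over many rounds, the two gambles pay off equally often,
# which is why preferring the known deck can't be justified by the odds alone.
trials = 100_000
known_rate = sum(draw_wins(known_deck) for _ in range(trials)) / trials
ambig_rate = sum(draw_wins(ambiguous_deck()) for _ in range(trials)) / trials
```

People reliably prefer the first gamble even though, as the simulation suggests, neither bet is objectively better.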

The first gamble corresponds to the theoretical ideal of economics: investors face a set of known risks, and are able to make a decision based upon a few simple mathematical calculations. We know what we don't know, and can easily compensate for our uncertainty. As expected, this wager led to the "rational" parts of the brain becoming active, as subjects computed the odds. This is how we think when we know (or at least pretend we know) what the value of some exotic mortgage security is, or how much money GM or GE will make in the next year. In other words, this is how investors are supposed to make decisions.
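For the risk-neutral, Bayesian-updating actor mentioned in Knutson's quote, those "simple mathematical calculations" give the same answer for both decks: under a uniform prior over possible compositions, the ambiguous deck is just as likely to pay off. A minimal sketch (the payoff and the uniform prior are illustrative assumptions):

```python
def ev_known(n_red, n_total, payoff=1.0):
    """Expected value of betting on red when the deck composition is known."""
    return (n_red / n_total) * payoff

def ev_ambiguous(n_total, payoff=1.0):
    """Expected value under a uniform prior over all possible red counts."""
    # The marginal probability of red is the average of k/n_total
    # for k = 0..n_total, which works out to exactly 1/2.
    p_red = sum(range(n_total + 1)) / (n_total * (n_total + 1))
    return p_red * payoff

print(ev_known(10, 20))   # 0.5
print(ev_ambiguous(20))   # 0.5 - identical, so avoiding the ambiguous
                          # deck is "irrational" by this standard
```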

Unfortunately, this isn't how the real world works, especially during a financial downturn. Right now, the gambles of financial investors are clouded by ignorance and ambiguity; we know something about what might happen, but not very much. Will these mortgage securities have any value in twelve months? Will GM go bankrupt? How long will the downturn last? When Camerer played this more realistic gambling game, the subjects' brains reacted very differently. With less information, and thus more uncertainty, the players exhibited substantially more activity in the amygdala and in the orbitofrontal cortex, which is believed to modulate activity in the amygdala. In other words, they filled in the gaps of their knowledge with fear. That's also what's happening in the markets right now - nobody knows what will happen next, and so the end result is mild panic. Uncertainty sucks.

Camerer also tested patients with lesioned orbitofrontal cortices. (These patients are unable to generate and detect emotions.) Sure enough, because these patients couldn't feel fear, their brains treated both decks equally. Their amygdalas weren't excited by ambiguity, and didn't lead them astray. Because of their debilitating brain injury, these patients behaved perfectly rationally. They weren't scared by uncertainty.

Obviously, it's difficult to reduce something as amorphous as "uncertainty" to a few isolated brain regions. But I think Camerer is right to argue that his "data suggests a general neural circuit responding to degrees of uncertainty, contrary to decision theory."


Knutson and collaborators, especially Camelia Kuhnen, have made very interesting observations about the neurofinance of risk (you can tell whether someone is predisposed to take risks or is risk-averse from the pattern of activation in certain brain areas), and Kuhnen even anticipated some of the causes of the current global recession: the hyperbolic pay schemes of CEOs.
Good stuff!

I have written extensively about the part criteria play in decision making. While I've put my Decision Facilitation model (called Buying Facilitation) into the field of sales, it can be universally applied. That said, my model and theory rest on the belief that underlying all decisions are internal, idiosyncratic, and often unconscious criteria that filter our experience and bias our responses and actions.

I have found that much of the current experimentation in decision making does not address this bias, and instead uses 'information' as the basis of how psychologists explain the results of experiments and behaviors.

Criteria can be shifted, but only by shifting values and beliefs, not by offering different information, since the information will be filtered out if it runs counter to closely held criteria.

To say that the orbitofrontal patients "behaved perfectly rationally" is not really accurate. Patients with orbitofrontal lesions have difficulty using emotional information to help make rational decisions. These patients, on the contrary, are known for taking unreasonable risks and for not learning from past negative outcomes - see the extensive research by Antonio Damasio, Antoine Bechara, and others for ample documentation of this.

Adapting one's behavior in the face of uncertainty is rational. It is not being able to modify one's behavior based on emotional input from the frontolimbic system that leads to problems - at least for the individual, if not for the economy.