Are we psychologically prone to be more hawkish?

Daniel Kahneman and Jonathan Renshon, writing in Foreign Policy, put forward a fascinating thesis: because of the way human beings are organized psychologically, we are prone to be more hawkish. The thrust of their argument is that social and cognitive psychologists have documented numerous psychological errors that human beings make consistently; in this sense, human beings are not perfectly rational. Where our judgment consistently departs from pure rationality, we make mistakes. With respect to foreign policy, these mistakes make us more hawkish because they place a higher burden of proof on the doves.

Money quote:

Social and cognitive psychologists have identified a number of predictable errors (psychologists call them biases) in the ways that humans judge situations and evaluate risks. Biases have been documented both in the laboratory and in the real world, mostly in situations that have no connection to international politics. For example, people are prone to exaggerating their strengths: About 80 percent of us believe that our driving skills are better than average. In situations of potential conflict, the same optimistic bias makes politicians and generals receptive to advisors who offer highly favorable estimates of the outcomes of war. Such a predisposition, often shared by leaders on both sides of a conflict, is likely to produce a disaster. And this is not an isolated example.

In fact, when we constructed a list of the biases uncovered in 40 years of psychological research, we were startled by what we found: All the biases in our list favor hawks. These psychological impulses--only a few of which we discuss here--incline national leaders to exaggerate the evil intentions of adversaries, to misjudge how adversaries perceive them, to be overly sanguine when hostilities start, and overly reluctant to make necessary concessions in negotiations. In short, these biases have the effect of making wars more likely to begin and more difficult to end.

They go on to give some examples of how these biases affect our evaluation of world events and what to do about them. One bias is fundamental attribution error:

Imagine, for example, that you have been placed in a room and asked to watch a series of student speeches on the policies of Venezuelan leader Hugo Chavez. You've been told in advance that the students were assigned the task of either attacking or supporting Chavez and had no choice in the matter. Now, suppose that you are then asked to assess the political leanings of these students. Shrewd observers, of course, would factor in the context and adjust their assessments accordingly. A student who gave an enthusiastic pro-Chavez speech was merely doing what she was told, not revealing anything about her true attitudes. In fact, many experiments suggest that people would overwhelmingly rate the pro-Chavez speakers as more leftist. Even when alerted to context that should affect their judgment, people tend to ignore it. Instead, they attribute the behavior they see to the person's nature, character, or persistent motives. This bias is so robust and common that social psychologists have given it a lofty title: They call it the fundamental attribution error.

Another example is aversion to loss:

It is apparent that hawks often have the upper hand as decision makers wrestle with questions of war and peace. And those advantages do not disappear as soon as the first bullets have flown. As the strategic calculus shifts to territory won or lost and casualties suffered, a new idiosyncrasy in human decision making appears: our deep-seated aversion to cutting our losses. Imagine, for example, the choice between:

Option A: A sure loss of $890

Option B: A 90 percent chance to lose $1,000 and a 10 percent chance to lose nothing.

In this situation, a large majority of decision makers will prefer the gamble in Option B, even though the other choice is statistically superior. People prefer to avoid a certain loss in favor of a potential loss, even if they risk losing significantly more. When things are going badly in a conflict, the aversion to cutting one's losses, often compounded by wishful thinking, is likely to dominate the calculus of the losing side. This brew of psychological factors tends to cause conflicts to endure long beyond the point where a reasonable observer would see the outcome as a near certainty. Many other factors pull in the same direction, notably the fact that for the leaders who have led their nation to the brink of defeat, the consequences of giving up will usually not be worse if the conflict is prolonged, even if they are worse for the citizens they lead.
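To make the arithmetic explicit, here is a quick sketch of the expected values behind the two options (the dollar figures come from the quoted passage; the variable names are mine):

```python
# Expected-value comparison of the two options quoted above.
# Dollar amounts come from the article; names are illustrative.

option_a = -890.0  # Option A: a sure loss of $890

# Option B: 90% chance to lose $1,000, 10% chance to lose nothing
option_b = 0.90 * -1000.0 + 0.10 * 0.0

print(f"Option A expected value: {option_a}")  # -890.0
print(f"Option B expected value: {option_b}")  # -900.0
print(option_a > option_b)  # True: the sure loss is the smaller expected loss
```

So the gamble costs $10 more in expectation, yet most people still prefer it; that is the loss aversion the authors are describing.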

Loss aversion was clearly involved in our behavior in Vietnam, and it is clearly involved in deciding what to do about Iraq.

I find this article fascinating, but I am a little skeptical of the conclusion, for the following reasons:

  • 1) It gives doves a free ride. Hawks have been out of favor lately because of the Iraq war, but doves have made numerous calculations that have proved to be incorrect. You could argue that assuming sanctions would be effective against the Saddam regime was unduly optimistic, given the widespread corruption of the Oil-for-Food program. Likewise, in the first Gulf War, many suggested that it would be long and bloody, yet it finished in several days when Saddam's military evaporated.

    The suggestion that hawks are more prone to psychological bias is unfair. The consequences of bias for hawks may be more dire, but I don't think they are more prone to it.

  • 2) The terminology of hawk and dove is questionable and a flagrant oversimplification. Daniel Drezner has similar concerns.
  • 3) Fundamentally, I question the utility of this information. Yes, people often make mistakes of judgment, but does knowing that tell you what to do in any specific case? The authors suggest a cognizance of this problem at the end of their article:


    The clear evidence of a psychological bias in favor of aggressive outcomes cannot decide the perennial debates between the hawks and the doves. It won't point the international community in a clear direction on Iran or North Korea. But understanding the biases that most of us harbor can at least help ensure that the hawks don't win more arguments than they should.

    Basically, they seem to be arguing that if we recognize that humans have psychological biases, and we recognize that hawks are more prone to them (again, I question that), we should place the burden of proof on the hawks instead. I guess that sounds reasonable if you accept their argument, but I still don't find it particularly enlightening if you are trying to figure out what to do about North Korea. What if this is a case where the hawks are right? You don't know.

Others have been commenting on this article, and I find several of their comments enlightening.

Daniel Drezner loves the article and has several comments:

4) This theory massively overpredicts war as an outcome. If one accepts this argument, then one would also have to explain why war has been such a historically rare event -- and it's been getting rarer. There are a lot of countervailing factors that the authors don't mention, including but not limited to bureaucratic politics, domestic politics, regime type, balance of power, etc.

5) Organizations act as a particularly powerful constraint on cognitive limitations. This is one point of the original Carnegie school of organizational behavior.

6) I'm not sure Democrats want to be too enthusiastic about this finding. Let's have some fun and apply these cognitive biases to the domestic policy of liberals*. Hmmm..... so liberals will be likely to demonize their political opponents and misread their intentions.... they'll be excessively optimistic and prone to an "illusion of control" in their domestic policy ambitions.... and they'll double down on ambitious social programs that look like they're not working terribly well (cough, health care, cough). Run, run for your lives!!!

Matthew Continetti, also writing in Foreign Policy, has similar skepticism about attributing the errors to hawks:

First comes the "fundamental attribution error," when subjects attribute the behavior of others to their "nature, character, or persistent motives" rather than to the context in which they are forced to operate. What some see as a sensible approach to making decisions, Kahneman and Renshon view as a psychological error that may, in times of conflict, lead to "pernicious" results such as war. Hawks who fail to understand their adversaries' true motives may be too quick to resort to force. World War I is the quintessential example.

Yet why do only the fundamental attribution errors of hawks lead to "pernicious" effects? Doves share the same bias; it just works in different ways. If hawks treat hostile behavior at face value when they shouldn't, so too do doves treat docility. Those who championed the 1973 accords ending the Vietnam War saw them as a chance for the United States to leave Vietnam while preserving the sovereignty of the south. But to North Vietnamese eyes, the cease-fire was merely an opportunity to consolidate their forces for the final seizure of the south, which happened a mere two years later.

The second hawk bias Kahneman and Renshon identify is "excessive optimism," which the authors speculate "led American policymakers astray as they laid the groundwork for the current war in Iraq." Yet prior to the war in Iraq, some hawks worried that Saddam Hussein might set oil fields ablaze, as he had done in 1991. They worried that he might launch missiles against American allies in the region, that his removal might be long and bloody, and that post-Saddam Iraq would face humanitarian crises of great magnitude. Doves optimistically argued that Saddam could be "contained" even as the sanctions against him were unraveling and as America's military presence in Saudi Arabia became increasingly untenable.

Why Kahneman and Renshon limit the biases they identify to hawks is something of a mystery. Take "reactive devaluation," or "what was said matters less than who said it." They cite likely American skepticism over any forthcoming Iranian nuclear concessions as an example, albeit conceding that doubt may be warranted in this case. They could have cited a domestic case instead: Just as many Republicans opposed President Clinton's interventions in Haiti, Bosnia, and Kosovo, and at one point even accused him of resorting to force in order to distract from the Monica Lewinsky scandal, many Democrats now oppose Bush administration policies sight unseen because they don't like the messenger. Doves are just as susceptible to reactive devaluation as hawks.

Matthew Yglesias argues that the authors are being too generous and not taking the conclusion far enough:

Consequently, Kahneman and Renshon actually end up being unduly generous to the hawkish point of view. Sometimes, of course, war is necessary. But since there are two sides to every conflict, hawks won't always be right. Even in a case where an American president is rightly listening to his hawkish advisors (George H.W. Bush in the first Gulf War, say, or Bill Clinton over Kosovo), a foreign leader (Saddam Hussein, Slobodan Milosevic) is making a serious miscalculation in listening to his hawkish advisors.

In short, most decisions to go to war have been mistakes. Sometimes, as in World War I, both sides are making a mistake, and other times, as in World War II, only one side is, but the upshot is that the impulse to launch wars is more widespread than it ought to be. Indeed, hawks themselves recognize this fact. Pro-war arguments almost always contend that the enemy is irrationally aggressive, while overestimating one's own military capabilities. Where the hawks go wrong is in their belief that irrational exuberance about violence is the exclusive province of real or potential adversaries, rather than a problem from which they themselves may suffer.

Unfortunately, Kahneman and Renshon shy away from pushing their psychological analysis into the policy domain, writing that "the clear evidence of a psychological bias in favor of aggressive outcomes ... won't point the international community in a clear direction on Iran or North Korea." In fact, the implications are rather clear. As members of the Bush administration admit, the decision to rebuff repeated diplomatic initiatives from Tehran was probably influenced in part by irrational "reactive devaluation," defined by the authors as "the very fact that a concession is offered by somebody perceived as untrustworthy undermines the content of the proposal."

I think this article is an awesome example of applied psychology, and I look forward to reading more debate about it.


"if we recognize that humans have psychological biases, and we recognize that hawks are more prone to them ..."

This makes it sound as if people-who-are-hawks are afflicted with a bad case of the psychological biases, rather than humans having biases that make us more disposed to form, accept, or carry out hawkish ideas, opinions, and actions, as the article seemed to claim. Or am I confused?

I agree with you on the dove/hawk dichotomy; it's way oversimplified. Dove/hawk is really not much more than good cop/bad cop extended to foreign policy. Most of those who whine and moan about Bush's interventions were invariably enthusiastic about those Clinton engaged in. Wilsonianism has been enjoying a vogue on both sides of the aisle since the end of WWII.

It is important to note that the article in FP was not intended to be comprehensive; it is a feature article that raises the main points of the authors' main line of research. Of course it is oversimplified; it's a magazine article, not a thesis.

Furthermore, if you read the article carefully, it is not judgmental at all. It makes no judgment about the current Administration, or any past Administration. Yet many of the critics seem to be unable to avoid the false conclusion that the article is critical of certain politicians or political groups.

My main criticism of the article is that it does not mention the profit motive as one form of bias that can promote belligerence.

(Speaking as a "dove") Perhaps the worst dove-ish call in history: Appeasement.