Osama, Saddam, and Inferred Justification

href="http://www.researchblogging.org"> alt="ResearchBlogging.org"
src="http://www.researchblogging.org/public/citation_icons/rb2_large_gray.png"
style="border: 0pt none ;">

A fair amount has been written about the topic of motivated reasoning.  Jonah Lehrer explains the relationship between motivated reasoning and the political process (http://scienceblogs.com/cortex/2008/09/motivated_reasoning.php); Orac addresses the issue with regard to quantum woo (http://scienceblogs.com/insolence/2009/04/why_projection_isnt_all_its_cracked_up_t.php).  There is more at Mixing Memory (http://mixingmemory.blogspot.com/search?q=motivated+reasoning).



Getting back to the political realm, now comes a research article about a study of the mother of all misperceptions: the supposed link between Saddam Hussein and the attacks of 11 September 2001.


href="http://www3.interscience.wiley.com/journal/122260824/abstract">"There
Must Be a Reason": Osama, Saddam, and Inferred Justification

Monica Prasad, Andrew J. Perrin, Kieran Bezila, Steve G. Hoffman, Kate
Kindleberger, Kim Manturuk, and Ashleigh Smith Powers


One of the most curious aspects of the 2004
presidential election was
the strength and resilience of the belief among many Americans that
Saddam Hussein was linked to the terrorist attacks of September 11.
Scholars have suggested that this belief was the result of a campaign
of false information and innuendo from the Bush administration. We call
this the information environment explanation. Using a technique of
"challenge interviews" on a sample of voters who reported believing in
a link between Saddam and 9/11, we propose instead a social
psychological explanation for the belief in this link. We identify a
number of social psychological mechanisms voters use to maintain false
beliefs in the face of disconfirming information, and we show that for
a subset of voters the main reason to believe in the link was that it
made sense of the administration's decision to go to war against Iraq.
We call this inferred justification: for these voters, the fact of the
war led to a search for a justification for it, which led them to infer
the existence of ties between Iraq and 9/11.



Although the article is not openly accessible, there is a PDF available on the Internet (http://sociology.buffalo.edu/documents/hoffmansocinquiryarticle_000.pdf).



The authors note:


Ronald Reagan once remarked that "the trouble with
our liberal friends is not that they are ignorant, but that they know so
much that isn't so" (Reagan 1964). His comment goes to the heart of one
of the most contentious issues in democratic theory: how should
democracies handle mistaken beliefs? False beliefs present a
potentially serious challenge to democratic theory and practice, as
citizens with incorrect information cannot form appropriate preferences
or evaluate the preferences of others.



As an aside, I will comment that one of the things that will cause me to lose respect for a politician is when people are known to hold false beliefs about the politician's positions, yet the politician does nothing to dispel those errors.  Bush and Cheney were notorious for allowing mistaken beliefs to persist when those mistakes were politically advantageous.  Note that I am not accusing them of having deliberately spread misinformation.



The authors, likewise, sidestep the question of whether the widespread belief in the Saddam-9/11 link was the result of propaganda.  Researchers who attribute the mistaken belief to propaganda tend to view the mistake as a result of faulty access to information, or of faulty reasoning:


We characterize this explanation as being about the
information environment: it implies that if voters had possessed the
correct information, they would not have believed in the link.
Underlying this explanation is a psychological model of information
processing that scholars have labeled "Bayesian updating," which
envisions decision makers incrementally and rationally changing their
opinions in accordance with new information (Gerber and Green 1999).
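
To make the contrast concrete, here is a minimal sketch of Bayesian updating, with all numbers invented for illustration (the paper contains no such model).  An ideal Bayesian voter who starts out confident in the link, then hears a denial from a trusted source, revises the belief substantially downward:

```python
# A minimal sketch of Bayesian updating. All numbers are hypothetical,
# chosen only to illustrate the mechanism.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return the posterior P(link | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Prior: the voter is fairly confident in the link (made-up value).
prior = 0.80
# A trusted source issues a denial; such a denial is much more likely
# if the link is false than if it is true (made-up likelihoods).
posterior = bayes_update(prior, p_evidence_if_true=0.10,
                         p_evidence_if_false=0.70)
print(f"belief after disconfirming evidence: {posterior:.2f}")  # 0.36
```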



Hmmm.  I am reasonably confident that the puppets of the Bush administration who regurgitated this garbage were not really too keen on Bayesian updating.  So I agree with Prasad et al. already.  They present an alternative view:


We argue that the primary causal agent for
misperception is not the presence or absence of correct information but
a respondent's willingness to believe particular kinds of information.
Our explanation draws on a psychological model of information
processing that scholars have labeled motivated reasoning. This model
envisions respondents as processing and responding to information
defensively, accepting and seeking out confirming information, while
ignoring, discrediting the source of, or arguing against the substance
of contrary information (DiMaggio 1997; Kunda 1990; Lodge and Tabor
2000).
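
By contrast, a motivated reasoner can be sketched as someone who discredits the source of unwelcome evidence before updating.  In this toy model (again with hypothetical numbers, and a discount parameter that is my own invention, not anything from the paper), the same disconfirming evidence barely moves the belief:

```python
# A toy contrast with the Bayesian sketch above: under motivated
# reasoning, disconfirming evidence is discounted (its source is
# discredited) before it reaches the update step.

def motivated_update(prior, p_if_true, p_if_false, discount=0.9):
    """Shrink the diagnosticity of unwelcome evidence toward neutral."""
    # Pull both likelihoods toward their mean, so the evidence carries
    # almost no information about the proposition.
    mean = (p_if_true + p_if_false) / 2.0
    p_if_true = mean + (p_if_true - mean) * (1.0 - discount)
    p_if_false = mean + (p_if_false - mean) * (1.0 - discount)
    numerator = p_if_true * prior
    return numerator / (numerator + p_if_false * (1.0 - prior))

# Same prior and evidence as before, but the belief barely moves.
print(f"{motivated_update(0.80, 0.10, 0.70):.2f}")  # 0.77
```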



Some may see the connection between motivated reasoning and the notion of cognitive dissonance.  The authors explain that connection in their article.  Basically, the idea is that people do not like to be consciously aware of holding contradictory beliefs.  It is tolerable to hold contradictory beliefs, so long as one is not aware of the contradiction; it is unpleasant to hold contradictory beliefs AND be aware of the contradiction.  When people have this unpleasant feeling, they do various things to make the unpleasantness go away.  If something works, they keep doing it.  This is an example of negative reinforcement.



One of the things they do is ignore contradictory evidence.  The authors review this and other cognitive errors that come into play.  One such error is confirmation bias:


This confirmation bias means that people value evidence that confirms their previously held beliefs more highly than evidence that contradicts them, regardless of the source (DiMaggio 1997; Nickerson 1998; Wason 1968).



One could say that they impute greater salience to evidence that
confirms their existing belief system.  



The authors then discuss the fact that people tend to be heavily
influenced by social cues.  A given social situation often
will influence a person to make use of situational heuristics.


Decision making is not only a process of matching
external heuristics such as party or appearance to preferences;
important cues about how to act are revealed to the agent by the
situation itself.



They discuss various studies that explore and confirm this phenomenon.
 This leads to their hypothesis:


We argue that some citizens believe leaders would not
take an action as drastic as war if it were not justified. They then
develop affective ties to this conclusion and seek information that
confirms it while dismissing information that contradicts it, producing
the correlation between information and belief. These social
psychological processes were an important feature of the misperception of a link between Saddam and 9/11. The belief in this link was so resilient because it made sense of the administration's decision to go to war against Iraq.



They used surveys and interviews to do their study.  First they surveyed voters about their beliefs regarding the link between Saddam and September 11.  They then followed up with interviews, in which voters were presented with information that contradicted their beliefs.



In order to interpret the results, it is important to know how subjects
were selected.  They provide this information:


We selected counties in the researchers' home states
that had below-average income, a majority white population, and had
voted for Bush in 2000.  We then selected the precinct within
each county that had gone most heavily Republican in 2000 and
identified potential respondents using publicly available voter
registration data.
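
The selection procedure amounts to a simple filter over county-level data.  Here is a hedged sketch, using invented county records and a hypothetical income benchmark (the paper publishes no such dataset), just to make the three criteria explicit:

```python
# A sketch of the county-selection filter described above. The county
# records and the income benchmark are invented for illustration.

counties = [
    {"county": "A", "income": 31000, "pct_white": 0.72, "bush_2000": 0.58},
    {"county": "B", "income": 45000, "pct_white": 0.55, "bush_2000": 0.47},
    {"county": "C", "income": 29000, "pct_white": 0.81, "bush_2000": 0.61},
]
average_income = 42000  # hypothetical national average

eligible = [
    c["county"]
    for c in counties
    if c["income"] < average_income   # below-average income
    and c["pct_white"] > 0.50         # majority white population
    and c["bush_2000"] > 0.50         # voted for Bush in 2000
]
print(eligible)  # ['A', 'C']
```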



It may seem as though they were deliberately picking on poor white
conservatives, but there was a rationale.  They take pains to
point out that they expect that they could develop similar results with
surveys of Democrats: "Previous research on motivated reasoning has
found it among respondents of all classes, ages, races, genders, and
affiliations."



In order to interpret the study, it also is necessary to know how the interviews were conducted.  The interviews followed a methodology of the authors' own design, which they call a challenge interview.  They discuss that, and provide the details of the information given to subjects.



Then, they discuss their methodology for assessing their hypothesis.  That is, they explain how they tried to tell whether the subjects were engaging in Bayesian updating, or were using motivated reasoning.


To determine whether a respondent was showing
Bayesian updating (the willingness to change one's mind in the face of
contradictory information from a trusted source) or motivated reasoning
(resisting contradictory information), we analyzed our data in two
different ways. First, we examined whether our respondents deflected
the information, and we categorized the strategies that they used to do
so. Second, to conduct a more stringent test of the motivated reasoning
hypothesis, we examined whether respondents attended to the
contradictory data at all.



What did they find?  About 2 percent of the subjects used Bayesian updating.  Fourteen percent denied that they believed in a link between Saddam and 9/11, despite having endorsed the link in the survey.  But the most common response was this:


The most popular strategy was to quickly switch the
topic to other good reasons that the war in Iraq was justified. This
strategy was used by nearly one in three respondents and was the single
largest category of responses to our questions about the perceived
link. This interviewee downplays the significance of the question about
the link without actually responding to the question: "There is no
doubt in my mind that if we did not deal with Saddam Hussein when we
did, it was just a matter of time when we would have to deal with him."
Another respondent brushes aside the issue of a link between Saddam and
9/11 by saying that the decision to invade Iraq was good for other
reasons: "We were under the pretense that he had nuclear weapons. That
to me is why we went; it wasn't related to him so much. I think it had
more to do with the weapons of mass destruction."



The authors looked at all the strategies that the subjects employed to
deal with the cognitive dissonance.  They describe what they
call Inferred Justification:


Finally, our interviews revealed an interesting and
creative reasoning style that we call inferred justification:
recursively inventing the causal links necessary to justify a favored
politician's action. Inferred justification operates as a backward
chain of reasoning that justifies the favored opinion by assuming the
causal evidence that would support it.



That is, the subjects try different strategies, one after another, until they find a line of reasoning that makes the unpleasantness of the cognitive dissonance go away.  It does not have to be logical; in fact, it plainly is not.  That is why the authors call it backward reasoning.



After a discussion of the limitations of the study, they come to their
conclusion:


The main theoretical implication of our research is
that "knowledge" as measured on surveys is partly a by-product of the
attempt to resolve the social psychological problem of cognitive
dissonance. The practical implication of this is that, although
scholars have shown a correlation between the perception of links
between Iraq and Al Qaeda and support for the war in Iraq, we cannot
conclude from this correlation that misinformation led to support for
the war. Rather, for at least some respondents, the sequence was the
other way around: support for the war led to a search for a
justification for it, which led to the misperception of ties between
Iraq and 9/11. This suggests a mechanism through which motivated
reasoning may be strongest when the stakes are highest.  It is
precisely because the stakes of going to war are so high that some of
our respondents were willing to believe that "there must be a reason."



In other words, on a more abstract level: the more powerful the
emotional state, the more likely it is that people will engage in
cognitive errors.  



Now, a question: if you happen to be liberal, do you find yourself
trying to find some way to use this research to strengthen your belief
that liberals are cognitively superior to conservatives?  If
so, why?  If you are conservative, do you find yourself
wanting to refute this study?



____________

title="ctx_ver=Z39.88-2004&rft_val_fmt=info%3Aofi%2Ffmt%3Akev%3Amtx%3Ajournal&rft.jtitle=Sociological+Inquiry&rft_id=info%3Adoi%2F10.1111%2Fj.1475-682X.2009.00280.x&rfr_id=info%3Asid%2Fresearchblogging.org&rft.atitle=%E2%80%9CThere+Must+Be+a+Reason%E2%80%9D%3A+Osama%2C+Saddam%2C+and+Inferred+Justification&rft.issn=00380245&rft.date=2009&rft.volume=79&rft.issue=2&rft.spage=142&rft.epage=162&rft.artnum=http%3A%2F%2Fblackwell-synergy.com%2Fdoi%2Fabs%2F10.1111%2Fj.1475-682X.2009.00280.x&rft.au=Prasad%2C+M.&rft.au=Perrin%2C+A.&rft.au=Bezila%2C+K.&rft.au=Hoffman%2C+S.&rft.au=Kindleberger%2C+K.&rft.au=Manturuk%2C+K.&rft.au=Powers%2C+A.&rfe_dat=bpr3.included=1;bpr3.tags=Psychology%2COther%2CNeuroscience">Formal
Citation:

Prasad, M., Perrin, A., Bezila, K., Hoffman, S., Kindleberger, K.,
Manturuk, K., & Powers, A. (2009). "There Must Be a Reason":
Osama, Saddam, and Inferred Justification style="font-style: italic;">Sociological Inquiry, 79
(2), 142-162 DOI: href="http://dx.doi.org/10.1111/j.1475-682X.2009.00280.x">10.1111/j.1475-682X.2009.00280.x



This is a very important topic, yet it seems to elicit little interest. I suggest that the ideas in this study are uncomfortable for most people to contemplate and discuss. We all prefer to believe that our own decision processes are based on the soundest logic and objective deliberation; so, of course, do those who agree with us. This study shows otherwise: it shows that true logical deliberation (Bayesian updating), including our own, is very rare. Confronting evidence of this is psychologically uncomfortable, if not painful.

In fact, the internet provides a very vivid testing ground for these ideas. One would only have to add the category "verbally abusive insult" as a response to being confronted with information that contradicts one's closely held emotional beliefs. That was not likely in the study where the interviewers were careful not to corner a subject too decisively and where the subjects were prone to defer to an authority they had agreed to cooperate with.

I suggest that the (lack of) responses to this article can be described as an example of the "selective exposure" described in the article - a response that is also very easy to provide in response to an internet article but more problematic for an interview.

Anyway, thanks for this article. I'd love to read about and discuss these ideas more.

By Ray in Seattle (not verified) on 27 Aug 2009 #permalink