Confirmation Bias and Political Groupthink

Michael Shermer writes of a fascinating experiment on how the brain processes statements and claims whose truth one is powerfully attached to coming out a certain way. It may well illuminate the sort of irrational thinking driven by political partisanship. I'll post his description of the experiment below the fold:

This surety is called the confirmation bias, whereby we seek and find confirmatory evidence in support of already existing beliefs and ignore or reinterpret disconfirmatory evidence. Now a functional magnetic resonance imaging (fMRI) study shows where in the brain the confirmation bias arises and how it is unconscious and driven by emotions. Psychologist Drew Westen led the study, conducted at Emory University, and the team presented the results at the 2006 annual conference of the Society for Personality and Social Psychology.

During the run-up to the 2004 presidential election, while undergoing an fMRI brain scan, 30 men--half self-described as "strong" Republicans and half as "strong" Democrats--were tasked with assessing statements by both George W. Bush and John Kerry in which the candidates clearly contradicted themselves. Not surprisingly, in their assessments Republican subjects were as critical of Kerry as Democratic subjects were of Bush, yet both let their own candidate off the hook.

The neuroimaging results, however, revealed that the part of the brain most associated with reasoning--the dorsolateral prefrontal cortex--was quiescent. Most active were the orbital frontal cortex, which is involved in the processing of emotions; the anterior cingulate, which is associated with conflict resolution; the posterior cingulate, which is concerned with making judgments about moral accountability; and--once subjects had arrived at a conclusion that made them emotionally comfortable--the ventral striatum, which is related to reward and pleasure.

"We did not see any increased activation of the parts of the brain normally engaged during reasoning," Westen is quoted as saying in an Emory University press release. "What we saw instead was a network of emotion circuits lighting up, including circuits hypothesized to be involved in regulating emotion, and circuits known to be involved in resolving conflicts." Interestingly, neural circuits engaged in rewarding selective behaviors were activated. "Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones," Westen said.

Will Wilkinson, writing at the Cato Institute blog, highlights exactly how this sort of reasoning manifests itself in political argument. I think he really nails it:

Reinforcing and encouraging this specific kind of unreason is one way political coalitions assure their integrity and survival. The day-in, day-out work of partisan political magazines is to explain to their loyal readers why there is basically no reason to take the other side's so-called arguments seriously. All you need to know about the minimum wage, say, is that there is someone good at math at Princeton who thinks it's good, and that everyone who dislikes it secretly wants to send the poor to forced labor camps. Or all you need to know about people who oppose the war is that they are flag-burning America-haters whose pusillanimous "post-modern" sense of moral equivalence leads them to secretly crave the reign of jihadist overlords. Etc.

One could easily keep listing similar examples. How often do we hear similar arguments? Hell, how often do we make similar arguments? After all, you don't really need to consider the arguments or the evidence for evolution. All you really need to know is that Darwinists are intent on destroying belief in God. Conversely, you don't have to deal with the actual arguments for ID, all you really need to know is that ID advocates are fundamentalists intent on establishing a theocracy in America (an argument I personally do not make and do not believe; there no doubt are theocrats among the supporters of creationism, and they surely deserve to be pointed out and criticized, but I do not for a moment believe that Phillip Johnson or Michael Behe are attempting to rebuild Calvin's Geneva or the Massachusetts Bay Colony here in America).

The pull of groupthink can be very powerful. Otherwise brilliant people fall easily into simple dichotomy-based thinking, where they define themselves by their enemies: whatever They believe must be wrong, and anything that undermines Them must be good. Conversely, if an idea scores points for the good guys, even if the support for it is a bit shaky, then run with it. I wrote about this phenomenon, without actually calling it this, a few weeks ago in a post about Stephen Carter and his take on the ACLU. I wrote:

It's a natural human tendency to evaluate arguments against a position we oppose in a more lenient fashion than we do arguments against our own position. If it sounds remotely plausible and it involves someone we deem our opposition, we'll tend to buy into it without demanding the same level of evidence we would demand if it was about us instead.

But rational people, people who care about truth and accuracy, must fight this tendency. We must try to evaluate every claim using the same criteria. Does the evidence support it? Are the conclusions drawn from the evidence logical? Any claim that fails to meet those criteria should be rejected, regardless of whether it supports our agenda or not. Likewise, any claim that withstands that scrutiny should be accepted as valid, regardless of whether it supports our agenda or not. None of us will ever be Mr. Spock, but we should strive to evaluate all arguments as though we have no stake in the outcome. Some, like the STACLU crowd, make no attempt at all to do so; we should not emulate them.

This habit of mind leads not only to sloppy and irrational thinking, it leads to the assumption of false dichotomies. We begin to imagine that there are only two ways of thinking - our way and Their way. We've seen this behavior from political partisans so often that we barely notice it. We've certainly seen it involving me and my perspective in just the last couple weeks. In one day, I was accused by one person of being an apologist for Republican crypto-fascists intent on destroying the earth, and by another of being an apparatchik for the Democratic party who changed what he wrote based on whether it would help that party's appeal to a group of voters I don't even think exists, on behalf of a party I've never voted for. Those are both examples of exactly the sort of simplistic groupthink that must be avoided. Both people making those accusations are incapable of conceiving of having an opinion on an issue outside of this tight little duality, and if you do not accept their way of thinking 100%, then you are obviously an agent of the enemy.

Gary Hurd's dishonest attack on me is actually a textbook example of this kind of (non-)thinking (he being the one who accused me, and several others, of being stooges for the Republican world destroyers despite our longstanding advocacy against the religious right's science agenda). In a follow-up comment on that absurd PT thread, he writes:

Bryton (sic) is an active supporter of the Cato Institute. Nuf' said.

In fact, this is what caused the initial rift between Gary and me. It began when he posted a message slamming Cato that included false claims. For instance, he claimed that Cato advocated HIV denial, that they denied that germs cause disease, and that they're funded by big business. The mere fact that Sandefur and I denied those false claims was enough to paint us as whacko extremists (for the record, I support some things the Cato Institute says and don't support others - but none of those claims made by Gary were true). Because for Gary, everything is defined by that simplistic dichotomy - you either agree with him 100%, even if he's wrong, or you're a traitor to the cause, an agent for the other side, a closet supporter of the right wing mission to destroy the Earth in the name of Jesus - never mind that we are both heathens, that we've both spent many years advocating for science and battling the right's agenda on this, and so forth. None of that matters. Why? Because, as he goes on to write, everything is defined by our enemies:

And like Ed, Answers in Genesis isn't fond of me either. I don't mind being known by my enemies, AiG creationists and Ed Brayton are a great start.

Naturally, the fact that AIG isn't any more fond of me than of him never enters his mind. Or the fact that I write voluminously against the agenda of the religious right, both in terms of science and other issues, making me their enemy. No, none of that matters because I refused to accept his false claims about a group on his list of enemies. And if you aren't committed to the destruction of his enemies, then you must be the enemy.

The irony, of course, is that both sides in such dichotomies act exactly like one another while simultaneously slamming the other side for the very behavior they're exhibiting. When George Bush says "you're either with us or against us", Gary ridicules him. Yet he engages in identical behavior himself. This is exactly the sort of thing we all need to avoid, and though few of us take it to the extremes that someone like him does, we're all susceptible to the allure of simplistic thinking based on pre-defined group dynamics.

Will Wilkinson agrees, of course, but I think he adds something more to the discussion as well. He writes that cultivating this habit of mind is not only important because it makes us more likely to find the truth, but also because it frees us from internal constraints on our own liberty:

But confirmation bias matters not only because rationality matters, but because autonomy matters. Last week I attended an Institute for Humane Studies seminar at Stanford, and philosopher David Schmidtz gave a talk about psychological freedom--freedom from internal constraints. He remarked that there is something pretty depressing about the fact that what we believe is largely a function of the order in which we encountered new ideas--that our commitments are highly path dependent. But the fact that we can know that holds out hope for a kind of liberation.

So, this Independence Day, why not pick up a political book you know you'll disagree with. Or write a short essay giving the best argument you can think of for a position you find abhorrent. Or really listen to what your annoying brother-in-law thinks about the war at the family picnic. We could all be a little more rational, and a little more free, if only we really wanted to be. Dogmatic, whole-hearted commitment does feel good. But there is more to life than feeling good. There is truth, for one thing. And there is freedom--self-command. We're all jerked around by our own minds. But we can be jerked around less.

This doesn't mean that we shouldn't ever speak in terms of opposition to groups, of course. It also doesn't mean we shouldn't be pointed in our criticism when the situation warrants it. What it does mean is that we should not let our self-identified alliances trump our critical thinking. And yes, this is not only more rational, it's more liberating as well. The reality of internal constraints on our thinking can't be denied, but it can be fought.



It's nice to see that Will has been thinking about confirmation bias and how to avoid it, because I still remember when he was very much an example of someone who saw what he wanted to see instead of what was there. Welcome to the reality-based community, Will.

This is precisely why I stopped my involvement with a political party. After volunteering on a couple of campaigns, I was disgusted with the "us v. them" mentality. There was no room for critical analysis or discussion. So, it is nice to see that there is a scientific name for his behavior but quite sad to understand how politics work.

By David C. Brayton (not verified) on 06 Jul 2006 #permalink

Some years ago on Salon's TableTalk, a wise woman posited that politics had been recast as a professional sport. No longer do most people seem to care about what's really good for the country in the long term, just whether or not their team wins. I think it's a pretty decent analogy.

Just look at the dittoheads this past week -- "Rush was unfairly singled out by liberal Customs agents!!" People, Rush is on probation for abusing drugs; naturally they're going to look in his shaving kit to see if he's living up to the terms of his probation. Surely you wouldn't object if Patrick Kennedy got similar attention, would you?


And notably, the media covers it all like a sport. The moment the president gives a speech, what do you hear on the news shows? All the talk is about whether the speech hurt him or helped him. They've got instant polls going on, focus groups and so forth. Virtually no discussion about whether the content of the speech made any sense or not. It's all a big game and the only thing that matters is who is winning.


The point of this post, and Will's, is that we are all prone to this sort of thing (some more than others, of course). I'm sure if someone dug back into my archives, they wouldn't have to dig far to find examples of me doing the same thing.

Ed, I disagree with your comment about Phillips and Behe "I do not for a moment believe that Phillip Johnson or Michael Behe are attempting to rebuild Calvin's Geneva or the Massachusetts Bay Colony here in America" and by implication their colleagues at the DI. No, I don't think they're conscious supporters of a theocracy or, worse yet, Christian Reconstructionism, but their advocacy that holdings based on faith and not reality be taught in public schools and that government promote and support a particular viewpoint that happens to have roots in a particular brand of religious belief is to beg the necessity of governmental thought control--ergo, some kind of authoritarian state. It would be a theocracy in everything but name, and a decidedly unpleasant society in which to live. When "truths" are revealed by the chosen few, or revealed in "sacred" texts, one has a theocratic autocracy.


I do not use the term "theocracy" nearly that broadly. Mere government advocacy of a religious position, though it should obviously be fought (and I do fight it), is not theocracy. Theocracy requires coercion and enforcement of religious orthodoxy, in my view.

I think this bias shows up in a different way. Ask a typical sports fan who will win a game or series or championship: a disproportionate number will name their own team. At the game, ask a fellow fan what they believe was the correct call on a disputed play: nearly all will make the call in favor of their own team.

People don't seem to be very good at telling the difference between what they want to happen and what they expect to happen (or thought did happen).

I read Shermer's column a month ago in Scientific American and thought he was right on. He describes the failing of all too much political dialogue, a failing to which all of us, including myself, succumb at different times. But I think it has risen to outrageous levels in recent years. During the '50s and '60s (I'm in my late 60s so I remember) Democrats and Republicans could oppose each other but work together in Congress to accomplish good ends with partisan rancor and ad hominem attacks rare. But since the mid '80s, the last half dozen years especially, the rancor has risen to unprecedented levels. When a proposal or political idea is floated now, the first question is "Who's making it?" If it's from a person of the opposition party, it's immediately tarred as unworkable or worse. A case in point is the Democratic attempt to put in place some kind of withdrawal plan and timetable from Iraq. In lock step the Republicans, including Bush, rejected it as "cut and run" and "cowardly." Yet, essentially the same idea was laid out for Bush by General Casey more or less concurrently. The Republicans thought that was just fine. Such hypocrisy. With governance like that, we might be better off with anarchy.

Ed, I know you didn't use "theocracy" that broadly, but my point is that to push theological beliefs, founded on nothing more than the authority of a particular religious viewpoint, as education or public policy, wholly at odds with reality, will require "...coercion and enforcement of religious orthodoxy..." as you so phrased it. In my view that is theocracy, the very thing the Constitution was written (and as amended) to forestall.

Excellent essay, Mr. Brayton. The irrational in politics has been around forever, and has had both its greatest successes and failures in the last century.

Ideology can be a good thing, especially when rooted to principles that have stood the test of time. But it also shackles the mind to prior assumptions.

Despite the democratization of education, people today are as susceptible to the big lie as always, perhaps more so. How else can you explain how a majority of Americans still believe Saddam was behind 9/11?

When I was a debater in high school, the rule used to be that if you wanted to win, you needed to know both sides of the question. For a time, that was the general rule in US politics as well, say from WWII until the Bork hearings. Bork was about the time essays started appearing on "the coarsening of American politics" or similar articles. These keep appearing at an increasing rate, because it is getting worse.

This is a symptom that leads to authoritarianism. We're not so far gone yet, but the signs are not good. Democracy is getting squeezed by both ends, and it is up to those of us who believe in it to rescue its prospects.

The problem with overcoming this sort of reasoning is that it's just the way we think. Anytime we're emotionally invested in a conclusion (be it about our spouse, our party's candidate, our favorite sports team, or our pet scientific theory), motivated cognition (what is here being called "confirmation bias") will be at work. It's really our default way of thinking about the world.

The fortunate thing is that evidence can overcome bias, as the research on motivated cognition has repeatedly shown. You just have to make it impossible for motivation to overcome evidence and argument. And in political discourse, that's not very easy to do.

You are right to examine groupthink; it is a useful thing to pay attention to, and a general understanding of confirmation bias is also important. But Westen's study? Well . . .

Until Westen's paper is published, and the methods clarified, no conclusions can or should be drawn from this study. First and foremost, we do not know if it was blinded; we do, however, know that it was not randomised (for example they did not sample a selection of the population picked at random). There are significant problems with what we do know about the paper from the Scientific American article and the press reports.

First: Westen divides up fifteen Bush supporters and fifteen Kerry supporters. But are Westen et al actually comparing apples and apples? They have taken "strong" believers and compared them to . . . well, other "strong" believers. They have not compared them to people who are identified as "unsure", or "uncommitted", and so the only conclusion that can be drawn is that people who are "strong" believers appear to have this similar pattern of increased/decreased activity on fMRI. And even that tiny conclusion is uncertain: without a comparison group or control, we do not know if perhaps everybody who would have undergone Westen's test in this way would have a similar result, regardless of how strongly they felt about Bush or Kerry.

Because they have chosen "strong Bush" and "strong Kerry" supporters, you might be fooled into thinking that the two groups are different, that Westen et al are studying two different populations (Republicans and Democrats); you might then conclude that there is no difference between these two populations. But again, the actual studied group may essentially be the same population: people who self-identify as "strong" supporters of a particular campaign. All we can tell from this study, at this point, is that this particular Bush- or Kerry-affiliation does not lead to differentiation between the patterns of increased/decreased activity on fMRI on people who self-identify as "strong" supporters. Read that again; it does not mean we can tell anything about Bush supporters, or Kerry supporters, or Republicans or Democrats.

Furthermore, we still have to wait until we can see the data to draw even the tiniest conclusion: are there trends to significance? Are there differences that fail to reach statistical significance? Or, more pertinently, are there differences that this study design or this technology cannot discern? After all, the "n" of this study was a mere 30 (i.e., n = 30 = number of study subjects). Was the study adequately powered? Could this have been a type-II error? We know none of this. And yet people want to draw conclusions from this study? Use this study to provide "evidence" for confirmation bias?
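[To make the power worry above concrete, here is a rough back-of-the-envelope calculation for a study with 15 subjects per group. This is purely illustrative: it uses a normal approximation for a two-sided two-sample comparison, and the effect sizes (Cohen's d) are hypothetical, since Westen's actual effect sizes are unknown.]

```python
# Rough normal-approximation power sketch for a two-sample, two-sided test.
# Hypothetical effect sizes; n = 15 per group matches the study's reported split.
from math import sqrt
from statistics import NormalDist

def two_sample_power(d, n_per_group, alpha=0.05):
    """Approximate power to detect standardized effect size d (Cohen's d)."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)  # critical value for two-sided alpha
    return z.cdf(d * sqrt(n_per_group / 2) - z_crit)

for d in (0.2, 0.5, 0.8):  # conventional small, medium, large effects
    print(f"d = {d}: power ~ {two_sample_power(d, 15):.2f}")
```

Under these assumptions, even a conventionally "large" effect (d = 0.8) yields power of only about 0.6 with 15 per group, so a true group difference could easily be missed - which is exactly the type-II worry raised above.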

Second: Who are these men? All we currently know about them is their "strong" political belief. What demographics do Westen et al provide? Were all the men white? Were they all black? Hispanic? Were they urbanites or rural farmers? Do Westen et al provide IQs? IQs are problematic, but they give you an estimate of some types of intelligence. Remember, this may not be a study of 2 different populations (Bush supporters and Kerry supporters), but a study of one population, people who self-identify as having a "strong" belief. But is there any way of generalising from this sample of 30 to the millions, perhaps billions, of people who have "strong" beliefs? I'm not saying that race or socio-economic class would necessarily make a difference; and if it did make a difference, I've no idea what it would be. Knowing nothing about these men, how can we generalise, let alone to juries and CEOs as Michael Shermer does? This is a tiny study about 30 guys, and all we know about them is that they consider themselves "strong" supporters and they signed up to go into a big expensive machine and had their brains scanned.

Third: What does "strong" mean? This is one of the few things Westen appears to have controlled for. And yet, did Westen et al take any measures of the "strength" of the support (amount of money given to candidates, time spent campaigning, etc.)? Out of curiosity, did they all even vote? (It might be asking a bit much of Westen et al to find that out, but it would be an interesting measure of "strength" if we knew that 100% or, say, 47% of them ended up actually voting).

Fourth: One must be very careful about drawing conclusions about "function", even from functional magnetic resonance imaging. fMRI, a fabulous technology, basically measures blood flow; its images represent increased metabolic activity. This does not mean that you are looking at brain function; it means that you are looking at increased blood flow, which likely correlates with brain activity, which likely relates to brain function. There is excellent evidence that areas of the brain correlate with specific activities, processes, emotions, even characteristics. But it remains speculative, especially when coming to such complicated propositions as emotional experience. So, to say that one area lighting up means something concrete about the way the men are thinking is a very speculative jump. Consider this phrase from Scientific American, in which the author is listing the areas that are "lighting up": "the posterior cingulate, which is concerned with making judgments about moral accountability". So, you think that if somebody gets a lesion in their posterior cingulate, they will not be able to make judgements about moral accountability? It is certainly possible that the posterior cingulate is involved in the neural processes that lead to judgement, but it is surely hasty and rash to say that when it lights up, the men are making moral judgements. I'm not saying it is impossible to speculate on function from an fMRI: it is a useful tool. But we cannot read thoughts from an fMRI.

Fifth: What is "bias"? 30 men, who have spent hours poring over the literature, reading books, going over the Federalist papers, reviewing speeches and records, may in fact become "strong" Bush and Kerry supporters, and, when confronted with a few scattered quotes, may not need to draw upon significant cognitive resources to dismiss or support those quotes. If they are "strong" supporters, they might even be familiar with the quotes and already have carefully considered a variety of reasons why the quotes were fashioned the way they were, and so, at this point, will only respond emotionally to them. Yet Westen says that they are not using their "reasoning"! And the MSNBC online report comes to the conclusion: "The study points to a total lack of reason in political decision-making." Rubbish!

Why can we not challenge this study on another level? Is it possible that, say, the Bush supporters were the ones who were actually biased, whereas the Kerry supporters had formulated a valid understanding of why there were Kerry contradictions? Or vice versa? And is it possible that an fMRI cannot tell the difference between an emotional response based on accurate assessments and one based on "bias"? Think about this. This point is crucial. It does not mean that everybody is equally wrong.

Sixth: Apparently these men were "assessing" statements by Bush and Kerry, and were doing so "critically", providing justifications for Bush or Kerry depending on their view - and yet they did not use their frontal lobes? So how does one assess critically without using the part of your brain that "normally engag[es] in reasoning"? Are they reasoning ("assessing . . . critically") or not?

In short, this study appears at best flawed, and at worst may be a load of rubbish.

But why is this all politically bogus? Because the study gets generalised: at this point, I hope from the criticisms above you will see that we need a ton more information before we could possibly assess its generalisability. But clearly, people are already generalising. That's what the Scientific American article does and that's what MSNBC does: "Democrats and Republicans alike are adept at making decisions without letting the facts get in the way, a new study shows." Um, no. Bogus generalisation.

People who think that there are no differences between Republicans and Democrats can point to this for a veneer of empiricism. Westen moronically puts it better: "Essentially, it appears as if partisans twirl the cognitive kaleidoscope until they get the conclusions they want, and then they get massively reinforced for it, with the elimination of negative emotional states and activation of positive ones." Um, no. The study doesn't actually show that; even if it filled in all the information I've said is lacking, and answered all the problems I've posed above, the study doesn't show that.

And the consequences? One respondent above writes, "This is precisely why I stopped my involvement with a political party. After volunteering on a couple of campaigns, I was disgusted with the "us v. them" mentality. There was no room for critical analysis or discussion. So, it is nice to see that there is a scientific name for his behavior but quite sad to understand how politics work." So far, this study does not support "a scientific name for his [sic] behavior". And it's too bad the person found an excuse for not remaining involved, and has been able to label it.
In your blog, you assail groupthink and how we "fall easily into simple dichotomy-based thinking".

Westen's study appears to be neither a critique of the problem nor an explanation of it, but a symptom of the problem.

Thanks WiSH for pretty much - plus some - making the points on which I was pondering.

By wildlifer (not verified) on 06 Jul 2006 #permalink

This is a fascinating subject. I will read and consider this.

By Ginger Bush (not verified) on 08 Jul 2006 #permalink