In a world where the temptation to lie, deceive and cheat is both strong and profitable, what compels some people to choose the straight and narrow path? According to a new brain-scanning study, honest moral decisions depend more on the absence of temptation in the first place than on people wilfully resisting these lures.
Joshua Greene and Joseph Paxton of Harvard University came to this conclusion by using a technique called functional magnetic resonance imaging (fMRI) to study the brain activity of people who were given a chance to lie. The volunteers were trying to predict the outcomes of coin-flips for money, and they could walk away with more cash by lying about their accuracy.
The task allowed Greene and Paxton to test two competing (and wonderfully named) explanations for honest behaviour. The first - the "Will" hypothesis - suggests that we behave morally by exerting control over the desire to cheat. The second - the "Grace" hypothesis - says that honesty is more a passive process than an active one, fuelled by an absence of temptation rather than the presence of willpower. It follows on from a growing body of psychological studies, which suggest that much of our behaviour is governed by unconscious, automatic processes.
Many studies (and several awful popular science articles) have tried to place brain-scanning technology in the role of fancy lie detectors but in almost all of these cases, people are told to lie rather than doing so spontaneously. Greene and Paxton were much more interested in what happens in a person's brain when they make the choice to lie.
They recruited 35 people and asked them to predict the result of computerised coin-flips while sitting in an fMRI scanner. They were paid in proportion to their accuracy. In some 'No-Opportunity' trials, they had to make their predictions beforehand, giving them no room for cheating. In other 'Opportunity' trials, they simply had to say whether they had guessed correctly after the fact, opening the door to dishonesty.
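To make the two trial types concrete, here is a toy simulation (a rough sketch under my own assumptions - the trial counts, payoffs and lying rate below are placeholders, not figures from the study). An honest reporter hovers around 50% accuracy, while even occasional lying inflates the reported figure:

```python
# Toy model of the experiment's two trial types. All numbers here are
# illustrative stand-ins, not parameters from Greene and Paxton's study.
import random

def run_trials(n: int, opportunity: bool, lie_rate: float = 0.0) -> float:
    """Return the fraction of trials reported as wins."""
    wins = 0
    for _ in range(n):
        correct = random.random() < 0.5   # a fair coin flip, guessed blindly
        if opportunity and not correct and random.random() < lie_rate:
            correct = True                # falsely claim a correct guess
        wins += correct
    return wins / n

random.seed(1)
print(run_trials(1000, opportunity=False))               # ~0.5: forced honesty
print(run_trials(1000, opportunity=True, lie_rate=0.4))  # ~0.7: inflated accuracy
```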
To cover up the somewhat transparent nature of the experiment, Greene and Paxton fibbed themselves. They told the recruits that they were taking part in a study of psychic ability, where the idea was that people were more clairvoyant if their predictions were private and motivated by money. Under this ruse, the very nature of the "study" meant that people had the opportunity to lie, but were expected not to.
Based on their average results, the duo classified 14 of their would-be psychics as "dishonest", for they achieved improbable levels of accuracy of 69% or more. Everyone else was placed in either an honest or an ambiguous group. Of course, these labels referred to their overall behaviour during the task rather than personality traits - the 'dishonest' people didn't always behave that way and, in fact, that was critical to the first part of the experiment.
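How improbable is 69%? For someone genuinely guessing fair coin flips, accuracy follows a binomial distribution, and the chance of landing at or above that threshold is vanishingly small. A minimal sketch (the trial count is my assumption; the paper's exact number isn't given above):

```python
# Back-of-the-envelope check on the 69% cut-off. The trial count n = 100
# is an assumed placeholder, not the study's actual figure.
from math import comb

def prob_at_least(k: int, n: int, p: float = 0.5) -> float:
    """Chance of k or more correct guesses out of n by luck alone."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n, k = 100, 69   # 69% accuracy over an assumed 100 Opportunity trials
print(prob_at_least(k, n))   # on the order of 1e-4: almost 4 standard deviations above chance
```

The exact probability shifts with the real trial count, but the point stands: sustained accuracy that high is wildly unlikely without cheating.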
Greene and Paxton found that the honest people had the same reaction times whether they won or lost money, and whether they had the opportunity to lie or not. These results support the Grace hypothesis, for they suggest that honest people aren't making any extra mental effort when they forgo the opportunity to cheat.
The brain-scanning data matched the pattern suggested by the reaction times. When interpreting the scans, Greene and Paxton focused on areas at the front of the brain that are associated with mental control, such as the anterior cingulate cortex (ACC), dorsolateral prefrontal cortex (DLPFC) and ventrolateral prefrontal cortex (VLPFC). These areas are active when, for example, we delay instant gratification to follow through on a plan, and Greene and Paxton refer to them as "the control network".
In the control networks of honest volunteers, there were no significant differences in activity when the do-gooders lost money because they had to (in No-Opportunity trials) and when they lost money because they gave an honest answer (in the Opportunity trials). Even when Greene and Paxton fine-tuned their analysis of the brain scans to the highest possible resolution, they couldn't detect any differences.
Both the reaction-time tests and the brain scans show that for people who mostly behave honestly, these experiments strongly favour the Grace hypothesis - these people don't seem to require any extra effort to resist the opportunity to lie. This isn't due to ignorance, for every one of the 14 honest people said afterwards that they knew they could cheat if they wanted to. They're not oblivious to the option - they just don't need to work to ignore it.
The dishonest group showed different patterns. Their DLPFC was more active when they won money in the Opportunity trials (which they often did by lying) than in the No-Opportunity ones (where they always had to win honestly). This suggests that the choice to be dishonest is associated with brain activity in the DLPFC.
But as a whole, their control network was more active in the few Opportunity trials where they lost money due to an honest response than in No-Opportunity trials when they had no choice in losing money. This matches the results from the reaction time measurements, where they took much longer to respond in Opportunity trials where they could cheat but didn't, than in No-Opportunity trials where their losses were forced. Both sets of results suggest that people who are usually willing to lie exert some extra mental control when they refrain from doing so.
Surely this is compatible with the Will hypothesis? Greene and Paxton argue otherwise - they describe these situations as "limited honesty" and they say that "the Grace hypothesis applies only to honest decisions in individuals who consistently behaved honestly and not to decisions reflecting limited honesty."
I'm not sure I really buy that distinction, but the duo offer up an interesting explanation for the fact that control network activity is associated with both limited honesty and the decision to lie. They suggest that this pattern reflects attempts by the dishonest players to resist temptation. The fact that they fail more often than they succeed explains why the network is active in trials where they end up lying as well as those few where they successfully resist.
To prove their point, Greene and Paxton created a mathematical model that predicted how many wins people would get in the Opportunity trials (a rough indicator of how often they lied) based on activity levels in 9 different control network areas. The predictions from the model matched the real figures with almost 80% accuracy.
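The paper's actual model isn't spelled out above, so the sketch below only illustrates the general approach - predicting win counts from activity in nine regions - using ordinary least-squares regression on made-up numbers (every value here is a stand-in):

```python
# Illustrative version of "predict Opportunity-trial wins from control-
# network activity". Linear regression and all the data are my assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_regions = 14, 9                  # 14 dishonest players, 9 regions
X = rng.normal(size=(n_subjects, n_regions))   # per-region activity (stand-in)
wins = rng.integers(40, 90, size=n_subjects)   # observed win counts (stand-in)

# Fit ordinary least squares: wins ~ X @ beta + intercept
A = np.column_stack([X, np.ones(n_subjects)])
beta, *_ = np.linalg.lstsq(A, wins, rcond=None)
predicted = A @ beta

# Compare predictions with the real figures, e.g. via R^2
ss_res = np.sum((wins - predicted) ** 2)
ss_tot = np.sum((wins - wins.mean()) ** 2)
print("R^2:", 1 - ss_res / ss_tot)
```

With only 14 subjects and 10 free parameters, a fit like this would be fragile in practice; presumably the real analysis guarded against overfitting in ways the post doesn't cover.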
So, people who veer towards dishonesty try to resist it but fail more often than they succeed. People who are mostly honest don't really need to try. This result is fairly counter-intuitive, for we tend to believe that honesty is an act of will overcoming temptation. In a survey done before any of the experiments, Greene and Paxton found that given a choice, ordinary people believe that the Will hypothesis is the right one.
Obviously, the study has its limitations. Greene and Paxton couldn't work out how many of their dishonest volunteers were aware of their deceit, whether the honest lot wilfully pushed aside temptation well before the brain-scanning commenced, what the motivations of either group were, or whether their degree of honesty in the experiment carries over into their normal lives. Nonetheless, it's an intriguing start, and as the duo concludes: "The present findings do suggest, however, that some individuals can, at least temporarily, achieve a state of moral grace."
PS On a final note, one of the stranger results amid the data is the fact that the VLPFC of honest players was more active during Opportunity trials where they had to accept money by admitting to choices they had made fairly, than in No-Opportunity trials where their win was a given. Why would people need to exert more mental control to accept a just outcome? Perhaps this activity reflects their "pride or self-doubt upon accepting legitimately won rewards"?
Reference: Greene, J.D. & Paxton, J.M. (2009). Patterns of neural activity associated with honest and dishonest moral decisions. PNAS. DOI: 10.1073/pnas.0900152106
Nice post! I saw Greene give a talk about this work a few months ago, and you've done a good job of summarizing and analyzing the results.
"PS On a final note, one of the stranger results amid the data is the fact that VLPFC of honest players was more active during Opportunity trials where they had to accept money by admitting to choices they had made fairly, than in No-Opportunity trials where their win was a given. Why would people need to exert more mental control to accept a just outcome? Perhaps this activity reflects their "pride or self-doubt upon accepting legitimately won rewards"? "
Honest people tend to be insecure and have low self-esteem; the truthful people in this test didn't lie because their fear of getting caught, and the social consequences thereof, was greater than their desire for the money involved. Since their fear made the decision for them before the game even started, of course they'd think less about whether or not to lie.
But when the payout comes around, the situation reverses. Where a more confident person would just take the money, someone with less social confidence would have to consider whether or not they could safely claim the money, whether they'd be believed if they did claim it, whether they should give the money up anyway for the sake of getting along, and so on, before taking it. So brain activity in that region is higher, because, again, they're making a decision whether or not to tell the truth. Makes sense to me...
So it isn't that no temptation was offered - i.e., there was a possible pay-off for everyone. Honest people didn't have to work at resisting the temptation, whereas dishonest people showed some brain activity. However, it is still unclear whether that activity had to do with resistance. Maybe they were calculating the chances of getting caught, or imagining their satisfaction at not getting caught, or figuring out how much to cheat, or a lot of other possibilities.
Maybe they weren't lying, maybe they really were psychic!
Have to agree with Lilian, and overall say that while the results are cute, I don't put much stock in the interpretation. It's quite feasible that the "honest" group simply made one decision at the beginning to be honest, whether they felt temptation or no. And that the "dishonest" group wasn't thinking about temptation at that point, but rather thinking something more akin to "did I do this too many times in a row? Am I too obvious?", or somesuch.
I've often thought there is a lot to this concept. It's easy to resist something if it doesn't interest you that strongly, and it's certainly no reason to feel 'superior' or more moral. Everyone's mix of temptations is different.
Even for myself, there are certain things that serve as strong temptations for some people which I have no trouble at all resisting (to be honest, because they provide less psychological 'reward'), and to be equally honest there are some things that tempt me severely... passing them up is a real struggle.
Which category defines a person's moral stand?
Perhaps, given the last paragraph, people who weren't expecting to win (i.e. low self-esteem) didn't even try to cheat. They didn't need to cheat in order to fulfill their belief that they wouldn't do well.
Why are [Mad the Swine, Richard Eis] so sure about honest people having low self-esteem, as if that must be the only possible explanation? That isn't tested by Greene and Paxton! They do question themselves, but - according to Ed's PS - think of PERHAPS pride or self-doubt [seemingly among other possibilities].
I can only speak for myself about why, being honest, I would react mentally when having to accept money. If I won, I wouldn't be sure whether something unexpected [not agreed in advance] would be claimed from me afterwards. Not low self-esteem, but some mistrust of other people, from having experienced such behaviour often enough. So, not self-doubt, but realistic doubt about other people. Accepting makes me more vulnerable to them - in contrast to when I am the one who pays [and they depend on my obligingness and fairness].
Toos: An interesting point about accepting money.
There's also one very interesting group of people who didn't get mentioned in your article -- people on the autistic spectrum have difficulty even learning how to lie! They're significantly delayed in their ability to manage even social lies, with more severe cases being more-or-less unable to do so even as adults. They have similar difficulty in learning to recognize, not only outright falsehoods (including "white lies"), but sarcasm and "prank lies".
And even as adults, folks fairly low on the autistic spectrum tend to have a deep dislike of, and reluctance to engage in, dishonesty. (The higher levels are often still unable to lie, and have great difficulty recognizing lies from others). I have NLD myself (roughly, "half an Aspie"), and willful falsehoods feel to me like a rip in the fabric of the world -- something not just "wrong", but fundamentally threatening. However, I'm pretty good at evasiveness, or just holding my silence. With some effort, I can manage misdirection.
David, I'm surprised by your comment: my husband has Asperger's. And indeed he is absolutely not manipulative! Eureka - that's why I appreciate that so much.
Joshua Greene, who studies morality, will not share the data from this publication with the scientific community despite it being funded by public sources. I have requested this data numerous times and he has refused to provide it. How can we trust a publication if the author refuses to share his work?
With the scientific community, you say? Well since your comment appears to criticise Greene for lack of transparency, it behooves me to note that, according to your email address, you are Steve Laken, CEO of Cephos Corp. Your website says:
You also appear to be using Greene's paper as evidence of the validity of your services, on the top of your list of "Scientific Industry Articles Regarding fMRI Lie Detection".
And you can't imagine why he might not want to release his data to you?
For interested readers, here are some links on the use of fMRI in lie detection from New Scientist, Jonah Lehrer and ScienceLine
HAHAHAHA! Steve, what a shameless fuckwit you are!!!!
Reminds me of the Lenski Affair, at least in the level of fuckwititude.
Dude, "Steve." Just buy the paper for $30 and stop complaining like a douchecock.