Why are we so dishonest? Why do we do bad things, even when we know we're doing something bad? Ever since Adam and Eve ate that apple, we've assumed that there is something inherently tempting about sin. If left to our own devices, we'd all turn into men at a Vegas bachelor party, indulging in sex, drugs and slot machines. We'd loot and pillage and lie. Immorality feels good, which is why it's so hard being moral.
Some people, of course, are made of stronger stuff, which is why they stay on the righteous path. Because they're better than us, they don't eat too much cake or cheat on their taxes. (Eternal heaven is their reward for avoiding such sins.) There is good and there is bad, and being good is about resisting the allure of the bad. It's about not listening to the snake, telling us to eat the forbidden fruit.
If only morality were so easy! A new paper demonstrates, once again, that the human brain is the ultimate category buster, blurring the lines of good and bad, black and white, until everything is gray. The reason is that our behavior is deeply contextual, profoundly influenced by our surroundings and immediate situations. Whether or not we're able to resist sin, then, might depend more on the details of the sin - and whether or not it triggers our automatic urges - than on the strength of our moral fiber.
That, at least, is the tentative conclusion of a clever new fMRI study by Joshua Greene and Joe Paxton at Harvard University, who argue that sometimes we do the right thing because the wrong thing simply isn't tempting, even if it leaves us better off. Consider a hypothetical wallet, stuffed full of cash, which you find on the subway. Our moral intuitions (influenced by Genesis) tell us that everyone wants to take the money and run, that we're all attracted by the possibility of unearned cash. But this latest study suggests that, at least for the people who take the wallet to the police, there is no temptation to resist. They don't steal because they don't want to steal; telling the truth isn't hard work. They are living, in other words, in a state of moral grace, at least when it comes to the wallet. (Interestingly, Greene and Paxton found that people who behaved dishonestly in the experiment exhibited more activity in brain areas, such as the prefrontal cortex, associated with self-control. In other words, they might be trying harder to resist, but it's doing no good.) Here's Piercarlo Valdesolo, describing the study in Mind Matters:
Greene and Paxton were interested in why people behave honestly when confronted with the opportunity to anonymously cheat for personal gain. They considered two possible explanations. First, there is the "Will" hypothesis: in order to behave honestly people must actively resist the temptation to cheat. In other words, returning the wallet depends on your ability to stifle your desire to take the cash and buy yourself something nice. Alternatively, there is the "Grace" hypothesis: honest behavior results from the absence of temptation. Returning the wallet requires no particular ability to control your treacherous urges - the urge simply isn't there.
These two hypotheses make competing predictions regarding the brain regions activated when acting honestly as well as the time it should take participants to decide to act honestly. If "Will" is correct then people who choose to act honestly should exhibit heightened activity in brain regions responsible for cognitive control (presumably resulting from the struggle to ignore immediate desires). But if "Grace" is right then no such increase should occur. Furthermore, people should take a longer time to decide to act honestly if doing so requires a conscious act of "Will," but a relatively shorter time to act if all you need is a bit of "Grace."
In order to test these possibilities the researchers measured neural activity in an fMRI machine while participants played a computerized game wherein they could gain money by predicting the outcome of coin flips. Correctly guess heads or tails, you get some cash. In one condition, participants recorded their predictions before seeing any of the flips, precluding the opportunity to cheat. In the other condition, participants were rewarded based on self-reported accuracy after the flips, and therefore could fudge their predictions in accordance with the outcome of the flip. I got 100 percent correct, Mr. Experimenter, must be my lucky day!
Consistent with the "Grace" hypothesis, those who acted honestly (who guessed wrong and self-reported as much) showed no increased activity in control-related areas relative to others who guessed wrong but did not have the opportunity to cheat. Honest reporting of scores, then, didn't require will-power; these participants simply did not feel the urge to cheat. Reaction time data further supported "Grace," showing that participants who acted honestly took no longer to do so, on average, when they had the opportunity to cheat than when they did not.
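The two conditions can be sketched as a toy simulation. This is purely illustrative: the trial count, the misreporting rate, and the participant labels are my own assumptions, not the study's actual parameters. The point it makes concrete is the behavioral signature Greene and Paxton relied on - in the opportunity condition, a dishonest self-reporter ends up with an accuracy far above the ~50% that honest coin-flip guessing allows.

```python
import random

random.seed(0)

N_FLIPS = 100  # hypothetical trial count, not the study's actual number


def play(condition, dishonest=False):
    """Simulate one participant's reported accuracy on coin-flip predictions.

    condition:
      'no_opportunity' - the prediction is recorded before the flip,
                         so the reported score is just the true hit rate (~50%).
      'opportunity'    - accuracy is self-reported after seeing the flip,
                         so a dishonest participant can claim hits they didn't earn.
    """
    hits = 0
    for _ in range(N_FLIPS):
        guess = random.choice("HT")
        flip = random.choice("HT")
        if guess == flip:
            hits += 1
        elif condition == "opportunity" and dishonest and random.random() < 0.6:
            hits += 1  # misreport a miss as a hit (assumed 60% of the time)
    return hits / N_FLIPS


baseline = play("no_opportunity")            # no chance to cheat
honest = play("opportunity")                  # could cheat, doesn't
cheat = play("opportunity", dishonest=True)   # inflates the score

# True accuracy hovers around chance, so a self-reported rate far above 50%
# flags statistically improbable "luck" - the behavioral marker of cheating.
print(f"baseline: {baseline:.2f}  honest: {honest:.2f}  cheater: {cheat:.2f}")
```

Note that honest participants look identical in both conditions - which is exactly why the fMRI contrast between them is informative: any extra control-related activity in the opportunity condition can't be explained by the reported scores themselves.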
Why might such a state of temporary moral grace exist? The answer returns us to evolution, and to our history as social primates. One possibility is that we come pre-programmed for certain kinds of ethical behavior, as it might be more important to have an honest reputation within the group than to have a few extra dollars. And so we return the wallet, not because we've triumphed over our sinful urges but because, at least on this one subway ride, the urge did not exist. It will be interesting to conduct some follow-up studies, and see if it's possible to induce this state of grace in strangers. How can we make people think about their social reputation, and not the cash? We're so concerned about our credit history, but what about our virtue history?
Out of curiosity, did the experiment distinguish between people who actually cheated and those who had the opportunity to cheat? It seems as though that might make a difference.
If honesty has become your basic strategy in life, you experience little anxiety in choosing to follow that course, as resisting temptations will have become habitual. If dishonesty has become an acceptable element of your strategy, part of that strategy will require a heightened awareness of potential dangers before acting to carry it out. A controlled form of anxiety will be the habitual modus operandi of the brain.
Another way to put it is that honesty is less dangerous as a rule than dishonesty, and our emotional brains instinctively know that.
i could not have said it better. wonderful!
When people demand I explain where morality comes from, if not from religion, I usually turn to explanations like this. Humans are moral because we are social animals; our continued survival depends on the stability of that society, and the stability of that society depends on the ethical behavior of its members.
It's nice to see some (more) experimental evidence to that effect.
I think it's helpful to distinguish between morality as defined by society, religion, and other external sources and morality as understood intuitively and embraced by an individual. I act with high responsibility regarding moral decisions that deal with my own understanding of right and wrong. However, if societal/religious morality demands behavioral restrictions I would not otherwise adopt, I act according to the consequences my actions would precipitate. I think religion and society can produce morally confused individuals who feel obligated to follow moral standards they wouldn't have otherwise considered valid. It's much easier to resist temptations you genuinely believe are wrong and harmful.
Is there such a thing as "theory of cooperation"?
For example, if I were to pick up a wallet, my first thought would be: what if it were MY WALLET? What would I want the other person to do?
Thus, naturally, I'd return the wallet.
In my mind the issue is clear. Animals do not have an economic system, yet they eat every day from what nature provides - well, all animals "except man." To me this exception is where all human corruption starts. We can study until we're blind, but as long as this economic situation persists, the state of temptation will persist, because nobody likes suffering from starvation.
Isn't it just the same as the marshmallow test? Think about the marshmallow, and it becomes hard to resist eating it... think about something else, and there is no effort required to avoid the marshmallow. Same story with the wallet: think about stealing the wallet (at all, even if you are thinking 'i mustn't steal the wallet') and it becomes hard to resist stealing it, whereas think about something else (i might get a reward, or they might be happy to get it back if i hand it in) and it becomes effortless not to steal it. So all we need to do is give everyone the opportunity to cheat, see who does and who doesn't, and then ask 'what were you thinking about'... i bet all the cheats thought about cheating, and i bet most of them had been the children who thought that the way not to eat the marshmallow is to stare at it.
What we call "morality" is just a mixture of Kin-Selection and Reciprocal Altruism. We have evolved a sense of "right" and "wrong" because entirely amoral, depraved individuals happen to (historically) have fewer successful offspring.
Gines, to talk about "human corruption" is to assume that some idyllic non-corrupt state exists, which would be a false and superstitious assumption. What is, is, and no-one ought to pretend that non-human animals are any less "amoral" than humans. e.g. dolphins raping females.
> "Another way to put it is that honesty is less dangerous as a rule than dishonesty, and our emotional brains instinctively know that."
Depends on context. Our lives don't carry the risk of starvation, or even of deprivation. If they did, honesty (to strangers) could become _more_ dangerous.
It's like courtesy in traffic - we're nice when we're reasonably content, less so when aggravated (by previous traffic having hindered our progress).
One thing I wonder - how do systems engineers do, relative to the general public? Seems to me that if your daily goal, by education & vocation, is to make a system work smoothly, then helping to achieve this (by returning stuff to rightful owner) fits into this frame and thus would be inherently rewarding. Whereas someone whose job involves trying to break things...
This is exactly what happened to me. In 1980, I found a wallet on the side of the road while jogging. I opened it when I got home and found there was $100 (five $20 bills) tucked away.
I thought, "Man, if I lost $100, I would hope that it would be returned." Not being religious, I didn't think "I will go to hell if I keep the $100."
As an aside, the woman who came to pick up the wallet didn't look like she made $100/week. The feeling I got was worth substantially more than $100.
"Grace" appears synonymous to "habit." A man who has been honest frequently enough that it has become a habit will find stealing a wallet difficult. Group dynamics come into play when our habits do not match up to group norms. For example, get the honest guy elected to the US Congress or other high level political office and see what happens to his honesty over time.
i really enjoyed this topic because i am not a religious person but i have a very strong "moral" foundation (right vs. wrong) and i like that this article has challenged me to wonder why. on the topic of morality being the result of social interaction/reputation/reliance even? - how is it that my moral foundation remains firmly intact when every day i encounter individuals that oppose it, whether it is temporary from them merely having a bad day (eg. anna's response) or rather from their own perception of morality (or lack thereof)? so, why am i still honest when so many other people around me are not? how is this creating a successfully functioning society? why do i continue to return found wallets when so many other people do not - how does my honesty benefit the dishonest "group"? i hope this makes sense, it is hard for me to formulate into words. comments, t3knomanser? this is a great discussion to have.
"then on the strength of our moral fiber" = not a clever use of "then".
It occurred to me that this really isn't an effective test for morality as a whole. The issue is that the experiment tests only a person's strain to resist doing something wrong, rather than to actively do something right. I think that honesty is easier compared to lying just because it is passive in nature. What I would really like to see is a test for the strain involved in moral action, rather than immoral inaction; I think the results would look very different.
None of this is to say that this experiment wasn't a very effective test of resisting temptations. It tests that very well.