It’s hard to believe it’s been four years since the war began. If you missed Bob Woodruff’s important documentary on the epidemic of brain injuries caused by war, I highly suggest watching it. According to Woodruff, up to 10 percent of all veterans of the Iraq war suffer some sort of brain injury, often caused by explosive shock waves. Most of these injuries will go untreated.
And then there’s the psychological toll. Numerous reports indicate that the Army still doesn’t get PTSD, and isn’t providing our veterans with a suitable level of care. Because mental illness is invisible, it often goes untreated. Sometimes, soldiers with PTSD are even hazed.
But even soldiers who manage to return from Iraq without any bodily injuries or mental illnesses still have to deal with the emotional trauma and stress of war. They are often haunted by memories of what they’ve done:
By the time he came home, Jeffrey Lucey was a mess. He had gruesome stories to tell. They could not all be verified, but there was no doubt that this once-healthy young man had been shattered by his experiences.
He had nightmares. He drank furiously. He withdrew from his friends. He wrecked his parents’ car. He began to hallucinate.
In a moment of deep despair on the Christmas Eve after his return from Iraq, Jeffrey hurled his dogtags at his sister Debra and cried out, “Don’t you know your brother’s a murderer?”
War isn’t natural. Humans are social primates; we need to live in densely clustered groups. As a result, we’ve evolved a set of powerful moral instincts that prevent us from hurting each other. Killing makes us feel bad, even when we are killing Sunni insurgents. It’s one of the more uplifting facts of human nature: each of us is born with a powerful moral compass, and this compass constrains our behavior.
Going to war forces soldiers to void this innate moral compass. Violence is normalized in battle, but violence isn’t normal, at least from the perspective of the brain. Thanks to fMRI research, we can now begin to see the neural source of this morality. Joshua Greene has done some of the pioneering work in the field. Greene has discovered that when we contemplate certain types of moral questions, a specific network of emotional brain areas becomes active.
What triggers this moral cortical network? Greene believes that our mind automatically distinguishes between “personal” and “impersonal” moral judgments. A personal moral situation occurs whenever we consider harming a specific person. (As Greene puts it, these actions can be roughly defined as “Me Hurt You,” a concept simple enough for a primate to understand.) When confronted with a personal moral dilemma, our emotional brain areas are activated, and we start to imagine what somebody else might experience if we pursued a certain course of action. Psychopaths, by contrast, seem unable to generate any emotions in response to personal moral situations.
To better understand this distinction, consider the trolley dilemma, a philosophical thought experiment first posed by Philippa Foot and famously elaborated by Judith Jarvis Thomson in the 1970s:
Suppose you are the driver of a trolley. The trolley rounds a bend, and there come into view ahead five track workmen, who have been repairing the track. The track goes through a bit of a valley at that point, and the sides are steep, so you must stop the trolley if you are to avoid running the five men down. You step on the brakes, but alas they don’t work. Now you suddenly see a spur of track leading off to the right. You can turn the trolley onto it, and thus save the five men on the straight track ahead. Unfortunately, there is one track workman on that spur of track. He can no more get off the track in time than the five can, so you will kill him if you turn the trolley onto him. Is it morally permissible for you to turn the trolley?
In this hypothetical case, about ninety-five percent of people agree that it is morally permissible to turn the trolley. Some moral philosophers even argue that it is immoral not to turn the trolley, since that decision leads to the deaths of four additional people. But what about this scenario:
You are standing on a footbridge over the trolley track. You can see a trolley hurtling down the track; it’s out of control. You turn around to see where the trolley is headed, and there are five workmen on the track… What to do? Being an expert on trolleys, you know of one certain way to stop an out-of-control trolley: Drop a really heavy weight in its path. But where to find one? It just so happens that standing next to you on the footbridge is a fat man, a really fat man. He is leaning over the railing, watching the trolley; all you have to do is to give him a little shove, and over the railing he will go, onto the track in the path of the trolley. Would it be permissible for you to do this?
The brute facts, of course, remain the same: one man must die in order for five men to live. If our ethical decisions were perfectly rational, then we would act identically in both situations, and would be as willing to push the fat man as we are to turn the trolley. (A strict utilitarian wouldn’t see any difference.) And yet, almost nobody is willing to actively throw another person onto the train tracks. The decisions lead to the same outcome, and yet one is moral and one is murder.
What accounts for this inconsistency? Thomson argued that pushing the fat man felt wrong because “it was an infringement on his rights.” We are using our body to hurt his body. The whole event is very visceral. In contrast, “turning the trolley onto the right-hand track is not itself an infringement of a right of anybody’s.” We are just shifting the trolley’s wheels: the ensuing death seems indirect.
This fuzzy moral distinction is built into our brain. When Greene asked people inside his fMRI machine to contemplate these two different trolley scenarios, he saw distinct patterns of activation. When they were asked whether or not they should turn the trolley, their standard decision-making machinery was turned on. Their brain treated the quandary like any other – it was an impersonal moral question – and quickly realized that it was better to kill one man than five men. That’s just simple arithmetic. It doesn’t take a moral philosopher to realize that it’s always better to kill fewer innocent people.
However, when these same people were asked whether they would be willing to push a fat man onto the tracks, a separate network of brain areas was activated. These are the same regions turned on by all personal moral questions, and they cause us to automatically imagine how the poor fat man would feel as he plunged to his death on the train tracks. We vividly simulate his mind, and conclude that pushing him is a capital crime, even if it saves the lives of five other men.
By using his fMRI scanner, Greene easily distinguished between these two variations on the trolley dilemma. He could see how personal moral questions are processed separately from impersonal moral questions, and how these different patterns of brain activation lead to contradictory choices. People couldn’t justify these moral decisions, but their certainty never wavered. Pushing a man off a bridge just felt wrong.
Why are we so sensitive to personal moral situations? The answer is evolutionary: thinking about each other is how we keep from killing each other. “Our primate ancestors,” Greene explains, “had intensely social lives. They evolved social mechanisms to keep them from doing all the nasty things they might otherwise be interested in doing. This basic primate morality doesn’t understand things like tax evasion, but it does understand things like pushing your buddy off of a cliff.” Our most visceral moral reactions are simply side-effects of natural selection, the necessary consequence of monkeys living in a group.
War violates our most fundamental moral instincts. It forces soldiers to consistently ignore some of their most basic emotions. These killings may be justifiable – sometimes war is necessary – but that doesn’t change the existence of our innate morality, which doesn’t comprehend these political distinctions.
It’s perhaps more convenient to believe that killing is a normal part of our biology. If men are Hobbesian brutes, then anything is possible. But our veterans know better. After four years of urban street fighting, after four years of close combat and tense gun battles and harrowing convoys, their psychological scars are tragic reminders that killing isn’t natural. Violating our innate moral code makes us feel bad, and those feelings linger.
Update: Thanks for the great comments. If I were re-writing this post, I’d take out that stupid line about war not being natural. Of course war is natural. It happens all the time. (I’m personally of the opinion that everything humans do is “natural”.) But I disagree that war or violence is somehow more natural than goodness, that humans and chimps are naturally predisposed to violence, that altruism and empathy are simply a veneer imposed by the dictates and legal codes of civilization.
Look, people and primates do evil things all the time. We are entirely capable of voiding our moral instincts, rationalizing away our evil acts. That goes without saying. My point was simply that this very basic primate morality – what Greene calls “Me Hurt You” – is built into the brain. It takes extra cognitive work to excuse our moral failings because it is our failings that make us feel bad.