Research on the role of emotion/intuition in moral judgments is really heating up. For decades (millennia, even), moral judgment was thought to be a conscious, principle-based process, but over the last few years, researchers have been showing that emotion and intuition, both of which operate automatically and unconsciously for the most part, play a much larger role than most philosophers and psychologists had previously been willing to admit. In this context, two recent papers by roughly the same group of people have presented some really interesting findings which, if you ask me (and if you’re reading this, that’s what you’re doing, damnit), really muddy the picture, but in a good way. One of the papers, a letter by Koenigs et al.1 published in last week’s issue of Nature, has been getting a lot of attention in the press, so I’ll talk about it first and save the other paper for another post. The Koenigs et al. paper compares the performance of patients with bilateral damage to the ventromedial prefrontal cortex (VMPC; see the image below, which I stole from this site) on four types of moral dilemmas to the performance of “normal” individuals and individuals with damage to other regions of the brain. First, a little about the VMPC.
As you can see in the image, the VMPC (red lines) butts right up against the amygdala, and the two regions communicate extensively. The amygdala is one of the main brain areas associated with emotion and the brain’s reward system. The VMPC is also connected to the brain stem and other areas associated with the reward system. So it’s thought that the VMPC plays a role in encoding the reward value of stimuli, as well as emotions like fear. In essence, the VMPC is part of the system that determines approach and avoidance behavior, so damage to the VMPC can make decisions related to the value of a stimulus more difficult. It stands to reason, then, that if emotion plays a role in moral judgments, damage to the VMPC could have a profound effect on those judgments.
So Koenigs et al. compiled 58 scenarios and placed them into three general categories: non-moral scenarios, impersonal moral scenarios, and personal moral scenarios. They further classified the personal moral dilemmas as high-conflict or low-conflict scenarios. Here are examples of each from the paper (all 58 are available in the paper’s supplementary material, which can be read here without a subscription):
- Non-moral scenarios (18 total):
You are in charge of scheduling appointments in a dentist’s office. Two people, Mr. Morris and Mrs. Santiago, have called to make appointments for next Monday. The only available times for next Monday are at 10:00 AM and at 3:00 PM.
Mr. Morris’s schedule is rather flexible. He can have his appointment either at 10:00 AM or at 3:00 PM. Mrs. Santiago’s schedule is less flexible. She can only have her appointment at 10:00 AM.
Would you schedule Mr. Morris for 3:00 PM so that both he and Mrs. Santiago can have their appointments next Monday?
- Impersonal moral scenarios (18 total):
You are at the wheel of a runaway trolley quickly approaching a fork in the tracks. On the tracks extending to the left is a group of five railway workmen. On the tracks extending to the right is a single railway workman.
If you do nothing the trolley will proceed to the left, causing the deaths of the five workmen. The only way to avoid the deaths of these workmen is to hit a switch on your dashboard that will cause the trolley to proceed to the right, causing the death of the single workman.
Would you hit the switch in order to avoid the deaths of the five workmen?
- High-conflict personal moral scenarios (11 total):
Enemy soldiers have taken over your village. They have orders to kill all remaining civilians. You and some of your townspeople have sought refuge in the cellar of a large house. Outside you hear the voices of soldiers who have come to search the house for valuables.
Your baby begins to cry loudly. You cover his mouth to block the sound. If you remove your hand from his mouth his crying will summon the attention of the soldiers who will kill you, your child, and the others hiding out in the cellar. To save yourself and the others you must smother your child to death.
Would you smother your child in order to save yourself and the other townspeople?
- Low-conflict personal moral scenarios (11 total):
You are a young architect visiting one of your construction sites with your boss. Your boss is a despicable individual who makes everyone around him miserable including you.
It occurs to you that if you were to push him off of the building you are inspecting he would fall to his death and everyone would think it was an accident.
Would you push your boss off of the building in order to get him out of your life?
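To keep the design straight, here’s a minimal sketch of the scenario taxonomy; the counts come from the paper, but the category labels and structure are just my own shorthand, not anything from the authors:

```python
# Scenario counts from Koenigs et al. (2007).
# Category names are my own shorthand, not the authors' labels.
scenarios = {
    "non-moral": 18,
    "impersonal moral": 18,
    "personal moral (high-conflict)": 11,
    "personal moral (low-conflict)": 11,
}

total = sum(scenarios.values())
print(total)  # 58 scenarios in all
```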
After reading the examples, you can probably imagine how most people would respond to each type of scenario. In case you can’t, though, I will tell you that in previous research, “normal” individuals have tended to respond “yes” to the impersonal scenarios (e.g., they say they’d flip the switch in the standard trolley scenario), and “no” in the personal ones (they wouldn’t kill the baby or push the boss off the building). The standard interpretation of these results is that in the impersonal scenarios, people are making the moral decision using conscious reasoning. Specifically, they are thought to be using utilitarian ethical principles to make the decision to flip the switch and kill one person to save five. In the personal scenarios, however, people tend not to make utilitarian decisions, and researchers therefore believe that they are basing their decision on the emotional response the situation elicits. Personally smothering a child is just too upsetting to even consider, despite the fact that not doing so will result in the death of the child and everyone else in the room, including yourself.
Since impersonal moral scenarios are thought to recruit conscious, principle-based moral decision processes, and since VMPC damage generally does not result in cognitive deficits, we’d expect both patients with VMPC damage and normal individuals (as well as patients with damage to other regions of the brain) to behave similarly in these scenarios. However, if the decisions people make in response to personal moral scenarios are driven by emotion, then we might predict that patients with damage to the VMPC, who have trouble processing emotional value as a result of that damage, would behave differently in those scenarios than normal individuals (and other brain-damaged patients).
Consistent with these predictions, all three groups (VMPC-damaged patients, normal individuals, and patients with brain damage elsewhere) performed similarly in both the non-moral and impersonal moral scenarios. For the impersonal scenarios, all three groups said “yes” (that they would flip the switch, for example) between 50 and 60% of the time. There was also no difference between the three groups’ performance on the low-conflict personal moral dilemmas. All of the individuals in all three groups said “no” (they wouldn’t push their boss off the edge, e.g.) in response to the low-conflict scenarios. However, there was a difference between the normal individuals (and brain-damaged-elsewhere patients) and the VMPC-damaged patients for the high-conflict personal moral scenarios. The normal and non-VMPC brain-damaged patients said “no” (they wouldn’t smother the baby, e.g.) about 80% of the time in response to these scenarios, while the VMPC-damaged patients said “no” less than 60% of the time (in fact, their response rate was pretty close to 50-50).
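The key contrast can be put in rough numbers; these are just the approximate rates described above (the paper has the exact values), and the comparison is purely illustrative:

```python
# Approximate "no" (non-utilitarian) response rates on the
# high-conflict personal scenarios, as summarized in the text.
# Rough figures for illustration, not the paper's exact numbers.
no_rate_high_conflict = {
    "normal": 0.80,
    "brain-damaged elsewhere": 0.80,
    "VMPC-damaged": 0.55,  # i.e., close to chance (50-50)
}

# The headline finding: VMPC patients give the utilitarian
# ("yes") answer far more often than the comparison groups.
gap = no_rate_high_conflict["normal"] - no_rate_high_conflict["VMPC-damaged"]
print(round(gap, 2))  # 0.25
```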
What does this mean, then? Well, in one sense, the VMPC-damaged patients were more rational than the normal individuals. That is, while normal individuals responded to the impersonal moral scenarios based on moral principles, they responded to the personal moral scenarios based on their affective response to them, and therefore answered in a way that was inconsistent with the same moral principles. VMPC-damaged patients were significantly more likely to respond to the high-conflict personal moral scenarios in a way that was consistent with the principle, so it’s not a stretch to say they behaved more rationally. But why did they behave more rationally? And why didn’t they respond in a way consistent with the principle more than half of the time? (It might also be interesting to note that the variance for VMPC-damaged patients in the high-conflict condition was much, much greater than for any group in any of the other types of scenarios.) It almost seems as though they were just guessing, which would belie the notion that they were behaving more rationally. One could easily interpret the data as indicating that they just didn’t know how to respond to those scenarios. It’s as though they had the emotional reaction, which was telling them not to smother the baby, and the principle, which was telling them to smother the baby to save everyone else, available at the same time, and for whatever reason (perhaps difficulty in integrating the emotional response with the rest of the decision process), they were unable to decide between the two options. This would imply that, in normal individuals, integrating the emotional response into the rest of the decision process automatically causes the principle to be overridden.
In other words, it implies (to me, at least), that when people are making these decisions, both the emotional reaction and the moral principle are available at the same time, and one will win out over the other, depending largely on the strength of the emotional response (which is strong in the personal scenarios, and weak in the impersonal ones, at least when they’re just being read on paper). This would be inconsistent with strong intuitionist theories of moral judgment.
Unfortunately, this is pretty much as far as neuroscience can take us. It can’t tell us, for example, why VMPC-damaged patients have no problem saying no in the low-conflict personal moral scenarios. If it were just a matter of an inability to evaluate or integrate value information, why aren’t they willing to, say, throw the boss off a building to save themselves and others a great deal of stress? And it can’t tell us why they’re pretty much at chance on the high-conflict personal moral scenarios. It can’t even really tell us that it has to do with difficulty processing value. It could be that some other feature of the high-conflict scenarios that would be processed by the VMPC is causing the difference between VMPC-damaged and normal individuals. Behavioral data is the only way to tease these things apart, so much future research is needed to make real sense of the Koenigs et al. data.
1Koenigs, M., Young, L., Adolphs, R., Tranel, D., Cushman, F., Hauser, M., & Damasio, A. (2007). Damage to the prefrontal cortex increases utilitarian moral judgements. Nature, 446, 908-911.