Would we still obey? The first replication of Milgram's work in over 30 years

Two years ago, we linked to a post about an ABC news program that claimed to have replicated Stanley Milgram's controversial experiments from the 1960s and 70s about obedience to authority. The original study tricked unwitting paid study participants into believing that they had administered potentially deadly shocks at the bidding of an experimenter. The cover was a "learning and memory experiment," allegedly designed to see if administering shocks would improve people's ability to memorize a list of words.

The shocks progressively escalated from 15 volts ("Slight Shock") to 450 volts ("Danger: Severe Shock"). Most people continued to give the shocks long after their "victim" (whom they believed to be another paid experiment subject, but who was actually an actor who never received the shocks) asked to be released from the chair he had been strapped into for the study, and even after he apparently fell unconscious from the treatment. It was a striking demonstration of the power of an authoritative voice -- even the temporary authority we grant to a stranger wearing a lab coat.

Shortly after Milgram published his results, new ethical guidelines (motivated in part by Milgram's work) made replicating the study impossible, leaving future generations to wonder if they would respond in the same way as Milgram's participants.

But psychologist Jerry Burger, with funding from ABC, believed he had come up with a way to replicate the study without unnecessarily endangering its participants.

Since the TV show wasn't a peer-reviewed work, we didn't comment on the study when it aired, but now Burger's results have been published in a peer-reviewed journal, so we decided to give it a closer look.

Burger's work was made possible by the observation that nearly everyone in Milgram's experiments who continued past the 150-volt level kept on administering the shocks right up to the maximum of 450 volts, when the experiment was stopped. It was at the 150-volt level in Milgram's studies that the "learner" asked to be released from the experiment. Many of the actual experimental subjects hesitated at this point, but usually they continued, and it seemed that this was the key decision point, when they convinced themselves that they'd persevere in the study no matter what.

To lessen the chance of lasting psychological damage, Burger stopped the experiment as soon as this crucial 150-volt decision had been made. The actor (who had been hidden from view) immediately returned to the study room and showed that he was unharmed, and the actual experimental subject (who had played the role of "Teacher") was told the true purpose of the study.

When I originally read about Burger's new experiment, one of the first questions that occurred to me was whether the "Teachers" were aware of Milgram's experiments, which are some of the best-known psychology studies in history. In fact, this had occurred to Burger, and when he recruited participants (via classified ads and flyers posted in public places), he rejected all who had taken more than two college psychology courses or who were undergoing treatment for serious psychiatric disorders. About 30 percent of potential participants were rejected in this way, and another 5 participants were rejected later on when it became clear that they were familiar with Milgram's work. After a clinical psychologist rejected 47 more people due to possible psychological problems that might be exacerbated by participating in the study, 70 participants remained.

These people were then subjected to nearly the identical procedure Milgram had designed 45 years previously. They were given $50 and told it was theirs to keep regardless of whether they completed the study. They were introduced to the actor who they thought was a fellow participant in the research. Through a rigged "random" drawing, they were assigned the role of "Teacher" and the actor was assigned the role of "Student."

They watched as the experimenter strapped the Student (a 50-year-old man) into a restraining chair and attached electrodes to his arm. The experimenter explained that the Student was to try to learn a set of 25 word-pairs read to him by the Teacher. He would be subjected to an electric shock every time he gave an incorrect answer. At this point the Student asked how dangerous the shocks were -- he was a little concerned because he had a slight heart condition. The experimenter said they were painful but not dangerous.

Then the Teacher was led into a separate room and the door was closed so they could not see the Student. They were shown the "shock generator" which was actually just a computer covered with a box designed to look like Milgram's original apparatus. The experimenter offered to give a sample 15-volt shock to the Teacher, which nearly every participant agreed to.

Once testing began, the Teacher could hear the Student through an "intercom" that was actually connected directly to the computer, which played pre-programmed sounds in response to each "shock" the Teacher administered. Every time an incorrect answer was given, the experimenter told the Teacher to give a shock, and as the voltage increased, each pained grunt from the Student was a little louder than the last.

The key moment occurred after the 150-volt shock was given. At this point, the Student said the following:

Ugh. That's all. Get me out of here. I told you I had heart trouble. My heart's starting to bother me now. Get me out of here, please. My heart's starting to bother me. I refuse to go on. Let me out.

Most people would like to think that if they were the Teacher, at this point they would also decide to opt out of the study. The experimenter had told them at the outset that they could opt out at any time and keep the $50. Prior to conducting his study, Milgram had interviewed psychiatrists, students, and middle-class adults, asking what level of shock they believed people would administer before refusing to continue. The psychiatrists predicted that most people would not go beyond the 150-volt mark. But of course, most people did continue, and in Milgram's study, over 60 percent administered the full 450 volts. Over 80 percent made it past the 150-volt mark.

How did Burger's 2006 participants compare? Seventy percent were willing to continue past the 150-volt mark, prompting the experimenter to halt the study. This result was statistically indistinguishable from Milgram's.
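
To make "statistically indistinguishable" a bit more concrete, here is a minimal sketch, in Python, of how one could compare two obedience rates with a standard two-proportion test (Fisher's exact test). This is not the analysis reported in Burger's paper, and the counts are illustrative assumptions reconstructed from the percentages above: 28 of 40 participants continuing past 150 volts in Burger's base condition (70 percent), and an assumed 33 of 40 (about 82 percent) for Milgram's comparable condition.

```python
# A minimal sketch, not the analysis from Burger (2009): comparing two
# obedience rates with Fisher's exact test. The counts are illustrative
# assumptions -- 28 of 40 (70 percent) for Burger's base condition, and an
# assumed 33 of 40 (about 82 percent) for Milgram's comparable condition.
from scipy.stats import fisher_exact

burger = (28, 12)    # (continued past 150 V, stopped) -- Burger's base condition
milgram = (33, 7)    # (continued past 150 V, stopped) -- assumed counts for Milgram

table = [list(burger), list(milgram)]
odds_ratio, p_value = fisher_exact(table)

print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.2f}")
# A p-value well above .05 means the two rates cannot be told apart
# statistically -- the sense in which the results are "indistinguishable."
```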

But maybe they continued so long only because there was no social pressure to quit. In the laboratory, with only an experimenter urging them on, and with shocks increasing only slightly from round to round, why not continue? Perhaps if there were social support for the idea of exiting the study sooner, people would be less likely to comply.

Thirty of Burger's participants did a slightly different task, this time with two actors. The same actor played the Student, but a second actor played the role of Teacher 1. Teacher 1 was told to administer the shocks, and Teacher 2 (the only real participant) was told to watch Teacher 1 and await further instructions. Teacher 1 administered the shocks up to 90 volts, at which point he refused to continue. The experimenter then told Teacher 2, the real participant, to continue. In this version of the task, 63 percent of participants still continued all the way past 150 volts, when the experiment was stopped. Again, this was statistically indistinguishable from Milgram's results and from Burger's other participants.

I don't think Milgram, were he still alive, would be surprised by this result. He repeated his study in a variety of settings with similar results -- even once staging it in a dingy, run-down lab in a depressed area of town instead of on the staid Yale University campus.

What does limit this type of unthinking obedience? Milgram himself offers some guidelines in Obedience to Authority:

  • The experimenter's physical presence has a marked impact on his authority. As cited earlier, obedience dropped off sharply when orders were given by telephone. The experimenter could often induce a disobedient subject to go on by returning to the laboratory.
  • Conflicting authority severely paralyzes action. When two experimenters of equal status, both seated at the command desk, gave incompatible orders, no shocks were delivered past the point of their disagreement.
  • The rebellious action of others severely undermines authority. In one variation, three teachers (two actors and a real subject) administered a test and shocks. When the two actors disobeyed the experimenter and refused to go beyond a certain shock level, thirty-six of forty subjects joined their disobedient peers and refused as well.

Of course, as Burger's study shows, the nature of the rebellious action counts. Just one rebel doesn't incite participants to join the cause. Milgram's staged "rebellion" was more dramatic than the one in Burger's study. Perhaps only when rebels outnumber authority figures can disobedience readily spread.

Burger, J. M. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64(1), 1-11. DOI: 10.1037/a0010932

People are still nasty little buggers, aren't they?

This study is classic in more ways than one, Dave. Especially important is the way we can marshal controlled, clever study designs to answer behavioral questions that everybody has very firm (and all over the map) predictions about before the data come in.

By DrugMonkey (not verified) on 26 Jan 2009

I wonder what would happen if one (or both) of the rebelling actors were punished...

Slight correction--the 450 volt level was beyond "Danger: Severe Shock", and was labeled "XXX". Also, if I am not mistaken, the 60% obedience rate (up to the full 450) was found in the first condition, in which there was no intercom. The most famous version is the second condition (the subject of the film), in which (if memory serves--I am away from my books) about 50% went all the way to 450. This is the version with the "I told you I had a heart condition! Let me out of here!", the refusal to answer, the bloodcurdling screams, and the horrifying silence.

So, people likely to have heard of Milgram's original experiment were screened out. It might almost be more interesting to see whether - in an experiment of similar nature, but different enough to avoid direct comparison - people who know about the original experiments are more likely to rebel than those who don't. Is educating people about this a good way to fortify them against similar situations?

Rachel@ #4

There have been very different replications, in very different situations, which you may or may not feel are in the spirit of Milgram's experiment. If memory serves (still not near my books), these include obedience in hospital settings (an anonymous "doctor" ordering a nurse to give a dose of a controlled medication to a patient), airline situations (in a flight simulator, a pilot deliberately taking a path directly into a mountain, to test whether copilot and navigator would challenge the pilot's authority), and perhaps others I am forgetting. Obedience is common in those situations.

One of my students had a real-world example: she was a nurse in a pediatric burn unit. Doctors told her to scrub the open burn wounds with a brush--it is, I gather, to prevent scabbing, but she was not told that. She was, literally, being asked to administer a terribly painful procedure against the child's will, without adequate explanation.

Ahh -- this explains why we as a society shock people to subdue them without getting too het up about the possible consequences. Well, we don't, but our law-and-order reps do, and no one seems too concerned. Wow.

Anon --

You're right about the "XXX" marking, thanks for the clarification.

As you know, Milgram tested many different versions of his paradigm. The particular test Burger was replicating was Experiment 5, from 1974. In this version, 65 percent indeed went all the way to 450 volts, and an intercom was used.

Authority is powerful. Most of our behavior is directed by unconscious mental processing. Authority is just one example. There are many others. (I just wrote a book on it).

I have a neat little book on the variant obedience experiments - it's quite surprising how many there were. Particularly interesting were those done on the military (with far higher obedience) and on nurses (again, with amazingly high levels of obedience). One wonders whether stricter practices, more power in the hands of nurses, and a willingness to question doctors may have lessened the effect in nurses - but I wouldn't hold my breath.

By Epinephrine (not verified) on 27 Jan 2009

Was there any attention paid to the religious affiliation (or lack thereof) of the participants? I'd be curious to know if there was any difference between atheists and believers, given how ready some believers are to describe atheists/atheism as evil!

What the new study shows is of course that people without good education are mean buggers, while us enlightened types may very well be much nicer. (-;

Milgram himself did a lot of versions, figuring out what raised and lowered the number of compliant subjects, but were there many studies on what separated the rebels from the conformers? Like, what traits allow one to predict whether a subject will continue or stop? For instance, did anti-authoritarian political views translate into resisting the experimenter's wishes?

By swbarnes2 (not verified) on 27 Jan 2009

This experiment is relevant to many current situations, such as Abu Ghraib, military atrocities, and police misconduct. It isn't so much that we're 'nasty buggers'; it's more about how our decisions are highly context dependent and also dependent on certain types of cues. This context dependence, however, is not something we are aware of. Especially in the West, where autonomy is so overvalued, this idea makes us pretty uncomfortable. Ambiguity about role and expectation exacerbates indecision and therefore the willingness to "go along."

Another thought: Did anybody think to ask the participants what they were thinking? My guess is that they were thinking something like, "Well this is sanctioned research, presumably approved by some regulatory body, and the University wouldn't let something dangerous or deadly occur here, so I'll go ahead and play my role." The artificiality of the situation may also have created doubt about whether the "Student" was actually in danger.

By Mark Elliot (not verified) on 27 Jan 2009

What do the writers think of the personality tests that were involved in the study?

I read about this paper a while ago via the 'dispatch' blog and became quite fascinated. I read the book about the Milgram experiments by Milgram himself, and while there were situations where 'obedience' steeply dropped, there was also a situation with very high obedience rates, on the order of 90+%. That was Experiment 18, in which Milgram tried to replicate a 'bureaucratic' system. What strikes me is that being a part of this little torturing group made such a big difference in comparison with Experiment 13a, where the teacher was replaced by a member of the experimenting group, who continued administering increasingly heavy shocks. In that experiment, 16 out of 20 people refused to go past 150V, and of these sixteen, four literally attacked the new teacher or tried to sabotage the installation.
Do the writers also have a point of view on these differences? 'Cause I tend to find them quite interesting/disturbing.

By beelzebub (not verified) on 27 Jan 2009

So would it be possible to recreate the Stanford prison experiment today?

By Pseudonym (not verified) on 27 Jan 2009

It wasn't even possible to complete the original Stanford Prison Experiment. Replicating it seems completely out of the question.

I think, or at least I'd like to think, that I would have started asking questions well before the 150-volt mark -- at least to ask why, or how, this was going to help.

Perhaps I am unique in that my parents never punished me without explaining in exquisite detail exactly why I was being punished and how it would help me. AND, I was allowed to ask questions during this time, through tears even.

My parents would probably be arrested for mental cruelty today... though I begged for physical cruelty because it was so much faster and required so much less thought.

I would love to know about the religious piece as well. I am sure it could be hypothesized that those with a strong religious base were more likely to follow set rules mandated by authority. I would also love to see whether the particular religion made a difference. For example, Catholics might be more obedient and Quakers or Unitarians less so.

I suspect this replication result could also be achieved by having participants read about the study. A list of the voltage levels could be provided, and participants would check off the voltage at which they would stop (with the learner's actual feigned protests from the original study indicated at the corresponding voltage levels). Perhaps most would check 150V, the reason being that there are likely similar (if not identical) emotional and intellectual responses that occur independent of the actual situation, and that are probably connected to moral thinking and especially to moral disengagement, as described by Bandura.

For me, the Milgram obedience experiment represents an instance of moral engagement/disengagement (possibly triggered around 150V) that is induced by the situation; and it is most likely the same phenomenon that makes it possible for (presumably non-psychopathic) soldiers to kill.

A telling Milgram variation in that regard was one in which participants were free to choose the shock level. With this variation, 3% of participants chose to administer the maximum (450V) shock. An interesting question is whether these 3% were psychopaths, or whether they were quite certain the experiment wasn't actually real.

By Tony Jeremiah (not verified) on 28 Jan 2009

A lesser-known aspect of Milgram's research is his extensive follow-up of his subjects. The vast majority were glad they had participated, indicated they had gained valuable insight, and shifted in their attitude toward blind obedience; none reported or demonstrated long-standing traumatic effects from participating.

Highly competent, humane, and ethical follow-up by Milgram, and yet he is cited as the paradigm of unethical social psychology research.

By Epictetus (not verified) on 28 Jan 2009

Another thing to consider is that under these circumstances, most participants are very unlikely to truly believe that the "student" is in real danger because they know that no real scientist is going to risk killing a human test subject (in no small part because we live in a very litigious society).

A better assessment of people's willingness to do harm would be to have a different kind of authority figure directing them, one who may, in real life, risk the life of another - a law enforcement officer or a military figure, for example. As long as the authority figures in these experiments are wearing white coats and are supposedly running official scientific experiments, it is difficult (if not impossible) for any test participants to really believe that the experiment is actually going to result in someone's death and that the scientists are willing to go that far.

^ Although that was covered in Milgram's original studies: he reported physical signs that the participants were genuinely stressed -- shaking hands, sweating, and many other signs -- all of which showed that they believed the situation was real.

If the participants had been nonchalantly hitting the buttons, then maybe the criticism would stand up better, but the physical responses suggest that any such thought would have been a minor footnote in the person's mind -- and surely, in a case where the person knew it was being faked, they would be more likely to stop due to perceived social values, etc.

I'm not too familiar with all of Milgram's experiments but I wonder if he tried any where he varied the identity of the Student. I would imagine that many people would be less likely to administer pain to a delicate-looking girl than a robust-looking man (perhaps especially men who have been taught "not to hit girls"?).

Also, how much would the study's subjects know about it before they signed up? I know that if I saw a study in which I would be required to give anyone shocks, even mild ones, I wouldn't participate in it; I'm just that averse to inflicting/seeing pain. Is there a bit of a selection bias in the participants?

I'm involved in a discussion forum that focuses on the nature and perception of evil. Milgram's work is a focus, and in a small poll asking whether people believed they would have delivered the maximum shock, 100% of the respondents said "no" in no uncertain terms. It was a small sampling, but there is really no reason to think anybody would have responded differently. Even when we know the work, and see the results replicated time after time, we still maintain a sense of personal invulnerability to influence.

http://dealwiththedevil.yuku.com

I've been so intellectually appreciative of Milgram's experiments that I once spent over two hours speaking about them on the air. Most of what I could say now I either said or could have said then, aside from this: I've come to realize that I myself have undergone a radical change, such that I could easily see my iconoclastic, authority-hating self going through to the highest level of shocks. In fact, I don't even regard it as obvious that I shouldn't do so. Sick, right? Well, it's no less a crime than allowing five people to die of hunger and disease who otherwise wouldn't have died had you sent that money to get them the requisite mosquito nets rather than getting yourself an iPod mini.

Now, in most human cases, I would imagine that intuitive "morality" remains, and these "moral riddles," if you will, regarding mosquito nets and whether you'd flip the switch on the ever-careening train, remain nothing but occasional two-minute intellectual diversions of no real consequence. Apparently, however, I may not be one of those normal cases. Varied circumstance has brought these moral questions to the attention of my 'deciding self' (I'm sure there's a proper neuropsych term for it) and replaced parts of my moral filters, such that I don't really wonder whether I'd deliver the shocks in a similar type of experiment where I was unawares, because I don't regard any particular outcome as being overly revelatory in some grandiose way. I realize that in attempting to describe this surprising bit of apathy residing in some part of my cranium I'm doing an injustice to the truth, in that I come across as sociopathic and lacking in human empathy; for the sake of accuracy I should clarify that this is hardly the case. But, well -- eh, I'll leave further introspection to my non-literary hours rather than spend them at the bottom of a slumbering comment thread.