Tuesday, June 5, 2012

Re-thinking the Milgram Study


Some fifty years ago, in August 1961, social psychologist Stanley Milgram conducted perhaps the most significant psychological study of the 20th century.  Participants were invited into his laboratory at Yale, supposedly for a study on the effects of punishment on memory.  The participants were told about the importance of the experiment:  “we know very little about the effect of punishment on learning, because almost no truly scientific studies have been made of it in human beings.”  The participants were asked to assume the role of the “teacher” and were told to administer an electric shock to a “learner” every time he made a mistake.  The shocks started at 15 volts and increased in 15-volt increments with every error, going right up to 450 volts – enough to kill someone twice over.

If at any time the subject indicated his desire to halt the experiment, he was given a succession of verbal prods by the experimenter, in this order:

1.  Please continue, or, please go on.
2.  The experiment requires that you continue.
3.  It is absolutely essential that you continue.
4.  You have no other choice, you must go on.

Of course, it was all staged: there was no real experiment, no suffering learner and no electric shocks.  Everyone but the subject was an actor.  Nobody cared about the effect of punishment on memory.  Milgram wanted to find out how much cruelty a person would be willing to inflict if ordered to do so by an authority figure.

In Milgram’s baseline study, every single “teacher” was prepared to administer “intense shocks” of 300 volts, and 65% continued all the way to the maximum of 450 volts (beyond the point labeled “Danger: Severe Shock”).

Generally, people interpret these results as a demonstration of the “banality of evil.”  Ordinary people are willing to follow orders from someone in authority without giving much thought to either the moral issues or the consequences of their actions.  That is, people are willing to hand their souls over to people in authority.

But I just heard an interview with Alexander Haslam, Professor of Social and Organizational Psychology at the University of Exeter, England, on WNYC’s Radiolab, and Haslam says that a closer examination of the Milgram experiments suggests the opposite conclusion.

What people don’t know is that the 65% rate of total obedience comes from Milgram’s “baseline” study.  He actually conducted over 20 variations of the experiment, each with a different result.  When, for example, the experimenter gave his instructions from another room, or was not a scientist, obedience rates were low.  And when the experiment was moved to a run-down office building in Bridgeport, Connecticut, obedience also went down.

But the most telling detail in the experiment was the reaction to the verbal prods.  In the one instance Milgram recorded in which the fourth prod was used – the only prod that was an outright order – the participant refused to go on.  Here was the exchange:

Experimenter: You have no other choice, sir, you must go on.

Subject: If this were Russia maybe, but not in America. (The experiment is terminated.)

In a recent partial replication of Milgram’s study by Jerry Burger, a professor of psychology at Santa Clara University, every time this fourth prod was used, the subject refused to go on.  In other words, every time the experimenter asserted his authority, the subject resisted.

Haslam believes that the subjects’ refusal to follow orders, while going along with appeals on behalf of the experiment, is decisive.  It shows that people are not sheep cowed by authority.  Instead, it shows that they can be persuaded to perform inhumane acts if they believe they are accomplishing a greater good – in this case, scientific study.  Haslam argues that the subjects were making moral judgments; it is just that those judgments rested on the experimenter’s effective persuasion that the experiment was a good thing that would expand scientific knowledge.

From this perspective, people do not deliver electric shocks because they abdicate their moral responsibility but because they believe what they are doing is right.  As one writer put it:  the moral danger we face is not that we are zombies, but that we are zealots.

It points to a possibility that is even more disturbing than Milgram’s original conclusion.  If effective leadership can convince people that certain behavior serves the greater good, those people seem willing to sacrifice their own moral sensibilities to achieve that good.  And they will not merely be willing; they will act with conviction.  As Blaise Pascal recognized some 350 years ago, when religion represented the greater good, “Men never commit evil so fully and joyfully as when they do it for religious convictions.”  James Carse echoes this idea when he says that the desire to eliminate evil “is the very impulse of evil itself.”  The atrocities in Rwanda, Kosovo and Darfur were committed not because people deferred to the judgment of those in authority but because they believed what they were doing was right.

Of course, that leaves the question of what counts as the greater good.  American soldiers, sailors and airmen wrought a great deal of carnage in order to defeat Germany and Japan in WWII.  And we now have our own “kill list” of targeted terrorism suspects.  Once again, we are back where we started – with the ambiguity of living.  We are constantly forced to ask ourselves, as Haslam puts it: “what is greater, and what is good?”

Here's the podcast.  The Milgram segment begins around the 10-minute mark.
