Thursday, June 7, 2012

Thinking about Re-thinking the Milgram Study

[Update: This article was written based on an NPR Radiolab show and an article in The Guardian by Alex Haslam. It turns out neither of these are very representative of Alex Haslam's work. The post stands as a criticism of those two publications, but should in no way detract from what I now see as excellent work by Alex Haslam. Please see Re-thinking Alex Haslam.]

Introduction
Not many things are more surprisingly instructive than the Milgram experiment. I welcome Haslam and NPR Radiolab's reinterpretation of the significance of the experiments, but caution against the dismissal of the common interpretation: that people are staggeringly willing to obey an authority who offers to relieve them of moral responsibility.

First of all, I urge you to view the post "Our wobbling morality". It is about 15 minutes long, but I posted it specifically with Milgram's experiments in mind. Dan Ariely, in his talk, explains better than I can some crucial ideas I will present here. At the beginning and end, he focuses on the importance of experimentation in order to develop what he calls intuition-free results. That is important, but the more interesting part is the middle of the talk, where he discusses the factors that sway our moral compass or, as he calls it, the "personal fudge factor".

And that is precisely what the Milgram experiment showed: that our moral compass can be swayed—spectacularly. If Haslam wants to focus attention on the participants' wrestling with the moral questions of "what is greater, and what is good", I say fantastic. However, that is a fairly banal comment on any moral issue. And if Haslam wants to remind us that "the banality of evil" is too simple an explanation, then I say terrific. However, I think he is obfuscating the importance of Milgram's experiment. The focus should remain on how authority skews our moral decision making. In other words, on how authority, combined with other factors, skews how we think about "what is greater and what is good".

Haslam (and Reicher)
Myk does an excellent job summarizing the reinterpretation, but, for the record, it's important to report Alex Haslam and Stephen Reicher's own words. (They are the same duo who set up the BBC's 2001 prison study as a sequel to the famous 1971 Stanford prison experiment. I'm sorry to say it appears they have but one hammer, and everything, including Milgram's experiment, looks like a nail to them.) Here is their conclusion:
From this perspective, people do not deliver electric shocks because they are ignorant of the effects but because they believe in the nobility of the scientific enterprise. For now, this can be no more than a provisional conclusion. But it points to a possibility that is even more disturbing than Milgram's original account.  
People don't inflict harm because they are unaware of doing wrong but because they believe what they are doing is right. We should be wary not of zombies, but of zealous followers of an ignoble cause.
[The link is part of their statement.]

Put simply, yes, we should be wary of zealous followers of an ignoble cause, but with regard to the Milgram experiment, this is bullshit. And what creates zealots, if not relying on an authority to lessen the burden of our own moral decisions?

Here is the irony. They have little or no experimental data. To their credit they admit "this can be no more than a provisional conclusion", but they have reformulated the central issue from 'sheepishly obeying authority' to 'zealously following an ignoble cause'—in this case, science. Why would anyone embrace this reformulation? Haslam and Reicher are relying on their own well-respected authority in psychology. Yes, they are pulling a Milgram experiment on us once again. Shame on them and shame on Radiolab.

Looking at the actual data and thinking
Instead of plucking mindless platitudes like "what is greater and what is good" out of the air or, perhaps, out of their prison experiment, let's look closely at the actual experiment. (Here are a few clips.)

Anyone who knows anything about the experiments knows that the participants did indeed wrestle with their own morality. They balked, questioned and sweated with every flip of the switch. In fact, Milgram spent significant time debriefing the participants to avoid lasting psychological effects. Indeed, the significant strain on the participants caused many to condemn Milgram. Today psychology experiments must be reviewed by institutional review boards; Milgram's experiment could not be completely duplicated today.

[A less severe replication was conducted in 2007. For those who think that today we don't follow authority as we did in the past, the results were essentially the same.]

They wrestled, yet they went on. Clearly they didn't wrestle nearly enough. Now, Alex Haslam and Stephen Reicher want us to believe that they wrestled not against the pressure of authority but over zealously defending science. Belief in scientific research surely was a factor, but certainly not as important as the presence of an authority figure. In fact, when Milgram removed the physical presence of the authority figure and gave all instructions by telephone, the full compliance rate dropped from 65% to 20%. Are they saying that removing the authority figure caused a significant loss of zeal for science? Again, shame on them for calling themselves scientists.

Beyond that, if you've seen any of the experiments, you will notice how awful the scientist is. He acts like a former Gestapo officer (probably not a coincidence). I'm amazed this character could elicit 65% full compliance. Replace him with an attractive female scientist, who sympathetically reassures you, "I know it's difficult, but I take full responsibility", and I could imagine 100% compliance.

How many participants spoke in the post-experiment interview about feeling conflicted between delivering electric shocks and the importance of the experiment; and how many spoke about delivering the shock versus doing what he or she was told? Why don't Haslam and Reicher provide this data? Watching the participants react, do you really think they are worried about failing science, or failing the scientist? Do any of them really sound like zealots of science? Haslam can dupe NPR, but not the rigors of In Progress.

But what really steams my hash is not just that Haslam on Radiolab is misleading, but that he is shallow. Milgram and others (like Ariely) have done penetrating research into the factors which sway our moral compass. Haslam glosses over these results, misreads their importance and goes so far as to suggest they invalidate Milgram's findings.
Haslam: Every experiment produces a different result.
Radiolab: Really?!
Haslam: Yes 
Duh?!…as if this should disturb us. That is the whole point of the research!

When the learner receiving the shock was in the same room as the participant (recall Ariely's point about proximity to money), full compliance went down to 40%: forty percent of the participants went all the way even while watching the person being tortured! When Milgram asked his students, his fellow professors and the public how many would take it to the limit, all groups responded between 1 and 2%. That should be the baseline. These are mind-boggling results. But the point of the research is that we learn the closer we are to the victim or the pain, the less likely we are to proceed.

When the participant actually had to make physical contact with the learner, pressing the learner's hand down onto a shock plate, full compliance was still an incredible 30%! But, consistent with the last finding, at least it is lower.

Let's look at Haslam's clincher. When the experimenter uses the fourth prod, "You have no other choice, sir, you must go on", no one goes on. Haslam thinks this proves people are not sheep cowed by authority.

First of all, 65% never get to this prod. Secondly, on the contrary, what this prod does is similar to being reminded of the Ten Commandments in Ariely's experiment. The participants see, for the first time, that they do have a choice. They recognize that they have cowered before authority and realize they can make a moral decision on their own. Furthermore, they see the authority as a false authority. Everyone knows you have a choice; they have just been reminded of this. The authority is wrong and stupid—no authority at all. The participant gains confidence.

The 'best' results, other than after being reminded that you should not shirk your moral responsibility, come when there are two others in the room and they argue about going on. Full compliance drops to 10%. Still 10 times what we think it should be, but significantly lower than 65%. This, surprisingly, is even better than when a second person in the room refuses to continue. (Again, refer to Ariely and the Carnegie Mellon/Pittsburgh sweatshirts.) This is an important point. Discussion brings us to our senses even better than someone demonstrating that we may rebel. It gets us thinking.

Final comment
There's more to be said here. Remember, Milgram's experiment shows discussion makes us wiser. But let me conclude with this. As far as I can tell, no testing was done of which specific types of people or groups would perform more morally in the Milgram experiment. But from the real scientific research that Milgram and Ariely have done, I know what group I would bet on.

It is a fictional group I will call Followers of Myk's Religion. There are two tenets of this religion:
  1. Belief in a God "which destroys all gods that we might ever want to use as cover or justification for our actions." 
  2. Ritual… or as Myk sometimes calls it, play - specifically, ritual that involves daily invoking the first tenet. 
This group, I feel, would perform as well as Ariely's group when reminded of the Ten Commandments, i.e., no cheating, no torturing, and, because of ritual, they would not need to be reminded.

8 comments:

Alex Haslam said...

Hi James,

You make some very good points here, but also misconstrue the main points we are trying to pursue in this line of research.

The most important point you make is that our alternative analysis requires robust empirical support. This is definitely true, but we now have a couple of studies (one in press at Perspectives on Psychological Science, another that we are working on now) that appear to be far more supportive of our analysis than they are of Milgram's 'agentic state' model. Second, it is also true that in a radio interview some of one's points can come over as 'shallow' (something for which I apologize -- especially if this was seen as demeaning to the subject matter). Nevertheless, I think it is hard to sustain this as a criticism of our published work.

Where you are most wrong, though, is when you suggest that we are somehow dismissive of Milgram's contribution to the field. Nothing could be further from the truth. Milgram's work is of supreme importance, for many of the reasons you outline (and more). In particular, I agree with your observation that the studies are powerful demonstrations of the capacity for people's moral compass to be recalibrated in horrifying ways. What we do question, though, is whether this is a product of 'blind obedience' (as the agentic state account is typically understood to suggest), or whether instead it reflects a form of active *followership* predicated upon identification with those in authority. If it is the latter, then this has quite profound (i.e., not 'shallow') implications for our understanding of the psychology of tyranny.

I don't expect that it will be easy to win people round to this view (a view confirmed by the shrillness of your critique), but I do believe that the more one interrogates the relevant data, the more convincing this reinterpretation becomes. Certainly, this is what I would urge your readers to do (perhaps starting by reading our 2007 paper on the 'banality of evil' in Personality and Social Psychology Bulletin).

One of the things that made Milgram such a brilliant scientist is that he put so much of his own data into the public domain. One reason why he did this, I think, was because he realized (as do we) that his own analysis was not the end of the story, but the beginning.

Regards
Alex Haslam

James R said...

I am shocked—close to 450v worth. There are some very clever writers who follow this blog, but apparently Alex Haslam is the real Alex Haslam. This, to the best of my knowledge, is not a hoax.

First of all, I want to apologize for being unnecessarily harsh. It was a bit out of character and, furthermore, I don't want to end up in one of your publications as a zealot of an ignoble cause. Secondly, I apologize again because I know very little about Psychology and you shouldn't even be talking to me. Thank you so much for your response.

I will, as I'm sure others will also, respond after I have read more of your work and thought more fully on it and your letter.

I will say this. While I stand behind my post, if, in fact, the general interpretation—as you call it, the 'agentic state' model—means that the participants were zombies, not actively wrestling with morality, then we are closer than I originally thought.

james said...

James R- I think the distinction Haslam makes is that it is incorrect to view obedience as a passive act. The subjects in Milgram’s experiments were conflicted by what they were doing because two deeply held, positive principles were at war with each other-- 1) don't inflict harm on others and 2) the pursuit of science is a noble endeavor. The second positive principle is given greater weight than the first, and the subjects find themselves willingly administering the Looney Tunes-esque “XXX” shock to the confederate. So if one believes in a specific principle-- “Scientists are good”, for instance-- then scientists (and those claiming to be scientists) can leverage it to get people to do incredibly immoral things.

I think the slight confusion for me lies in the distinction Professor Haslam is making. Isn’t Arendt’s characterization of Eichmann as unthinking and blindly obedient the same thing as active ideological zealotry, just said in a different manner?

Haslam says that “People don't inflict harm because they are unaware of doing wrong but because they believe what they are doing is right. We should be wary not of zombies, but of zealous followers of an ignoble cause.”

I totally agree with this, but “zealous followers of an ignoble cause” are zombies. Ideologues by definition abandon the self-critical thought necessary to uphold even basic moral truths, like “it’s wrong to blow up innocent revelers in a Balinese nightclub.” I would argue that a rabid Hamas supporter who believes that the Protocols of the Elders of Zion is true is both blindly obedient and a zealot. Same with the Catholic bishop who chooses to relocate a child-molesting priest instead of turning him in to the police. The list goes on.

I liked Haslam’s quote (paraphrased from memory): “Before you do something for the greater good, ask yourself if it’s great or if it’s good”. Hitchens used to express this sort of sentiment a lot-- always question your principles and be able to give good reasons for why you believe what you believe.

James R said...

Well said and very true, but calling "zealous followers of an ignoble cause" zombies may not be the best vocabulary in dealing with the problem. These are zombies who are vigorously energetic, politically shrewd, and mentally skilled—not the normal image.

However, I say it again: I believe this is NOT what is going on in the Milgram experiments. These are not zombies in either sense of the word. They are you and me, but they are conflicted, not by ideology—most of them probably don't even know what the purpose of the experiment is—but by the desire to be a nice person! They don't want conflict with the experimenter/scientist. They don't want to achieve any goal. They don't even want to be there. They want to be a good subject and do what they are told. They don't want to make a moral decision. And that desire to be nice to both the experimenter and to the confederate to whom they are administering Looney Tunes-esque "XXX" shocks is placing them in a living hell.

James R said...

Let me add this, which may clear up some confusion. I know the Milgram experiment has been linked exhaustively to Eichmann, Arendt and "the banality of evil". If that is the common interpretation, then there needs to be reinterpretation. The Milgram experiment showed something very different than "normalizing the unthinkable" or "doing terrible things in an organized and systematic way".

The subjects of the experiment were not caught in a routine. This was something very new to them. Forget Eichmann. In my view the Milgram experiment shows we do, indeed, actively wrestle with our moral choices. This is obvious by watching the tapes. But the wrestling is not about an ideology, IN THIS CASE. It is about wrestling with authority. They are actively trying to do right, but are failing miserably.

Big Myk said...

There's a lot to be said here, but just two thoughts for now. One is that the enthusiasm and fervor with which people throughout history have engaged in human cruelty suggest something other than an agentic state or passive obedience. Take the persecution of heretics, for example. Because people identify with the authority, they will relentlessly carry out its program.

Second, if we take Dan Ariely seriously, there is every reason to continue to explore and test Milgram's theories. It seems as if Jim has fallen into an agentic state with regard to Milgram's original conclusions. As Ariely concludes his TED speech: "We have very strong intuitions about all kinds of things -- our own ability, how the economy works, how we should pay school teachers. But unless we start testing those intuitions, we're not going to do better."

James R said...

Myk - I know you are being clever, but I can't tell if you are being sarcastic. In my post I criticize Alex Haslam for a lack of testing, to which he assures us it is coming. I spent as much time as I dared focused on the actual testing—notably, what the tapes show and the actual data. The importance, I keep insisting, lies in the testing, the data, and the different results. I may be accused of a lot of things, but I think the overly long post exploring and mining the data, and the plea that discussion makes us wiser, prove me innocent in this case.

We see from the tapes that there is not passive obedience, but active wrestling. We see that the participants don't have the strength to stop doing what they are told, yet they fret terribly about it. The tapes also show that while there is concern for the importance of science, it's not nearly as powerful a force as the authority figure. Ideology matters less here than an authority. (Here, but not with rabid Hamas supporters or child-molesting priests or fervent persecutors.)

The post-experiment interview points this out dramatically, as one of the participants says over and over again in a heated tone, "Well, I wanted to stop, but he wouldn't let me stop. He told me to go on."

Perhaps I'm ignorant of Milgram's original conclusions. The conclusions I wrote in the post were that "participants wrestled with their own morality" and that "authority, combined with other factors, skews how we think about what is greater and what is good."

Also, I'm not looking at the cruelties of the world and trying to find them in Milgram's experiment. I'm looking at the experiment and see how that may be played out in the world. There are far too many and varied cruelties to link them all to Milgram's experiment.

Today we are mostly concerned with the cruelty inflicted by ideological zealots, but Milgram's experiments deal more with non-ideological issues. There was no common belief or purpose among the participants other than to carry out a task as instructed.

Big Myk said...

I concede that I was unduly harsh in my last comment and, yes, mostly I was trying to be clever. I really don't have a dog in this fight. I just thought it was interesting that two bright guys could look at the data and reach differing conclusions. At the very least, Haslam's theories are tantalizing.

But I think that both Haslam and Milgram would agree that circumstance and not character is the better predictor of the loss of moral compass. I am interested in what circumstances cause people to take disturbing actions and how they might be resisted. Sometimes it only takes a single person to break the cycle, like a black woman refusing to give up her seat on a bus.

As I have argued before, this is the role of religion: to show us what it means to be in but not of the world, to live apart from circumstances: "from now on those who have wives should be as though they had none; and those who weep, as though they did not weep; and those who rejoice, as though they did not rejoice; and those who buy, as though they did not possess; and those who use the world, as though they did not make full use of it." 1 Corinthians 7:29-30.

That's why it's doubly tragic that religion has given itself over to one side of the culture wars.