In college I once sought to earn a little extra credit by participating in a psychological research study. Six of us -- three male and three female sophomores -- were taken into a conference room and seated around a table. A researcher told us that we would be asked true-or-false questions and that if we got the questions wrong we would be administered a mild shock. That's right. Wrong answer? Zap! He then asked if we still wanted to participate. The six of us looked uneasily at one another. For a few seconds no one said anything. I remember thinking anxiously that I didn't want to be shocked, but neither did I want to look like a wimp, especially in front of three good-looking coeds. And I did need the extra credit. Besides, how bad could it hurt? Finally a male student with a pronounced rural accent said, "I don't want to participate." Asked why not, he replied, "I don't want to be shocked." He was allowed to leave the room. "Anyone else?" the researcher asked. Well, I didn't want to be shocked either, but I felt it was too late to back out now. I would look ridiculous. The researcher then closed his file and said, "Okay. That's it. We're done."
What a sense of relief. Yet I left the building with a burning sense of shame. Had the student who backed out been a coward, or had he been the only sensible one, the only one brave enough to resist peer pressure? Why was I about to allow these brutes to do something to me that I obviously didn't want done? It wasn't as if I had a chance of getting all the questions right. And if the questions involved mathematics I probably wouldn't get any right. I would have been roasted like a Christmas goose.
That was in the early '80s. By then psychologists had been trying to understand why ordinary men do beastly things to complete strangers ever since reports came back from Poland about the Nazi death camps. By the 1980s, however, most of these experiments had been deemed unethical or inhumane, though they were seldom as cruel as some of the hijinks that went on day and night at the frat houses next door.
The first of the two major textbook experiments of that period was, of course, the 1963 Milgram study (not coincidentally conducted during the trial of Adolf Eichmann), in which ordinary men (called "teachers"), ostensibly participating in a study assessing the effects of pain on learning, were told to administer shocks to a "student" (in reality an actor) at near-lethal voltages. Despite horrible cries of pain from the "student," the majority of "teachers" pressed on with the experiment. Researchers concluded that "relatively few people have the resources needed to resist authority."
The other experiment was Philip G. Zimbardo's 1971 Stanford Prison Experiment, in which 24 model college men were hired for a two-week period and randomly assigned to role-play guards and prisoners. On the second day there was a revolt, and the guards cracked down. The guards had been warned against using physical violence, so they quickly found ways to psychologically torment their prisoners.
By the end of the first week the guards began engaging in horrible mental cruelties, including forcing prisoners to simulate sodomy. Zimbardo was forced to abandon the experiment after a mere six days. God only knows what atrocities might have occurred had the experiment gone on another week.
"I wanted to know who wins -- good people or an evil situation -- when they were brought into direct confrontation," Zimbardo writes in his new book The Lucifer Effect: Understanding How Good People Turn Evil. His conclusion is that "the Situation controls you." If the Stanford Prison Experiment taught one thing, he writes, it is that we are all capable of being bad apples if placed in a bad barrel. He even quotes Aleksandr Solzhenitsyn's Nobel Prize address during which the novelist stated that, "the line dividing good and evil cuts through the heart of every human being."
Zimbardo was greatly annoyed at the Pentagon for blaming the abuse at Abu Ghraib on "a few bad apples," so irritated in fact that he agreed to appear as an expert witness for one of the accused. He notes that the guards at Abu Ghraib were told to "take off the gloves" and "soften them up," which is exactly what they did. Indeed, the behavior of guards and prisoners at the Stanford mock prison and at Abu Ghraib was eerily similar: prisoners were stripped naked, hooded, chained, denied food or bedding privileges, put into solitary confinement, and made to clean toilet bowls with their bare hands.
TO BE FAIR, Zimbardo does not completely exonerate evildoers nor does he exempt humans from taking personal responsibility for their actions, though he maintains that we "exaggerate the extent to which our actions are voluntary and rationally chosen -- or to put it differently, we all understate the power of the situation....The situation and the system creating it also must share in the responsibility for illegal and immoral behavior."
To his fiercest critics, Zimbardo is a charlatan who manipulates his research to fit his liberal theories in order to blame society, rather than the individual. William Saletan of Slate put it bluntly: "The point of the Stanford experiment...was to discredit personal responsibility." Indeed, follow-up studies have found that the subjects who refused to continue participating in Milgram or related experiments possessed a higher sense of personal responsibility for their actions, while those who continued to administer shocks felt little or no responsibility for their actions. They were simply following orders. In a recent ABC Primetime "low voltage" re-creation of the Milgram experiment, one of the "teachers" is heard to ask a researcher who is to blame if something goes wrong. The researcher says he will assume responsibility. The "teacher" then says, "That's all I need to know," and continues zapping his student.
Since the Abu Ghraib scandal broke, prison commander Gen. Janis L. Karpinski has steadfastly refused to accept personal responsibility for the abuses in her camp. Instead, Gen. Karpinski blamed Secretary of Defense Donald Rumsfeld and anyone else she could think of. If our top military generals refuse to accept responsibility for their lack of oversight and for the conduct of the guards under their command, how the hell can we expect their underlings to stand up? It now seems the "missing resources" the original Milgram subjects lacked were simply a highly developed sense of personal responsibility.