
Volume 25 Number 5 October 2015

Contents

Special Issue: Unplugging the Milgram Machine
Guest edited by: Augustine Brannigan, Ian Nicholson and Frances Cherry

Articles
Introduction to the special issue: Unplugging the Milgram machine 551
Augustine Brannigan, Ian Nicholson and Frances Cherry
Coverage of recent criticisms of Milgram's obedience experiments in introductory social psychology textbooks 564
Richard A. Griggs and George I. Whitehead III
Milgram's shock experiments and the Nazi perpetrators: A contrarian perspective on the role of obedience pressures during the Holocaust 581
Allan Fenigstein
Designing obedience in the lab: Milgram's shock simulator and human factors engineering 599
Maya Oppenheimer
Seeing is believing: The role of the film Obedience in shaping perceptions of Milgram's Obedience to Authority experiments 622
Gina Perry
The normalization of torment: Producing and managing anguish in Milgram's "Obedience" laboratory 639
Ian Nicholson
Obedience in perspective: Psychology and the Holocaust 657
George R. Mastroianni
Acting otherwise: Resistance, agency, and subjectivities in Milgram's studies of obedience 670
Ethan Hoffman, N. Reed Myerberg and Jill G. Morawski

Essay Review
When subjects become objects: The lies behind the Milgram legend 690
Diana Baumrind

Review
Understanding the unthinkable 697
Augustine Brannigan, Beyond the Banality of Evil: Criminology and Genocide
Reviewed by Matthew P. Unger

Visit http://tap.sagepub.com. Free access to tables of contents and abstracts. Site-wide access to the full text for members of subscribing institutions.

Article

Theory & Psychology
2015, Vol. 25(5) 551–563
© The Author(s) 2015
Reprints and permissions: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/0959354315604408
tap.sagepub.com

Introduction to the special issue: Unplugging the Milgram machine

Augustine Brannigan University of Calgary

Ian Nicholson St. Thomas University

Frances Cherry Carleton University

Abstract
The current issue of Theory & Psychology is devoted to Stanley Milgram and his contribution to the study of obedience. It presents a decidedly critical evaluation of these well-known experiments, one that challenges their relevance to our understanding of events such as the Holocaust, and it builds on recent investigations of the Milgram archive at Yale. The discipline's adulation of the obedience research overlooks several critical factors: the palpable trauma experienced by many participants, the stark skepticism about the deceptive cover story voiced by many others, Milgram's misrepresentation of the way in which the prods were administered to ensure standardization, his failure to debrief the vast majority of participants, and the cherry-picking of findings. The project was whitewashed in the film Obedience, prepared by Milgram to popularize his conclusions. The articles contributed to this issue offer a more realistic assessment of Milgram's contribution to knowledge.

Keywords experiments as theatrical devices, internal and external validity of the OTA studies, the Milgram machine, obedience and the Holocaust

Corresponding author: Augustine Brannigan, Professor Emeritus of Sociology, University of Calgary, 2500 University Dr. NW, Calgary, Alberta, T2N 1N4, Canada. Email: [email protected]

Downloaded from tap.sagepub.com at UNIV CALGARY LIBRARY on December 1, 2015 552 Theory & Psychology 25(5)

On August 6–8, 2013, Nestar Russell from Nipissing University and Gina Perry from the University of Melbourne convened the "Obedience to Authority Conference: Milgram's Experiments 50 Years On." It was held at the Muskoka campus of Nipissing University to mark the 50th anniversary of the first publication of Stanley Milgram's obedience research. Over the course of 3 days, more than 30 presentations were given by over 50 speakers, and the program featured expert panel discussions on the ethics, science, politics, and history of Milgram's obedience to authority (OTA) research. The conference brought together an international group of attendees, most of whom were psychologists and sociologists by training. While there were no official proceedings of that conference, several recent retrospectives have probed the full range of celebratory to critical perspectives on Milgram and the OTA research program. These have included the American Psychologist (Burger, 2009), The Psychologist (Reicher & Haslam, 2011), "Stanley Milgram and the Ethics of Social Science Research" (Herrera, 2013), and the special issue, "Milgram at 50: Exploring the enduring relevance of psychology's most famous studies" (Reicher, Haslam, & Miller, 2014). What distinguishes the approach taken in this particular look backwards is that it carves out a space at the critical end of the epistemological spectrum. The papers we have included in this issue argue for a narrative that runs counter to the more generally accepted celebration of Milgram's OTA program. They invite readers to "unplug" from the usual story as told in traditional venues such as disciplinary textbooks and handbooks. In those sources, we see an uncritical celebration of Milgram's flawed but daring brilliance, and of the 1963 and later publications as a crucial experiment revealing unanticipated obedience by ordinary people, behaving in ways that approximate the more extreme conditions under which Germans committed genocide in World War II.
In this issue, contributors draw on a wider array of materials ranging from the Milgram papers at Yale to more recent historical scholarship on the Holocaust. These authors bring together multiple elements of the Milgram Machine in a way that creates a challenge to the standard narrative. For example, participants are heard in their own words challenging the ethics and the validity of the research. A study found in the Milgram archives, suggesting the power of "relationship" to moderate obedience, was left unpublished. The Obedience film and the shock machine both become part of a rhetorical strategy to convince readers and participants alike of the "truth" of the 65%. Add to this mix the research undertaken by historians of World War II and the Holocaust, and the conclusion is that the excessively obedient are anything but banal. In textbooks, particularly in social psychology, there is rarely mention of longstanding challenges to the standard narrative; on the contrary, it continues to thrive in the context of an uncritical acceptance of the merit of high-impact theatrical experiments and situationist explanations in advancing our scientific understanding of obedience to authority. Taking the articles in this collection as a whole, it is our view that Milgram's persuasiveness in shaping how we should understand obedience to authority has been dramatically diminished.

Into the archives: Fresh perspectives on Obedience to Authority

The obedience research has been contentious from the outset. While the intensity of the debate has fluctuated over the years, the terms of the discussion have been drawn almost

entirely from Milgram's own published accounts—his three papers published shortly after the study's completion (Milgram, 1963, 1964a, 1965b) and, more notably, his 1974 book Obedience to Authority. For many psychologists, 50 years of discussion has given these accounts a kind of "Biblical" status, as if Milgram's publications constitute the definitive version of what "really" happened. For example, responding thoughtfully to some of the newer scholarship on the Obedience studies, Carraher (2014) encouraged readers to "carefully examine the original publications by Milgram … [because] readers have much to gain from looking very closely at what actually happened [emphasis added]" and concluded that "it is critical that people form their own judgements after they have examined the evidence for themselves." Although Milgram's published accounts remain an invaluable resource to anyone interested in the Obedience studies, we now have a much wider and richer range of materials to consider. The most significant new source is the Milgram Archive at Yale University. This archive contains Milgram's unpublished papers, correspondence, transcripts, audio recordings, and notebooks related to the Obedience research. Beginning with Blass (2004), a number of researchers have scoured this archive and in many cases have raised new questions about the ethics, reliability, and validity of the Obedience research (Gibson, 2013; Nicholson, 2011a; Perry, 2013; Russell, 2010, 2014). Interpretations among these scholars vary, but at the most general level all agree that Milgram's published accounts are not the definitive or final version of what transpired over the course of the Obedience studies. The published accounts represent a partial and in some cases idealized version of what transpired.
Thus, if we want to get at "what actually happened" in Milgram's work, we must go beyond the received narrative as established by him and consider the fascinating and often troubling inconsistencies, misrepresentations, private thoughts, and participant reactions to be found in the various archival documents. Several key themes have emerged over the past decade, and they are reflected in the papers in this special issue.

Standardization

Much of the compelling nature of Milgram's work derives from its status as a scientific experiment. In his book Obedience to Authority, Milgram contrasted his work with philosophy, positioning himself as an "empirically grounded scientist" (1974, p. 2) whose role "is to vary systematically the factors believed to alter the degree of obedience" (1974, p. 26). Milgram was a master at deploying the rhetoric of science. However, recent historical scholarship has revealed that he did not live up to his own scientific ideals. Gibson (2013), Perry (2013), and Russell (2010) have highlighted the wide variation that existed between Milgram's account of the experimental procedure and the archival record. For example, in Obedience to Authority, Milgram provided readers with a list of standardized "prods" that were to be used in situations where participants questioned the experimenter about what to do. Milgram claimed that the prods were always "made in sequence" and in a tone that was "firm but not impolite" (1974, p. 21). Reassuring as this account is to the reader, Gibson (2013) revealed that what actually transpired in the laboratory was something altogether different. Carefully analyzing the audiotapes of the Milgram experiments, Gibson noted considerable variation from the "official" published protocol of prods from trial to trial.


The main point of this research is not to argue that Milgram was a poor scientist. According to Gibson, "the key value of these archival materials is to reorient our understanding of what, precisely, we understand as being 'the data' from these studies" (2013, p. 192). When we widen our scope for what counts as "data" to include qualitative feedback from participants, Milgram's private notes, and the actions of the researchers themselves, the limitations of the "standard" textbook account of the Obedience studies quickly become evident.

The extent of harm and debriefing

From its inception, critics of the Obedience research have focused on the treatment of participants (Baumrind, 1964). Milgram admitted to inducing "extreme" stress in his unsuspecting participants, but he insisted that he had debriefed all of his participants, that the procedure was without risk, and that the extreme stress reactions were only "momentary excitement" (Milgram, 1964b, p. 849). "At no point," Milgram (1964b) stated, "were subjects exposed to danger and at no point did they run the risk of injurious effects resulting from participation" (p. 849). To buttress his claim, Milgram cited a follow-up survey indicating that 84% of participants said that they were glad to have participated. This defense satisfied many in psychology, to the point that it is now commonplace for researchers either to make light of the ethical issues involved in the obedience research or, in some cases, not to refer to them at all (see Nicholson, 2011b). However, close scrutiny of the archives has revealed several troubling facets of the studies which cast doubt on (a) Milgram's claims regarding debriefing, (b) the risk posed to participants, and (c) the harm inflicted. One of the most disturbing revelations, as noted by Nicholson (2011b) and Perry (2013), is the fact that Milgram did not debrief all of his participants as he had originally claimed. Since the study was based in a small town and since its efficacy depended on subterfuge, Milgram felt that he needed to maintain the "illusion of harm" as long as possible. While this may sound like a fair trade-off in the abstract, it takes on a rather different flavor when viewed in the context of actual lived experience. As documented by Nicholson (2011b) and Perry (2013), Milgram's records contain detailed accounts of participants' post-experimental reactions—some of which occurred while the study was still ongoing.
Not only do these accounts refute Milgram's insistence that all participants had been debriefed, they also undermine his claim that the experiment was a harmless undertaking that did nothing more than generate "momentary excitement" (Milgram, 1964b, p. 849). The emotional intensity and upset are clearly evident in many of these reports, as participants recalled the anxiety and distress they felt over an extended period at having to live with the knowledge that they may have killed someone. One participant reported that he lost his job after the experiment due to an emotional outburst during a discussion about the experiment with a fellow employee who had also participated in the study. Another reported that he had suffered a mild heart attack shortly after the study, implying that the extreme stress of the study was at least partially responsible (Nicholson, 2011b; Perry, 2013). It is impossible, using archival materials, to specify the precise extent of the harm inflicted on participants by the application of Milgram's enhanced stress techniques. However, it is apparent from reviewing the participant feedback that Milgram's

dismissive characterization of the study's effects as "momentary excitement" with no risk was inaccurate and dishonest. The study caused considerable psychological trauma among many participants, distress that in some cases endured for several weeks, if not longer (Perry, 2013). Undertaken without medical pre-screening, the study carried considerable risk for the unsuspecting participants, issues of which Milgram was informed while the study was ongoing (Nicholson, 2011a).

Theatrics and high impact social psychology

One of the more unsettling discoveries in the archives is the fact that privately (before he became famous) Milgram held reservations about his research that were similar to those of his most vociferous critics. As a graduate student he objected to the use of deception in psychological research in the strongest possible terms: "my own view is that deception for any purposes [in psychology] is unethical and tends to weaken the fabric of confidence so desirable in human relations" (Milgram, 1959). Milgram eventually reconciled himself to the apparent necessity of deceiving others, but he struggled to shake the sense that his brand of misdirection and trickery was more of a theatrical game than it was a science. Deceiving innocent people and inducing "violently convulsive seizures" (Milgram, 1963, p. 375) in them did make for an impressive show, but what did it mean? To his credit, Milgram was not overawed by the dramatic spectacle he had created. In his notebook, he questioned whether the Obedience research was "significant science" or merely "effective theater"—a convincing demonstration of psychology's capacity to trick unsuspecting people (cited in Nicholson, 2011b, p. 752). Without the need to uphold the study's celebrity, Milgram confided to his notebook that he was "inclined to accept" the view of the obedience research as a theatrical exercise rather than a scientific finding. It is important to note that Milgram had a high regard for theater as a source of insight, so such an admission did not mean that he considered the study valueless. Indeed, for Milgram the "take away" message could not be more dramatic and important. Theatrical though it may be, the study shows a "fatal flaw nature has designed into us": the "capacity for man to abandon his humanity, indeed the inevitability that he does so, as he merges his unique personality into larger institutional structures" (Milgram, 1974, p. 188).
Milgram found a large audience willing to read his theatrical production in precisely this manner. Eventually, he wrote that his manipulation of unsuspecting people revealed "difficult to get at truths" (Milgram, 1974, p. 198). However, the archival material, and more specifically the extensive participant feedback, does invite alternative readings. Instead of being an "objective" test of the flawed morality of autonomous individuals, as Milgram had claimed, participant testimony draws our attention to the role played by the context of the psychological experiment. Participants came into the experiment thinking they were participating in something benign and expecting to be treated in a manner that respected their dignity and well-being. They assumed that the scientists knew what they were doing, and if participants expressed doubts about the proceedings they were explicitly assured that all was well: "although the shocks may seem painful, there is no permanent tissue damage, so please go on" (Milgram, 1963, p. 374). Not unreasonably, many participants took the experimenter at his word and

continued with the procedure: "Giving the shocks did not upset me until the learner mentioned his heart, but I had faith in Yale that the doctor would stop the experiment if he thought it best" (Reaction of subjects, 1962). What is telling about this feedback is not simply that participants trusted the experimenter—this was something that Milgram readily admitted. The key point is that participants knew they were participating in a psychology experiment—a social space akin to a magician's stage, one that licenses all sorts of atypical behavior and unexpected occurrences, but which also brings with it the strong expectation that nobody will be harmed. Many participants stated that this awareness influenced their actions: "I have faith in the psychological experiments and suspected that the learner was not being hurt as badly as he pretended to be" (Errera, 1963, p. 10). Some participants doubted the veracity of the experiment from the outset, while others struggled with the ambiguity over what was occurring (a screaming participant versus a calm, reassuring authority figure).

Validity

What all of this testimony makes clear is that Milgram's research was a world away from the "real life" scenarios of unlawful killing that he claimed to be investigating. Nazi killers were not working in the context of the benign expectations associated with a "psychological experiment." German death squads did not have to guess what was real, nor did they have to struggle with the contradictory framework of an anguished victim and a calm authority figure reassuring them that everything was in order. For the members of the Nazi killing machine—the entity that Milgram was trying to explain—there was no such trickery; they knew exactly what they were doing, and many were glad to participate (Goldhagen, 1996). From the first published study of obedience to authority, Milgram (1963) had dramatically framed his study as an inquiry into the Holocaust. Following as it did the trial and execution of Adolph Eichmann in 1961 and 1962, Milgram's study garnered headlines around the world. Yet his framing was, at the time, and remains, a contentious point (see Nicholson, 2011a). His earliest critics questioned the study's validity, noting the considerable differences between the willful behavior of Nazi killers and the frequently reluctant actions of entrapped psychology participants who were given to believe that they were participating in something benign and who were explicitly told that they were not doing any harm (Baumrind, 1964). Milgram (1964b) took note of many of these criticisms and endeavored to qualify his claims with respect to the Holocaust, arguing that genocide was a "background metaphor" (p. 851). Despite these qualifications, however, Milgram continued to publicly defend the idea that the obedience study was a useful scientific "revelation of certain difficult to get at truths" (1974, p. 198).
Indeed, as his fame grew, Milgram spoke as if explaining the Holocaust was a relatively minor matter when set against the study's alleged ability to explain humanity's apparent susceptibility to destructive obedience: "To focus only on the Nazis, however despicable their deeds … is to miss the point entirely," he claimed. "For the studies are principally concerned with the ordinary and routine destruction carried out by everyday people following orders" (1974, p. 178). In recent years, the validity of Milgram's Holocaust claims has come under additional scrutiny, to the point that even Milgram enthusiasts such as Miller (2004)

have questioned the study's utility for understanding the Holocaust (see also Brannigan, 2013; Fenigstein, 2015).

The new histories of the Holocaust

In 1961, Stanley Milgram's conception of the circumstances surrounding the Holocaust was colored by the account that philosopher Hannah Arendt (1961) furnished in her reports of the trial of Adolph Eichmann in Jerusalem. Eichmann was the director of the Department of Jewish Affairs in the SS, and was responsible for the deportation of European Jews from their homelands to death camps in Poland created specifically for their extermination. Arendt depicted him as an uninspired desk-murderer, a functional mediocrity, who simply complied with his superiors' orders, and who was not a particularly avid anti-Semite. Recent historiography paints a rather different picture. Eichmann was not a person bullied into submission by bureaucracy or authority figures. He was an enthusiastic murderer who took extraordinary steps to hasten the demise of Europe's Jews in order to alter fundamentally the racial composition of Europe. Cesarani (2006) writes critically of the legacy created by Arendt and her notion of the "banality of evil":

Her book, Eichmann in Jerusalem, more than the trial itself shaped Eichmann's legacy. Anyone writing on the subject today works in the shadow of Hannah Arendt. Her notion of "the banality of evil," combined with Milgram's thesis on the predilection for obedience to authority, straightjacketed research into Nazi Germany and the persecution of the Jews for two decades. (p. 15)

In her re-analysis of The Eichmann Trial, Deborah Lipstadt (2011) also presented a new view of Eichmann. She points out that the Third Reich was somewhat of an amorphous organization where "subordinates often took the lead" (p. 64), including Eichmann. A significant fact for Lipstadt was that Eichmann organized the last massive transports of the Holocaust from Hungary in person. When Himmler ordered that the transport of the Hungarian Jews should be delayed until the war situation in the East was more favorable, Eichmann pressed on. He met with Jewish leaders, lied to them about resettlement of Hungarian Jews in German industries, and eventually conned them out of their wealth. He attempted to get the Hungarians to raise money and trucks for the Reich war effort. The scheme failed. Starting in April 1944, Eichmann organized 144 transports to Auschwitz with 440,000 Hungarian Jews in two months. This was at a point in the war when everyone knew that the Reich's days were numbered; senior military leaders attempted to assassinate Hitler in July. In the course of 67 audiotapes recorded in Argentina in the 1950s with Dutch SS officer Willem Sassen, Eichmann "bemoaned the fact that the regime had not killed more Jews and expressed great satisfaction about how smoothly the [Hungarian] deportation had run" (Lipstadt, 2011, p. 67). In a speech to his men in the spring of 1945, Eichmann estimated that the war had cost the lives of 5 million Jews, and that "he would jump into his grave fulfilled at having been part of this effort" (Lipstadt, 2011, p. 132). At his trial, Eichmann admitted that he followed orders, but did so "with a degree of fanaticism one expected of oneself as a National Socialist of long-standing" (Lipstadt, 2011, p. 137).


Klee, Dressen, and Riess (1996) have also assembled a collection of historical documents that reflects the Holocaust as seen by its perpetrators and bystanders: The Good Old Days. What comes across repeatedly in these documents is the ease with which ordinary soldiers joined in the killing of Jewish civilians, often volunteering enthusiastically without direct orders, and virtually never objecting on moral grounds. In the most recent biography of Eichmann, Stangneth (2014) focuses on Eichmann's life before Jerusalem. Eichmann joined a group of former Nazis at the Argentine home of Willem Sassen to plan publication of a book outlining their justification of the racial war against the Jews. Eichmann proposed writing a letter to West German Chancellor Konrad Adenauer to challenge the world's condemnation of the Holocaust, and to expunge German guilt for the crime. He contemplated returning to Germany to stand trial, which would have put his case before the public at a war crimes trial. He expected this to result in a light sentence, followed by his reinstatement as an honorable German of high status, as he had been during the Third Reich. Stangneth argues that in Jerusalem Eichmann played the role of an obedient cog in the machine, since his previous arrogance would not have contributed to a sympathetic defense, and that Arendt was entirely deceived by his act. When Milgram began to model obedient behavior in his lab with images of individuals uncannily passive in the face of demands for submission to an authority figure, he was reifying a model of behavior that was materially misleading. And when generations of psychology students learn about the Holocaust through the obedience experiments, they are being studiously misled by Eichmann's act in Jerusalem. The legendary narratives about the banality of evil have been surpassed by the new historiography (Wolin, 2014).

Introduction to the articles

Richard Griggs and George Whitehead (2015) review the major criticisms of Milgram's work that have appeared since psychologists gained access to the archival records of the experiments. The question they ask is how accurately and extensively such criticisms have been reflected in introductory textbook accounts in psychology and social psychology. Griggs is the author of a successful introductory psychology textbook (Griggs, 2014b), and has written extensively in Teaching of Psychology about how important historical contributions to the field have been covered in introductory textbooks (Griggs, 2014a, 2014c). A particular focus has been how debates about the validity of such works, or the controversies associated with them, have been presented to students. In this contribution, Griggs and Whitehead examine a number of key textbooks in both introductory psychology and introductory social psychology, and find a definite "Milgram-friendly trend" that is overwhelmingly silent on criticisms of ethical issues and challenges to validity and reliability. As Thomas Kuhn (1970) noted in The Structure of Scientific Revolutions, textbooks provide the direction "from which each new scientific generation learns to practice its trade." Kuhn's criticism that "we are misled by them in fundamental ways" (p. 1) would seem quite apt in terms of the textbook coverage of Milgram that Griggs and Whitehead elaborate on. This suggests that a more critical appreciation of this well-known study might be important for the growth of psychological knowledge.


Allan Fenigstein (2015) rejects the argument that common social dynamics were operating both in the Holocaust and in the obedience experiments at Yale. While accepting that the laboratory is not expected to reflect reality like a mirror, Fenigstein painstakingly raises a number of important substantive differences that distinguish the historical accounts of the activities of the German Order Police and the Einsatzgruppen on the one hand, and the obedience participants on the other. These include, for example, the cultural context of scientific research designed to improve humankind versus "the war against the Jews," actions performed that were defined as resulting in no permanent damage versus outright intent to annihilate a population, and the relationship to the authority figure and the assumption that the scientist was morally sensitive to the learner versus total indifference on the part of Nazi commanders. Even if Milgram had rejected these as "surface" differences, other factors were far more damaging, including the absence of evidence of an agentic state, the assumption of reluctance at Yale that had to be overcome by the pressure of authority, the enthusiasm for the Final Solution of the Jewish Question among the great number of perpetrators, and the tolerance of non-conformity among individual persons disinclined to participate in the Final Solution. Assumptions about the nature of the Holocaust circa 1961, when the Milgram experiments were being fine-tuned and implemented, have simply proved invalid.

One of the most original papers to be presented at the Bracebridge conference was Maya Oppenheimer's (2015) analysis of the shock appliance used by Milgram. Oppenheimer studies design history in the context of human factors engineering, a field that was developing in the 1950s to optimize efficient person–machine interaction.
She examines the overdevelopment of the controls and feedback signals on the Milgram device, and contrasts these with the feedback mechanisms developed by Chapanis in his work on advanced machines that were being introduced in military and industrial applications during this period (Chapanis, Garner, & Morgan, 1948). She argues that the design of the machine had more to do with Milgram's interest in dramaturgy, and contributed to the facilitation of the behavior that Milgram generalized as obedience. The machine had no apparent safety provisions, which would have been common in devices of a similar vintage and which would have signaled issues of user endangerment. This suggests that the alleged high levels of obedience were partially an artifact of the machine's design features.

Gina Perry is the author of one of the most provocative reassessments of the internal validity of the obedience experiments. She spent four years examining the archival materials of the obedience experiments at Yale. Behind the Shock Machine (published initially in Australia in 2012, and released in the United States in 2013) was an exposition of the difference between Milgram's published claims about the protocols he employed and the realities of actual lab practices as evidenced in the audiotapes, diary entries, and research notes stored at the Yale library. One of the most effective methods that Milgram used to communicate his ideas was his creation of the Obedience film (Milgram, 1965a), which has probably been shown to more students than any other audio-visual material in the history of psychology. The film was funded by the National Science Foundation (NSF) after it declined Milgram's request for a second round of major funding for new experiments. The film was ostensibly created to assist other researchers to replicate and extend Milgram's research. In fact, Perry (2015) argues in her paper that the film was

designed to rebut some of the earlier criticisms of the obedience research made by NSF evaluators. The film misrepresents the debriefing process and fails to communicate the long-term trauma experienced by many participants. It also fails to portray participants who were obviously skeptical about the cover story. It appears to address the ethical misgivings that the original studies attracted, and to establish the powerful ecological validity of the experiments, which argued persuasively for the subservience of persons to governments, including the Third Reich. Perry contends that the film was, in fact, a masterstroke.

Readers of Theory & Psychology will be aware of Ian Nicholson’s earlier work on Milgram published in this journal (“Torture at Yale,” 2011b) and his historical analysis of masculinity and the obedience paradigm, which appeared in ISIS (2011a). In the current paper, Nicholson (2015) moves the focus away from the experimental participants and highlights the social scientists who conducted the obedience experiments, including Milgram and his associates and assistants. Rather than drawing the parallel between the “teachers” at Yale and the Order Police perpetrators, Nicholson raises the parallel with the well-educated Nazi doctors who carried out selections for work as the transports arrived at the death camps, and who undertook barbarous medical experiments on the inmates in the service of National Socialism. Nicholson asks how the social scientists could have employed methods that in some cases reduced their participants to visceral distress. The protocols in the obedience experiments amount to what has euphemistically been labeled “enhanced stress techniques” in the recent cases of waterboarding and other methods employed by the CIA against suspected terrorists.
Like the serenity of the Nazi doctors and SS leaders, the behavior of Milgram and his co-investigators is relevant to the issue of how torture becomes “normalized.” While Milgram’s own contrition, regret, and misgivings about the stress he created are sprinkled throughout his private notebooks, Milgram claimed publicly that the majority of the participants actually benefited from the trauma to which they were subjected. Nicholson’s article significantly revises the narratives around the obedience experiments by focusing attention on the study’s principal investigator and his associates.

George Mastroianni (2015) is a psychologist who has worked at the US Air Force Academy and as a Research Psychologist in the US Army. He is an expert in leadership factors in the armed forces, and has recently criticized the attempt to fit the abuses at Abu Ghraib into a situationist framework (Mastroianni, 2013). Analysis of that case revealed that several of the worst offenders were individuals with problematic personal records, that they were poorly supervised, and that the prisoners abused at that location were not being interrogated as suspected terrorists. Likewise, Mastroianni disputes the situationist account of obedience to authority as a key to understanding the Holocaust. The persons who participated in the Holocaust were drawn from a wide swath of German society, including economists who planned the mass starvation of Slav populations in Eastern Europe and Russia, as well as ordinary members of the army acting without duress, in addition to the special killing squads. Rather than condensing the question of “why did it happen?” to the power of superiors over compliant subalterns, the problem should be re-stated as “how did it happen?” The answer lies in a complex mix of individual action and structural agenda-setting, belief systems and perceptions of outsiders, economic upheaval, ideology, and national memory. He proposes a multi-faceted account

that draws from the contributions of several psychological theorists. Certainly, many people were following orders, but the reduction of such complex and protracted historical developments to the singular explanation of individual obedience wrongly makes history and social action one-dimensional.

The final paper, by Hoffman, Myerberg, and Morawski (2015), returns to the archival accounts of the original obedience experiments to illustrate how participants resisted the course of “teaching” that has dominated the received view of the experiment. Participants did not always conclude that the shock appliance was causing harm. Hoffman et al. offer a sophisticated analysis of the situational dynamics of the participant–experimenter interaction based on evidence from the archives, the post-experimental interviews, and the participants’ comments on their experiences in questionnaires. Many participants were compliant precisely because they were skeptical about harm to the learners. Many invoked knowledge of professional standards at institutions such as Yale, which would not countenance life-threatening electrocution of participants. Others noticed the dog-eared check used to pay the learner. One wondered what all the glass was disguising, and why only he, as the teacher, was asked to fill out a release form. Taken together, this re-analysis raises serious doubts about the ability of the experimenter to draw any hard and fast conclusion regarding compliance when reporting solely the maximum shock administered. This criticism was evident from the very first evaluation of Milgram’s research by the NSF observers. Hoffman et al.
provide strong evidence that participants resisted the authority figure, not only in outright refusals to continue, but in “layered, multiple ways.”

The conclusions drawn by this set of papers fundamentally challenge the view of obedience reiterated uncritically in most leading textbook accounts. They open up new narratives on participant resistance, the consequences of the design of the shock appliance, the pedagogical significance of the Obedience film, the perplexing behavior of the scientists at Yale who conducted the experiments, and the complexity of explaining large-scale historical events like the Holocaust through laboratory research.

Declaration of Conflicting Interests
The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding
The author(s) received no financial support for the research and/or authorship of this article.

References
Arendt, H. (1961). Eichmann in Jerusalem: A report on the banality of evil. New York, NY: Norton.
Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram’s “Behavioral study of obedience.” American Psychologist, 19, 421–423.
Blass, T. (2004). The man who shocked the world: The life and legacy of Stanley Milgram. New York, NY: Basic Books.
Brannigan, A. (2013). Beyond the banality of evil: Criminology and genocide. Oxford, UK: Oxford University Press.
Burger, J. M. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64(1), 1–11.


Carraher, D. (2014, April 4). Re: Two views of Milgram’s “notorious” research [Comment on blog post]. Retrieved from http://psyccritiquesblog.apa.org/2014/04/two-views-of-milgrams-notorious-research.html
Cesarani, D. (2006). Becoming Eichmann: The life, times and trial of a “desk-murderer.” New York, NY: Da Capo Press.
Chapanis, A., Garner, W. R., & Morgan, C. T. (1948). Applied experimental psychology: Human factors in engineering design. London, UK: John Wiley & Sons.
Errera, P. (1963). Obedient subjects [Meeting conducted by Dr. Paul Errera]. Stanley Milgram Papers (Sanitized Data, Box 155A). Yale University Archives, New Haven, CT.
Fenigstein, A. (2015). Milgram’s shock experiments and the Nazi perpetrators: A contrarian perspective on the role of obedience pressures during the Holocaust. Theory & Psychology, 25, 581–598. doi:10.1177/0959354315601904
Gibson, S. (2013). Milgram’s obedience experiments: A rhetorical analysis. British Journal of Social Psychology, 52, 290–309.
Goldhagen, D. (1996). Hitler’s willing executioners. New York, NY: Knopf.
Griggs, R. A. (2014a). Coverage of the Stanford prison experiment in introductory psychology textbooks. Teaching of Psychology, 41(3), 195–203.
Griggs, R. A. (2014b). Psychology: A concise introduction (4th ed.). New York, NY: Worth Books.
Griggs, R. A. (2014c). The continuing saga of Little Albert in introductory psychology textbooks. Teaching of Psychology, 41(4), 309–317.
Griggs, R. A., & Whitehead, G. I., III. (2015). Coverage of recent criticisms of Milgram’s obedience experiments in introductory social psychology textbooks. Theory & Psychology, 25, 564–580. doi:10.1177/0959354315601231
Herrera, C. (2013). Stanley Milgram and the ethics of social science research. Theoretical and Applied Ethics, 2(2), vii–viii.
Hoffman, E., Myerberg, N. R., & Morawski, J. G. (2015). Acting otherwise: Resistance, agency, and subjectivities in Milgram’s studies of obedience. Theory & Psychology, 25, 670–689.
Klee, E., Dressen, W., & Riess, V. (Eds.).
(1996). “The good old days”: The Holocaust as seen by its perpetrators and bystanders (D. Burnstone, Trans.). Old Saybrook, CT: Konecky and Konecky.
Kuhn, T. S. (1970). The structure of scientific revolutions (2nd ed.). Chicago, IL: University of Chicago Press.
Lipstadt, D. (2011). The Eichmann trial. New York, NY: Schocken Press.
Mastroianni, G. R. (2013). Looking back: Understanding Abu Ghraib. Parameters, 43(2), 53–65.
Mastroianni, G. R. (2015). Obedience in perspective: Psychology and the Holocaust. Theory & Psychology, 25, 657–669.
Milgram, S. (1959). Note to self. Stanley Milgram Papers (Series I, Box 23, Folder 383). Yale University Archives, New Haven, CT.
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal & Social Psychology, 67, 371–378.
Milgram, S. (1964a). Group pressure and action against a person. Journal of Abnormal & Social Psychology, 69, 137–143.
Milgram, S. (1964b). Issues in the study of obedience: A reply to Baumrind. American Psychologist, 19, 848–852.
Milgram, S. (Producer & Director). (1965a). Obedience [DVD]. United States: Penn State University Audio-visual.
Milgram, S. (1965b). Some conditions of obedience and disobedience to authority. Human Relations, 18, 57–76.
Milgram, S. (1974). Obedience to authority: An experimental view. London, UK: Tavistock.
Miller, A. (2004). What can the Milgram obedience experiments tell us about the Holocaust? In A. Miller (Ed.), The social psychology of good and evil (pp. 193–239). New York, NY: Guilford.


Nicholson, I. (2011a). “Shocking” masculinity: Stanley Milgram, “obedience to authority,” and the crisis of manhood in Cold War America. ISIS, 102, 238–268.
Nicholson, I. (2011b). “Torture at Yale”: Experimental subjects, laboratory torment, and the “rehabilitation” of Milgram’s “obedience to authority.” Theory & Psychology, 21, 737–761. doi:10.1177/0959354311420199
Nicholson, I. (2015). The normalization of torment: Producing and managing anguish in Milgram’s “obedience” laboratory. Theory & Psychology, 25, 639–656. doi:10.1177/0959354315605393
Oppenheimer, M. (2015). Designing obedience in the lab: Milgram’s shock simulator and human factors engineering. Theory & Psychology, 25, 599–621. doi:10.1177/0959354315605392
Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. New York, NY: The New Press.
Perry, G. (2015). Seeing is believing: The role of the film Obedience in shaping perceptions of Milgram’s obedience to authority experiments. Theory & Psychology, 25, 622–638. doi:10.1177/0959354315604235
Reaction of subjects. (1962). Stanley Milgram Papers (Series II, Box 44). Yale University Archives, New Haven, CT.
Reicher, S., & Haslam, A. (2011). The shock of the old. The Psychologist, 24, 650–652.
Reicher, S., Haslam, A., & Miller, A. (Eds.). (2014). Milgram at 50: Exploring the enduring relevance of psychology’s most famous studies [Special issue]. Journal of Social Issues, 70(3), 393–602.
Russell, N. (2010). Milgram’s obedience to authority experiments: Origins and early evolution. British Journal of Social Psychology, 49, 1–23.
Russell, N. (2014). Stanley Milgram’s obedience to authority “relationship condition”: Some methodological and theoretical implications. Social Sciences, 3, 194–214.
Stangneth, B. (2014). Eichmann before Jerusalem: The unexamined life of a mass murderer. New York, NY: Knopf.
Wolin, R. (2014, Fall).
The banality of evil: The demise of a legend [Review of the book Eichmann before Jerusalem: The unexamined life of a mass murderer, by B. Stangneth]. Jewish Review of Books. Retrieved from https://jewishreviewofbooks.com/articles/1106/the-banality-of-evil-the-demise-of-a-legend/

Author biographies
Augustine Brannigan is Professor Emeritus of Sociology at the University of Calgary, Alberta, Canada. His most recent book is Beyond the Banality of Evil: Criminology and Genocide (Oxford University Press, 2013). His earlier misgivings about the validity of methods in social psychology were outlined in The Rise and Fall of Social Psychology: The Use and Misuse of the Experimental Method (Aldine Transaction, 2004). His initial analysis of the link between scientific knowledge and social change is outlined in The Social Basis of Scientific Discoveries (Cambridge University Press, 1981). Email: [email protected]

Ian Nicholson is Professor of Psychology at St. Thomas University in Fredericton, New Brunswick, Canada. He is the editor of the Journal of the History of the Behavioral Sciences and the author of numerous papers on the history of psychology. Email: [email protected]

Frances Cherry is Professor Emerita of Psychology at Carleton University in Ottawa, Ontario, Canada. She has advocated for the utilization of psychology to advance issues of social justice. Her published work has focused on classic studies in social psychology and the extra-scientific factors that sustain their definition of the field (The Stubborn Particulars of Social Psychology, Routledge, 1995; “Social Psychology and Social Change,” in Fox, Prilleltensky, & Austin’s Critical Psychology: An Introduction, Sage, 2009). Email: [email protected]


Article

Theory & Psychology 2015, Vol. 25(5) 564–580
© The Author(s) 2015
Reprints and permissions: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/0959354315601231
tap.sagepub.com

Coverage of recent criticisms of Milgram’s obedience experiments in introductory social psychology textbooks

Richard A. Griggs University of Florida

George I. Whitehead III Salisbury University

Abstract
This article has two purposes: (a) to broaden awareness of recent criticisms of Milgram’s obedience experiments by providing a relatively inclusive review of them interlaced within a discussion of Gina Perry’s main substantive criticisms and (b) to report the findings of our coverage analysis for recent criticisms in current introductory social psychology textbooks. Past coverage analyses have found a “Milgram-friendly” trend (little or no discussion or even acknowledgment of the large body of criticism published from 1964 onward) that evolved in textbooks from the 1960s to the 1990s and has become more pronounced since that time period. Our findings on coverage of recent criticisms were consistent with those of past text analyses. None of the recent criticisms were covered, even in the social psychology textbooks dated 2015. We discuss a possible explanation for these findings that involves a proposed knowledge-conserving function of social psychology textbooks.

Keywords
Milgram obedience experiments, social psychology textbooks, textbook analysis

Arguably, the most famous (or infamous) set of experiments in psychology is Milgram’s obedience experiments (Milgram, 1963, 1964, 1965a, 1965b, 1974). The 50th anniversary of Milgram’s first major publication about the obedience experiments (Milgram, 1963) recently occurred in 2013. Surprisingly, according to Web of Science, the annual

Corresponding author: Richard A. Griggs, 4515 Breakwater Row West, Jacksonville, FL 32225, USA. Email: [email protected]

rate of citation of Milgram’s 1963 article has risen significantly in recent years, especially from 2007 through 2012 (Reicher, Haslam, & Miller, 2014; see Figure 1, p. 395). This finding supports Gibson’s (2013b) assertion with respect to the obedience experiments that the “debate surrounding the ethical, theoretical, and empirical issues they raise shows no signs of abating” (p. 177). Much of this recent interest in the Milgram experiments is concerned with presenting new criticisms of the obedience experiments, such as Milgram’s misrepresentation of the debriefing process used in the experiments (e.g., Nicholson, 2011), but even some older criticisms, such as the unethical nature of Milgram’s experimental paradigm, have been renewed (e.g., Baumrind, 2013). Most of these new criticisms of Milgram’s experiments are based on analyses of the Stanley Milgram Papers in the Manuscripts and Archives section of Yale’s Sterling Memorial Library. We will discuss these recent criticisms, focusing on those proffered by Gina Perry (2013a) because they are the most extensive.1 Discussion of some other recent, related critiques of the obedience experiments will be interwoven within the summary of Perry’s main critical points. A discussion of how current introductory social psychology textbooks have dealt with these recent criticisms will follow our examination of the criticisms themselves.

Perry’s main criticisms
Over the course of four years, through her review of the Yale archival materials, including 140 audio recordings of the original experiments, scores of participant debriefing conversations with a psychiatrist, and the documentation, notes, and correspondence accumulated during the study, and through her personal interviews with former participants, experts familiar with the research, and relatives of the men who served as the experimenter and learner in the experiments, Perry (2013a) found serious methodological and ethical problems with Milgram’s experiments. In an excellent summary of Perry’s findings, Brannigan (2013) describes Perry’s conclusions as “disturbing” and thinks that they will “fundamentally challenge the way scholars interpret Milgram and his experiments” (p. 624).

The experimenter’s improvisational, extended prodding
Perry (2013a) found that Milgram’s depiction of the experimental procedure with respect to the experimenter’s use of prods to coax participants to continue is at odds with the experimenter’s actual behavior in using the prods and following the script for their use. Supposedly, the experimenter (confederate John Williams) used a series of four prods to encourage the participant to continue when a participant protested or expressed doubt about continuing.2 The sequence of prods was Prod 1: “Please continue,” or “Please go on”; Prod 2: “The experiment requires that you continue”; Prod 3: “It is absolutely essential that you continue”; and Prod 4: “You have no other choice, you must go on” (Milgram, 1974, p. 21).3 The experimenter was to begin this sequence anew “on each occasion that the teacher balked or showed reluctance to follow orders” (p. 21). However, if after the fourth prod the participant refused to continue, the participant would be classified as disobedient and the experiment terminated. But the archival audiotapes of the experimental sessions and the experimenter’s notes about the sessions

revealed a very different story. The experimenter didn’t always follow the controlled script for using the prods. He would parry participants’ protests, escalating the pressure by inventing more coercive prods. The experimenter’s behavior led Perry (2013c) to conclude that “The slavish obedience to authority we have come to associate with Milgram’s experiments begins to sound much more like bullying and coercion when you listen to this material” (p. 223). How much parrying and coercing occurred varied across both participants and experimental conditions (see Perry, 2013a, pp. 115–117, for a more complete description). Russell (2009) describes the experimenter’s improvisational prodding in the following manner: “Williams frequently displayed great feats of bottom-up innovation in the invention of progressively more coercive (stressful?) prods in trying to bring about what he sensed his boss desired” (p. 182). Consistent with Russell’s comments, Darley (1995), based on the transcribed experimental excerpts provided in Milgram (1974), concluded that “the experimenter’s answers to the teacher’s queries reveal that the experimenter had defined his role as doing whatever was necessary to get the teacher to continue giving the shocks” (pp. 130–132). Perry (2013a) reports a good example of this probable experimenter bias. The experimenter’s variance from the script was very prominent in Condition 20, in which women served as participants.4 For example, one woman was prompted 26 times, and other women 14 times, 11 times, 9 times, and so on. Thus, it appears that the 65% obedience rate reported for this condition was at least partially due to a great deal of extra parrying and prodding by the experimenter.
In sum, the experimenter’s behavior with respect to prompting the participants was clearly not standardized, and the experimenter’s deviations from the script may have been driven by experimenter bias. Perry (2013a) also notes that Milgram appears to have tacitly allowed the experimenter the license to improvise, because he watched a number of the experimental conditions through a one-way mirror.

Gibson (2013b) discusses not only the experimenter’s improvisational behavior but also a “forgotten prod” (probably more accurately described as an experimenter tactic for getting the participant to continue in the experiment)—the experimenter leaves the room to check on the learner to make sure that he is okay. Gibson found evidence in the archival audiotapes not only that there was great flexibility in how the experimenter employed the four prods, but also that the experimenter in the voice-feedback condition (Condition 2) sometimes complied with participants’ demands for him to check on the learner once the learner had fallen silent. When he returned after supposedly speaking to the learner, he would report that the learner was okay and willing to continue. However, this “leaving the room” tactic was not employed in a standardized manner across participants, and it appears to have been abandoned in subsequent experimental conditions. Its use likely led participants to infer that the learner’s predicament was not as serious as it seemed. It is important to note that Milgram failed to include any description of this experimenter tactic, in this or any other condition, in any of his publications on the obedience experiments. This is likely because Milgram was still refining his experimental procedure during the proximity series of four experimental conditions, of which the voice-feedback condition was a part (Russell, 2009).
Another interesting recent development with respect to the four prods concerns which of the prods actually constitute an order and, of those that do, how Milgram’s participants responded to them versus the prods that do not

constitute orders. As Burger (2009) points out, only Prod 4, “You have no other choice, you must go on,” clearly constitutes an order (see also Gibson, 2013a). In his partial Milgram replication, Burger found that the fourth prod was the least successful in getting his participants to continue (Burger, Girgis, & Manning, 2011). In fact, this prod did not elicit any obedience: not a single participant continued after receiving Prod 4. Burger et al. concluded that whatever Milgram was studying, it was not obedience to orders, and “that the way the research is portrayed to students, scholars and the public may need to be reassessed” (p. 465).

Burger et al.’s finding agrees with what Gibson (2013a) found in the Milgram materials in the Yale archives. Gibson’s rhetorical analysis of the recorded interactions between the experimenter and the participants revealed that the experimenter’s most order-like interventions were overwhelmingly resisted by participants. Gibson (2013a) concluded that his analysis “points to the intriguing possibility that the studies ultimately may have little to do with obedience as conventionally understood” (p. 303). Thus, rather than showing that participants in the Milgram experiments were obeying orders from those in authority, the Milgram experiments seem to provide evidence of the opposite: that orders from an authority lead to disobedience. Based upon these findings and some of their own, Alex Haslam, Stephen Reicher, and their colleagues have proposed an intriguing interpretation of Milgram’s experiments as explorations of the power of social identity-based leadership to induce active and committed followership, not obedience (e.g., Haslam, Reicher, Millard, & McDonald, 2015; Reicher, Haslam, & Smith, 2012).
Briefly, this “engaged followership” is predicated upon the teachers’ acceptance of the experimenter’s scientific goals and the leadership he exhibits in pursuing them. Also in support of their explanation, Haslam and Reicher provide empirical evidence that, in an experimental analogue of the Milgram paradigm, participants are motivated not by orders but by appeals to science (Haslam, Reicher, & Birney, 2014).

Milgram’s deceptive dehoaxing
Perry (2013a, 2013b) also discovered that the majority of participants were not appropriately debriefed (“dehoaxed,” in Milgram’s terminology) in a timely manner, as Milgram led us to believe. The debriefing that Milgram (1963, 1965b) described simply did not happen. For most participants, the immediate debriefing did not tell them that the learner was not really shocked. According to Perry (2013a), the comments that the participants wrote on their questionnaires show that “far from being a systematic and detailed process, debriefing varied across time, and in most cases was not a debriefing in the sense that I had understood it at all” (p. 79). In addition, the transcripts of the follow-up group interviews that a psychiatrist conducted with a subset of the participants nine months after the experiments had ended also clearly demonstrate that the participants were not properly debriefed, and that most had not received an explanation that the victim was unharmed before leaving the lab. The participants in Conditions 1–18, about three-fourths of the participants (roughly 600), left the lab believing that they had shocked a man, and they were certainly not debriefed in the manner that Milgram had claimed. Most participants were not told the full story until they received the study report and a questionnaire that Milgram sent to them almost a year later, in July 1962. These findings

Downloaded from tap.sagepub.com at UNIV CALGARY LIBRARY on December 1, 2015 568 Theory & Psychology 25(5) of Milgram’s seriously inadequate debriefing of participants are particularly relevant to Baumrind’s (1964) wondering “what sort of procedures could dissipate the type of emo- tional experience” (p. 422) that many of Milgram’s participants may have experienced. Obviously, Milgram’s inadequate procedure did not accomplish this. Perry (2013a) is not the only researcher leveling these criticisms about the inadequate debriefing that participants received and the harm done to many of them. For example, in his 2011 article, “Torture at Yale,” Ian Nicholson also uses the Milgram archival mate- rials to provide a forceful argument that Milgram misrepresented the extent and efficacy of his debriefing procedures, the risk posed by his experiments, and the harm done to his participants. Based on his examination of participants’ feedback about Milgram not informing them about the true nature of the study immediately afterward (Reaction of subjects, 1962), Nicholson was led to assert that Milgram “deliberately misrepresented his post-experimental procedures in his published work” (2011, p. 744), probably to protect his credibility as a responsible researcher and the ethical integrity and possible future of the obedience study.

The unreported relationship condition
Whereas Milgram (1974) reported 18 experimental conditions in his obedience study, Perry (2013a) reports that, according to her analysis of the archival experimental data, there were actually 24 (see Perry’s Appendix: List of Conditions), though one of these (Condition 21, Expert Judgment) was not an actual experimental condition but rather a solicitation of estimates, from psychiatrists and laypeople, of the level of obedience that would be observed in the baseline voice-feedback condition (Condition 2).5 Of the unreported conditions (see Note 5), Perry found Condition 24, a second experimental condition conducted in Bridgeport, CT, to be the most interesting, and she devotes a chapter to it. Russell (2014) refers to this condition as “arguably the most controversial variation” (p. 194) of Milgram’s obedience experiments. It was not reported by Milgram even though he said that he would do so in “Some Conditions of Obedience and Disobedience to Authority” (1965b, p. 71). This condition is usually referred to as the Relationship condition because the pairs of participants in it (one serving as teacher and the other as learner) were related in some way, but it has also been referred to as the “Bring a Friend” condition (Rochat & Blass, 2014) because participants were asked to bring a “friend”—someone that they knew well, such as a close acquaintance, a neighbor, or a relative—to also participate in the experiment.

Twenty pairs of male participants who were relatives, friends, or neighbors served as teacher and learner. Only three of the 20 pairs were members of the same family—a father and son, an uncle and nephew, and two brothers-in-law. After the learner was strapped in and the teacher and experimenter left the room, Milgram privately explained the experimental ruse to the learner and coached him on how to vocalize, in response to the supposed shocks, as the confederate learner had done in other experimental conditions.
Thus, this condition was the same as the reported experiment in Bridgeport (which Milgram called the Office Building, Bridgeport condition), except that the participants were related, with one serving as teacher and the other as learner. Rochat and Blass (2014, p. 457) point out one other important difference between these two Bridgeport

conditions. In the reported condition with unrelated participants, the learner’s protests are aimed at the experimenter. In the unreported one with related participants, the learner’s protests are aimed at the teacher. An 85% rate of disobedience, one of the highest rates in all of the experiments, was observed. Perry (2013a) discusses two possible reasons for Milgram not publishing the findings for this condition: (a) they did not suit his purpose, in that they comprise a strong demonstration of disobedience (which Milgram himself admitted in an archival note; see Perry, 2013a, p. 177), thereby challenging the study’s overall emphasis on obedience, and (b) the experiment itself would be difficult to defend ethically because the teacher was asked to inflict pain on a friend or relative, especially given the ethical criticisms (e.g., Baumrind, 1964) that had already been leveled against the experimental conditions that he initially reported (Milgram, 1963).

Perry (2013a), however, was not the first researcher to discover this particular unreported condition. Russell (2014) pointed out that Rochat and Modigliani (1997) had described this unreported condition in the Milgram archives more than a decade earlier. Russell also gives a very insightful discussion of why Milgram may have decided against publishing this experiment, concluding that reporting it “would have predictably stimulated an ethical firestorm” (2014, p. 200).

Skeptical participants

Perry (2013a) found evidence in the archives that a significant number of participants had expressed doubts about the experimental set-up and cover story in their responses to Milgram’s questionnaire. For example, one participant wrote that he found it difficult to believe that Yale would allow a participant (the learner) to absorb such punishment, that the description on the control board of the shock generator was far-fetched, that the learner’s poor answers were not believable, and that the experiment was rigged and the learner not hurt in any way. Another participant reported that he had lowered the shock level but noticed that the learner anomalously expressed increased pain. For more detail and examples, see Perry (2013a, pp. 133–138) and Parker (2000, p. 717).

Of more importance to this particular criticism, Perry (2013a) questions Milgram’s claim that more than 75% of his participants believed that the learner was receiving painful shocks (pp. 139–141). In reviewing Milgram’s Table 7 (1974, p. 172), in which he summarized his participants’ responses to his question about their belief that the learner was being shocked, Perry points out that it is more truthful to say that only about half of the participants fully believed that it was real, and of those, two-thirds disobeyed the experimenter (2013a, p. 139). Milgram’s questionable numerical conclusions stem from his inclusion of the 24% of the participants who had expressed some doubt about whether the learner was getting shocked. This criticism of how Milgram chose to report the questionnaire belief data has been discussed before (e.g., Parker, 2000; Patten, 1977). The argument that many of Milgram’s participants did not believe that they were really administering shock to another person has also been posited by many Milgram critics going all the way back to Orne and Holland (1968; but see Milgram’s rejoinder, 1972).
Orne and Holland report some data that Orne and Evans (1965) had collected which indicated that the vast majority of participants (84%) would comply with an experimenter’s instructions to perform dangerous tasks, such as retrieving a coin from what appeared to

be nitric acid, if they thought that they were participating in an experiment, because they assumed that things were not at all as they appeared to be. However, participants who were not told they were participating in an experiment declined to perform these tasks. Consistent with these data, Laurent (1987) points out a seldom-cited, congruent finding in a Milgram replication at the Max Planck Institute in Germany: “the subjects … seem to have felt that the Max Planck Institute would not let anything dreadful happen” (Mantell & Panzarella, 1976, p. 244). Laurent further points out that, in his opinion, Milgram was too concerned with participants’ behavior and not enough with their perception of the situation. Participants in an experiment concern themselves with being good subjects and performing in a manner that they perceive is expected of them. As Mixon (1972) concluded in discussing the use of deception in Milgram’s experiments, “it is often difficult to determine who is deceiving whom” (p. 145).

In further support of her skeptical-participants criticism, Perry (2013a) discovered an analysis that Milgram had his research assistant, Taketo Murata, compile but then chose not to publish.6 The analysis was a condition-by-condition breakdown of the degree of shock given by participants who were certain that the learner was being shocked versus that given by participants who had doubts about this. The analysis revealed that many of those who administered the maximum shock did not think that they were truly shocking the learner. In 18 of the 23 experimental conditions, the participants who fully believed that the learner was being shocked gave lower levels of shock than the participants who expressed doubts about the learner being shocked.
Of most importance, Murata found that in all 23 conditions, the participants most likely to disobey were those who said they believed the learner was being shocked.7
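The belief–defiance relationship discussed above is quantified in Note 7 as an odds ratio. As a minimal sketch of that calculation: the counts below are purely hypothetical placeholders (Milgram’s actual Table 7 cell counts are not reproduced in this article); only the formula itself, the odds of defiance among believers divided by the odds among doubters, follows the text.

```python
def odds_ratio(defiant_believers, obedient_believers,
               defiant_doubters, obedient_doubters):
    """Odds ratio of defiance for believers versus doubters.

    Each argument is a participant count from a 2x2 table crossing
    belief in the shocks (real/probably real vs. not real/probably
    not real) with outcome (defiant vs. obedient).
    """
    odds_believers = defiant_believers / obedient_believers
    odds_doubters = defiant_doubters / obedient_doubters
    return odds_believers / odds_doubters

# Hypothetical illustration (NOT Milgram's actual data):
# 50 of 100 believers defied; 25 of 100 doubters defied.
print(round(odds_ratio(50, 50, 25, 75), 2))  # prints 3.0
```

An odds ratio greater than 1 means that belief in the reality of the shocks is associated with higher odds of defiance; on the actual Table 7 data, Brannigan obtained 2.57 (see Note 7).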

Coverage of recent Milgram criticisms in introductory social psychology textbooks

We wondered how current social psychology textbooks have dealt with the spate of recent criticisms of Milgram’s obedience experiments. Thus, we decided to examine the discussions of Milgram’s obedience experiments in these textbooks to answer this question: Do current textbook discussions reflect any of the recent criticism of Milgram’s experiments? The results of similar studies on the coverage of criticisms of another famous study in introductory psychology and introductory social psychology textbooks, the Stanford prison experiment (Zimbardo, 2007), would predict that the criticisms would be given minimal, if any, coverage, as this is what was observed for the prison study (Griggs, 2014; Griggs & Whitehead, 2014). Such a prediction also agrees with Nicholson’s (2011) findings for a sample of three introductory social psychology textbooks with copyright dates from 2006 to 2011. Nicholson found that, counter to Stam, Lubek, and Radtke’s (1998) findings for social psychology textbooks from 1965 to 1995, the three texts that he examined provided little or no coverage of the ethical and epistemological controversies that have surrounded Milgram’s work for the past half century. He also found that the “Milgram-friendly” trend that Stam et al. had observed developing in the textbooks over the three decades that they studied had become even more pronounced, with the obedience experiments now being presented as comprising a classic study with little or no discussion or even acknowledgment of the voluminous body of

criticism that has been published in the last 50 years. In addition, Miller (1995; but see Stam et al., 1998, Note 14, p. 182) examined a set of 50 social psychology, introductory psychology, and sociology texts with copyright dates in the early 1990s and found that well over half of them made no reference to the many external validity criticisms that have been levied against the obedience experiments over the past 50 years (e.g., Baumrind, 1964; Lutsky, 1995; Mantell & Panzarella, 1976; Orne & Holland, 1968). In those texts that did mention such criticism, almost all took a pro-Milgram stance on this generalization issue. Miller’s finding agrees with Stam et al.’s observation that coverage of concerns over the realism and generalizability of the obedience experiments has declined or, if present, is dealt with summarily by textbook authors.

In sum, based on the findings of all of these prior textbook analysis studies, it is likely that current introductory social psychology textbooks do not cover the recent criticisms of Milgram’s experiments and, by failing to do so, are committing errors of omission and thus not covering Milgram’s work accurately. We next report our study of current introductory social psychology textbooks, conducted to determine the accuracy of this coverage, and its findings.

Our textbook study

Method. We used the most recent editions of 10 introductory social psychology textbooks as the text sample to ensure that we had the most up-to-date sample available. Copyright dates for these texts include three 2015s, two 2014s, four 2013s, and one 2012. We include complete reference information for all of these texts in the References section, with each reference denoted by an asterisk. These 10 texts essentially comprise the population of American introductory social psychology textbooks if briefer versions of two of these texts and Aronson’s briefer, more trade-like The Social Animal (2012) are excluded.

To determine all of the locations of coverage of Milgram’s obedience experiments within each text, the Name Index was checked for Milgram and the Subject Index was checked for obedience to authority experiments or any possible variants, such as obedience or obedience experiments.8 Once all of the locations were determined, the extent of the coverage was measured in terms of the number of pages devoted to it. The number of coverage pages for each text was rounded to the nearest whole number. The coverage of criticism in each text was determined by noting which recent critiques of the obedience experiments were cited; if any were cited, how much space was devoted to them; and how the critiques were treated (e.g., were they dismissed and Milgram’s view accepted?).

Coverage of recent criticisms. None of the 10 textbooks cited or discussed any of the recent (from 2011 onward) criticisms. This is not surprising given the recent finding of little or no coverage of any criticism of the obedience experiments in introductory social psychology textbooks (Nicholson, 2011). First, with respect to citing Perry’s (2013a) criticisms, realistically only the three texts with 2015 copyright dates could be expected to do so, but none of them did.9 However, Nicholson (2011) appeared early enough to have been included in all but one of the 10 texts, yet none of the texts cited or discussed his criticism. Most surprising was the fact that not one text mentioned either Burger et al.’s
(2011) finding that Milgram’s participants were probably not obeying orders, given their behavior with respect to Prod 4, or one of Haslam and Reicher’s early articles on their social identity-based followership explanation of Milgram’s findings (e.g., Reicher et al., 2012).

This lack of coverage of recent criticism is definitely not due to lack of space. The 10 textbooks devoted on average 7.4 pages (Mdn = 8 pages) to coverage of Milgram’s obedience experiments, with a range from 4 to 16 pages. Though not as extensive as in introductory social psychology texts, introductory psychology textbooks typically provide rather substantial coverage of Milgram’s obedience experiments in their social psychology chapters (Griggs, 2014). Thus, we were curious as to whether introductory psychology textbooks dated 2015 provided any coverage of the recent Milgram criticisms, so we decided to examine the social psychology chapters in four introductory psychology textbooks with 2015 copyright dates for coverage of such criticisms. Complete reference information for these four texts is included in the References section, with each reference denoted by an asterisk. Given the early publication (publication before the copyright year of a text; see Note 9) of one of these texts, only three of these texts would have had sufficient lead time to incorporate coverage of Perry’s (2013a) criticisms. Contrary to what we found in our examination of the three introductory social psychology textbooks dated 2015, two of the three introductory psychology textbooks did cite Perry (2013a) and include coverage of some of her criticisms.
One of these two texts cited Nicholson (2011) along with Perry for a statement that Milgram’s debriefing “was less extensive and his participants’ distress greater than what he had suggested.” In contrast, not only did none of the 10 introductory social psychology textbooks mention this criticism, but only one of these texts even described the debriefing process per se, and that description was inaccurate in that it claimed that all participants were “fully debriefed” once the experiment was over.

The other introductory psychology text that mentioned one of Perry’s (2013a) criticisms included three sentences on Perry’s discovery of Taketo Murata’s unpublished data, which indicated that Milgram’s participants were more likely to disobey if they believed that the learner was actually being shocked. In this coverage, the authors say “Across the 23 variations of Milgram’s experiment …” but do not mention that Milgram did not report all of these variations in his publications on the obedience experiments (see Note 5). This text also included two parenthetical Perry (2013a) citations: one with Perry alone for a statement questioning the ethics of Milgram’s experiments and the other among a group of citations for a statement asserting that there have been some recent replications and partial replications of Milgram’s experiments, including several by entertainment and news media.

The introductory psychology text dated 2015 that was published too early to include coverage of Perry (2013a), however, did include a paragraph on Burger et al.’s (2011) finding that the participants’ behavior did not display obedience as normally portrayed but rather indicated that a social identity process was operating. The authors also cited Reicher et al. (2012) and briefly described how participants were likely identifying with the experimenter rather than with the learner and acting in a way to demonstrate their commitment to the larger scientific process.
In contrast, only one of the 10 introductory social psychology textbooks cited Burger et al. (2011), but instead of discussing Burger et al.’s

questioning of the claim that Milgram was really studying obedience (following the orders of a person of authority), it described Burger et al.’s finding that participants who expressed concern for the well-being of the learner exhibited a greater reluctance to continue than those who did not express such concern.

Given the coverage of recent criticisms in the introductory psychology textbooks dated 2015 versus that in the introductory social psychology textbooks also dated 2015, it appears that introductory psychology authors are more up-to-date in their coverage of criticism of Milgram’s obedience experiments than introductory social psychology textbook authors, and that they achieve this within fewer pages of overall Milgram coverage (Mdn = 3.5 pages). This counterintuitive finding is consistent with some other related findings on coverage accuracy in introductory psychology versus social psychology textbooks. For example, Griggs (2015b) found that introductory psychology textbook authors seem to be doing a better job than introductory social psychology text authors in providing an accurate version of the Kitty Genovese story, and Griggs and Whitehead (2014) found that more coverage of the Stanford prison experiment criticisms is provided in introductory psychology textbooks than in introductory social psychology textbooks.

Discussion of the coverage findings. Our examination of current introductory social psychology textbooks revealed that they provide no coverage of recent criticisms of Milgram’s obedience experiments. We also found that current introductory psychology textbooks provided some coverage of recent Milgram criticisms, indicating that these texts are more up-to-date in their coverage of Milgram criticism than current introductory social psychology textbooks. To try to understand these findings, we discuss a proposal by Stam et al. (1998) that provides a tenable explanation for them. Stam et al. proposed that social psychology textbooks “serve a knowledge-conserving function for the discipline … there is a great deal of temporal consistency, a shared core of material and authors to be discussed, and the [presentation] of a homogenous, conservative perspective” (p. 156). More specifically, Stam et al. explain how, as part of this knowledge-conserving function, the “standard” view of the obedience experiments in both social psychology textbooks and the broader literature has developed. According to Stam et al.:

The obedience research is no longer a case study of the importance of obedience to authority but an important promoter of the importance and necessity of experimental social psychological research. The visibility of the research has become a token: by its critics, a token of the vulnerability of the discipline; by proponents, a token of its strengths. Within the discipline, Milgram is valorized for his contributions but the recurring appearance of discussions of methodology and ethics indicate that in order to valorize Milgram’s studies social psychologists must continually engage in damage control. It is this combined valorization/defensiveness that we take to be the standard view of the obedience experiments. (1998, pp. 162–163)

The Milgram-friendly lack of coverage of recent criticisms of the obedience experiments that was observed in the present study for social psychology textbooks is definitely consistent with Stam et al.’s (1998) proposal of a standard view of the obedience experiments that evolved from a valorization/defensiveness process engaged in by social

psychologists. Such coverage would be part of the damage control. Given the knowledge-conserving function of social psychology textbooks, “standard views” of other famous studies in social psychology should also have evolved. Some recent related research indicates that this is likely the case. The Stanford prison experiment is clearly a famous social psychological study, and Griggs and Whitehead (2014) found no or very minimal coverage of the extensive body of criticism of the Stanford prison experiment in social psychology textbooks. As with the lack of coverage of criticisms of the obedience experiments observed in the present study, the minimal coverage of the Stanford prison experiment criticisms would serve a damage-control function. Similarly, this evolution of a standard view of famous studies likely plays a role in the finding that an analysis of social psychology textbooks from 1953 to 1984 revealed that although most of the responses on critical trials in Asch’s classic social pressure experiments (1951, 1952, 1955, 1956) were independent ones, textbooks increasingly overemphasized the minority conforming responses and deemphasized the majority independent responding during that time period (Friend, Rafferty, & Bramel, 1990). Griggs (2015a) found that this distorted coverage (the standard view) has not only persisted but also increased over the past 30 years.

This standard view theory may apply not only to famous experiments but also to famous stories in social psychology textbooks. Indeed, Manning, Levine, and Collins (2007) proposed this type of explanation for their finding of the perseverance of factual inaccuracies in the coverage of the murder of Kitty Genovese in the social psychology textbooks that they examined.
Seven years later, Griggs and Whitehead (2014) found that this inaccuracy problem still existed in introductory social psychology textbooks. The present finding that introductory psychology texts cover recent criticisms of the Milgram obedience experiments better than introductory social psychology textbooks, along with Griggs’s (2015b) finding of more accurate coverage of the Kitty Genovese story in introductory psychology textbooks, also provides indirect support for the standard view theory in that introductory psychology textbook authors would not be as likely to present social psychology’s standard view of its famous studies and stories because most are not social psychologists. Griggs (2014, see Note 4) provides some support for this reasoning. In brief, Griggs found that a larger percentage of the introductory psychology textbooks without a social psychologist author included criticism of the Stanford prison experiment than did those with a social psychologist author (57% vs. 25%).

Epilogue

Social psychology textbooks are still giving a large amount of space to Milgram’s obedience experiments and his conclusions about them, but no space to the many recent criticisms of the experiments and of how Milgram depicted them in his publications. According to Morawski (1992), textbooks are the key transmitters of psychological knowledge both to potential new members of the discipline and to those outside of it (giving psychology away), and therefore it is essential that textbook information be accurate. Thus, it is important to identify inaccuracies in our textbooks so that they can be corrected and so that we as teachers and textbook authors do not continue to “give away” false information about our discipline. Given that Milgram’s obedience study is one of the

most famous studies in psychology, with far-reaching impact outside of psychology, it is especially important that coverage of it in our textbooks be accurate. We are not proposing banishing coverage of the obedience experiments from textbooks, but rather covering them in a more accurate manner that includes coverage of their flaws and shortcomings as explicated in the extensive criticism now available. Tavris (2014) argues that Milgram’s obedience study should be taught as a “contentious classic,” but that would prove difficult given that current introductory social psychology textbooks present it as an uncontentious classic. As Peter Baker (2013) argues, “We have heard Milgram’s version enough. What we need now is … a proper dehoaxing” (final para.).

Declaration of Conflicting Interests The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Notes
1. Gina Perry’s book, Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments, was initially published in Australia in May 2012 and reviewed in the U.S. in early 2013 (e.g., Whitbourne, 2013). The U.S. edition of Perry’s book was published in September 2013.
2. Gibson (2013b, pp. 179–180) points out that there is an inconsistency between Milgram’s (1965b) description of the prods and the experimenter’s use of them and the descriptions given in Milgram (1963, 1974). He concludes that the description in Milgram (1963, 1974), consistent with the description here, has come to be the accepted view of this part of the experimental procedure.
3. In addition to the four main prods, the experimenter had two special prods that he was to use in specified situations (see Milgram, 1974, pp. 21–22). First, if the participant asked if the learner was liable to suffer permanent physical injury, the experimenter was to say: “Although the shocks may be painful, there is no permanent tissue damage, so please go on.” If necessary, this would be followed by Prods 2, 3, and 4. Second, if the participant said that the learner did not want to go on, the experimenter was to reply: “Whether the learner likes it or not, you must go on until he has learned all the word pairs correctly. So please go on.” Again, if necessary, this would be followed by Prods 2, 3, and 4.
4. The numbering of the experimental conditions used in this paper reflects the chronological order in which they were conducted, except for Condition 22, which was conducted at various times during 1961 and 1962. The chronologically ordered list of conditions, along with brief descriptions of each condition, is given in the Appendix: List of Conditions (Perry, 2013a, pp. 304–312, and in Haslam, Loughnan, & Perry, 2014, Table 1).
It is important to note that because Milgram did not report the findings of all of the conditions that he conducted, this numbering scheme is different from that used by Milgram (1974).
5. Gina Perry (personal communication, December 8, 2014) sent me (Richard A. Griggs) copies of the original data sheets for Milgram’s obedience experiments that are part of the Milgram papers at Yale University. By using these data sheets and the listing of the 23 experimental conditions and the one non-experimental condition in Perry (2013a, Appendix: List of
Conditions) and the experimental conditions described in Milgram (1974, Tables 2–5) and in Milgram (1963, 1964, 1965a, 1965b), I determined that Milgram did not report Condition 17 (Teacher in charge) and only very briefly described the nature of Conditions 19 (Authority from afar) and 24 (Intimate Relationships) in “Some Conditions of Obedience and Disobedience to Authority” (1965b). He also reported only a brief general description in a footnote of the obedience finding for Condition 19 and nothing on the outcome of Condition 24. In addition, Milgram did not report anything about Part B of Conditions 10 (Conflicting instructions) and 15 (Good experimenter, bad experimenter) and only provided a brief general description of the nature and results of Part B of Condition 18 (No experimenter) in “Some Conditions …” (1965b). All of the above condition numbers and names are those used by Perry in her listing of conditions. In this listing, Perry also includes descriptions of and the results for all 24 conditions that Milgram ran.
6. Taketo Murata’s unpublished analysis is titled “Reported Belief in Shocks and Level of Obedience” and is in the Stanley Milgram Papers, Box 45, Folder 158 at Yale University.
7. Gus Brannigan pointed out that it would be interesting to use the data that Milgram reported in Table 7 (1974, p. 172), which juxtaposes defiance/obedience rates versus belief in the reality of the shocks, to compute the odds ratio of defiance based on belief (personal communication, December 9, 2014). Comparing the data for participants who believed the shocks to be real or probably real versus those who believed the shocks were not real or probably not real, he calculated the odds ratio of defiance based on belief to be 2.57. This means that if a participant thought that the shocks were real or probably real, the odds of defiance increased by a factor of 2.57, which is generally consistent with Taketo Murata’s analysis.
8.
For one textbook not yet published, this search process was not used. Instead, the publisher of this text sent us the PDF of the only chapter in which the Milgram obedience experiments were covered.
9. The publication date of a textbook (the date that the book is actually published and available) is typically different from its copyright date. Textbooks are sometimes published up to a year before their copyright date. In addition, the text finalization and production processes typically tack on another 3 to 6 months before the publication date. Thus, the authors of the social psychology textbooks with 2014 copyright dates would not have had sufficient time to incorporate coverage of the U.S. edition of Perry’s book even if those texts were published in January 2014. However, there is a caveat to this statement. They would have had sufficient time if they had been aware of the earlier, more limited publication of the Australian edition in May 2012 (Perry, 2012), because the two textbooks were published in January and February of 2013, respectively. The authors of all three 2015 textbooks in the present study, however, would have had sufficient time to be aware of the U.S. edition of the book because it was published in September 2013, and their texts were published in July 2014, November 2014, and January 2015, respectively.

References
References marked with an asterisk indicate textbooks examined in the present study.
Aronson, E. (2012). The social animal (11th ed.). New York, NY: Worth.
*Aronson, E., Wilson, T. D., & Akert, R. M. (2013). Social psychology (8th ed.). Boston, MA: Pearson.
Asch, S. E. (1951). Effects of group pressure upon the modification and distortion of judgment. In H. Guetzkow (Ed.), Groups, leadership and men (pp. 177–190). Pittsburgh, PA: Carnegie Press.

Asch, S. E. (1952). Social psychology. Englewood Cliffs, NJ: Prentice-Hall.
Asch, S. E. (1955). Opinions and social pressure. Scientific American, 193, 31–35.
Asch, S. E. (1956). Studies of independence and conformity: 1. A minority of one against a unanimous majority. Psychological Monographs, 70 (Whole No. 416).
Baker, P. C. (2013, September 10). Electric schlock: Did Stanley Milgram’s famous obedience experiments prove anything? Pacific Standard: The Science of Society. Retrieved from http://www.psmag.com/navigation/health-and-behavior/electric-schlock-65377/
*Baron, R. A., Branscombe, N. R., & Byrne, D. (2012). Social psychology (13th ed.). Boston, MA: Pearson.
*Baumeister, R. F., & Bushman, B. J. (2014). Social psychology and human nature (3rd ed.). Belmont, CA: Wadsworth.
Baumrind, D. (1964). Some thoughts on the ethics of research: After reading Milgram’s “Behavioral study of obedience”. American Psychologist, 19, 421–423.
Baumrind, D. (2013). Is Milgram’s deceptive research ethically acceptable? Theoretical and Applied Ethics, 2, 1–18.
Brannigan, A. (2013). Stanley Milgram’s obedience experiments: A report card 50 years later. Society, 50, 623–628.
Burger, J. M. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64, 1–11.
Burger, J. M., Girgis, Z. M., & Manning, C. C. (2011). In their own words: Explaining obedience to authority through an examination of participants’ comments. Social Psychological and Personality Science, 2, 460–466.
*Cervone, D. (2015). Psychology: The science of person, mind, and brain. New York, NY: Worth.
*Ciccarelli, S. K., & White, J. N. (2015). Psychology (4th ed.). Upper Saddle River, NJ: Pearson.
Darley, J. M. (1995). Constructive and destructive obedience: A taxonomy of principal-agent relationships. Journal of Social Issues, 51, 125–154.
*DeLamater, J. D., Myers, D. J., & Collett, J. L. (2015). Social psychology (8th ed.). Boulder, CO: Westview Press.
*Franzoi, S. (2013). Social psychology (6th ed.). Redding, CA: BVT.
Friend, R., Rafferty, Y., & Bramel, D. (1990). A puzzling misinterpretation of the Asch “conformity” study. European Journal of Social Psychology, 20, 29–44.
Gibson, S. (2013a). Milgram’s obedience experiments: A rhetorical analysis. British Journal of Social Psychology, 52, 290–309.
Gibson, S. (2013b). “The last possible resort”: A forgotten prod and the in situ standardization of Stanley Milgram’s voice-feedback condition. History of Psychology, 16, 177–194.
*Gilovich, T., Keltner, D., Chen, S., & Nisbett, R. E. (2013). Social psychology (3rd ed.). New York, NY: Norton.
*Greenberg, J., Schmader, T., Arndt, J., & Landau, M. (2015). Social psychology: The science of everyday life. New York, NY: Worth.
Griggs, R. A. (2014). Coverage of the Stanford Prison Experiment in introductory psychology textbooks. Teaching of Psychology, 41, 195–203. doi: 10.1177/0098628314537968
Griggs, R. A. (2015a). The disappearance of independence in textbook coverage of Asch’s social pressure experiments. Teaching of Psychology, 42, 137–142. doi: 10.1177/0098628315569939
Griggs, R. A. (2015b). The Kitty Genovese story in introductory psychology textbooks: Fifty years later. Teaching of Psychology, 42, 149–152. doi: 10.1177/0098628315573138
Griggs, R. A., & Whitehead, G. I., III (2014). Coverage of the Stanford Prison Experiment in introductory social psychology textbooks. Teaching of Psychology, 41, 318–324. doi: 10.1177/0098628314549703
Haslam, N., Loughnan, S., & Perry, G. (2014). Meta-Milgram: An empirical synthesis of the obedience experiments. PLoS ONE, 9(4), e93927. doi: 10.1371/journal.pone.0093927

Haslam, S. A., Reicher, S. D., & Birney, M. E. (2014). Nothing by mere authority: Evidence that in an experimental analogue of the Milgram paradigm participants are motivated not by orders but by appeals to science. Journal of Social Issues, 70, 473–488.
Haslam, S. A., Reicher, S. D., Millard, K., & McDonald, R. (2015). “Happy to have been of service”: The Yale archive as window into the engaged followership of participants in Milgram’s “obedience” experiments. British Journal of Social Psychology, 54(1), 55–83. doi: 10.1111/bjso.12074
*Hockenbury, S. E., Nolan, S., & Hockenbury, D. H. (2015). Psychology (7th ed.). New York, NY: Worth.
*Kassin, S., Fein, S., & Markus, H. R. (2014). Social psychology (9th ed.). Belmont, CA: Wadsworth.
*Kenrick, D. T., Neuberg, S. L., & Cialdini, R. B. (2015). Social psychology: Goals in interaction (6th ed.). Boston, MA: Pearson.
Laurent, J. (1987). Milgram’s shocking experiments: A case in the social construction of “science”. Indian Journal of History of Science, 22, 247–272.
Lutsky, N. (1995). When is “obedience” obedience? Conceptual and historical commentary. Journal of Social Issues, 51, 55–65.
Manning, R., Levine, M., & Collins, A. (2007). The Kitty Genovese murder and the social psychology of helping: The parable of the 38 witnesses. American Psychologist, 62, 555–562.
Mantell, D. M., & Panzarella, R. (1976). Obedience and responsibility. British Journal of Social and Clinical Psychology, 15, 239–245.
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67, 371–378.
Milgram, S. (1964). Group pressure and action against a person. Journal of Abnormal and Social Psychology, 69, 137–143.
Milgram, S. (1965a). Liberating effects of group pressure. Journal of Personality and Social Psychology, 1, 127–134.
Milgram, S. (1965b). Some conditions of obedience and disobedience to authority. Human Relations, 18, 57–76.
Milgram, S. (1972). Interpreting obedience: Error and evidence – A reply to Orne and Holland. In A. G. Miller (Ed.), The social psychology of psychological research (pp. 138–154). New York, NY: Free Press.
Milgram, S. (1974). Obedience to authority: An experimental view. New York, NY: Harper & Row.
Miller, A. G. (1995). Constructions of the obedience experiments: A focus upon domains of relevance. Journal of Social Issues, 51, 33–53.
Mixon, D. (1972). Instead of deception. Journal for the Theory of Social Behaviour, 2, 145–177.
Morawski, J. G. (1992). There is more to our history of giving: The place of introductory psychology textbooks in American psychology. American Psychologist, 47, 161–169.
*Myers, D. G. (2012). Social psychology (11th ed.). New York, NY: McGraw-Hill.
*Myers, D. G., & DeWall, C. N. (2015). Psychology (11th ed.). New York, NY: Worth.
Nicholson, I. (2011). “Torture at Yale”: Experimental subjects, laboratory torment, and the “rehabilitation” of Milgram’s “Obedience to Authority.” Theory & Psychology, 21, 737–761. doi: 10.1177/0959354311420199
Orne, M. T., & Evans, F. J. (1965). Social control in the psychological experiment: Antisocial behavior and hypnosis. Journal of Personality and Social Psychology, 1, 189–200.
Orne, M. T., & Holland, C. C. (1968). On the ecological validity of laboratory deceptions. International Journal of Psychiatry, 6, 282–293.
Parker, I. (2000, Autumn). Obedience. Granta, 71, 99–125.
Patten, S. C. (1977). Milgram’s shocking experiments. Philosophy, 52, 425–440.


Perry, G. (2012). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. Melbourne, Australia: Scribe.
Perry, G. (2013a). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. New York, NY: The New Press.
Perry, G. (2013b). Deception and illusion in Milgram’s accounts of the obedience experiments. Theoretical & Applied Ethics, 2, 79–92.
Perry, G. (2013c). Response to Russell’s review of Behind the Shock Machine. Journal of the History of the Behavioral Sciences, 49, 223–224. doi:10.1002/jhbs.21600
Reaction of subjects. (1962). Stanley Milgram papers (Series II, Box #44). Yale University Archives, New Haven, CT.
Reicher, S. D., Haslam, S. A., & Miller, A. G. (2014). What makes a person a perpetrator? The intellectual, moral, and methodological arguments for revisiting Milgram’s research on the influence of authority. Journal of Social Issues, 70, 393–408.
Reicher, S. D., Haslam, S. A., & Smith, J. R. (2012). Working towards the experimenter: Reconceptualizing obedience within the Milgram paradigm as identification-based followership. Perspectives on Psychological Science, 7, 315–324.
Rochat, F., & Blass, T. (2014). The “Bring a Friend” condition: A report and analysis of Milgram’s unpublished Condition 24. Journal of Social Issues, 70, 456–472.
Rochat, F., & Modigliani, A. (1997). Authority: Obedience, defiance, and identification in experimental and historical contexts. In M. Gold & E. Douvan (Eds.), A new outline of social psychology (pp. 235–246). Washington, DC: American Psychological Association. doi:10.1037/10225-013
Russell, N. (2009). Stanley Milgram’s obedience to authority experiments: Towards an understanding of their relevance in explaining aspects of the Nazi Holocaust (Doctoral dissertation). Victoria University of Wellington, New Zealand. Retrieved from http://researcharchive.vuw.ac.nz/handle/10063/1091
Russell, N. (2014). Stanley Milgram’s obedience to authority “relationship” condition: Some methodological and theoretical implications. Social Sciences, 3, 194–214. doi:10.3390/socsci3032194
Stam, H. J., Lubek, I., & Radtke, H. L. (1998). Repopulating social psychology texts: Disembodied “subjects” and embodied subjectivity. In B. M. Bayer & J. Shotter (Eds.), Reconstructing the psychological subject: Bodies, practices, and technologies (pp. 153–186). London, UK: Sage.
Tavris, C. A. (2014, October). Teaching contentious classics. APS Observer, 27(8). Retrieved from http://www.psychologicalscience.org/index.php/publications/observer/2014/october-14/teaching-contentious-classics.html
Whitbourne, S. K. (2013, January 22). The secrets behind psychology’s most famous experiment [Review of the book Behind the shock machine, by G. Perry]. Psychology Today. Retrieved from http://www.psychologytoday.com/blog/fulfillment-any-age/201301/the-secrets-behind-psychology-s-most-famous-experiment
Zimbardo, P. G. (2007). The Lucifer effect: Understanding how good people turn evil. New York, NY: Random House.

Author biographies

Richard A. Griggs is Professor Emeritus in the Department of Psychology at the University of Florida. After earning his PhD in cognitive psychology at Indiana University in 1974, he joined the faculty at the University of Florida that fall. He spent his entire academic career there, retiring in 2008. He won numerous teaching awards at the University of Florida and in 1994 was named


Teacher of the Year for 4-year colleges and universities by APA’s Division Two, Teaching of Psychology. His two main research areas are human reasoning and the teaching of psychology. He has published widely in both areas, including 45 articles in Teaching of Psychology. The fourth edition of his introductory psychology textbook, Psychology: A Concise Introduction, was published in 2014. Email: [email protected]

George I. Whitehead III is Professor in the Department of Psychology at Salisbury University. He earned his PhD in social psychology at the University of Massachusetts-Amherst in 1973. The previous year he had joined the faculty at Salisbury University (Salisbury State College at the time), where he is still teaching. He was named outstanding faculty member by the Student Government Association twice and has received a number of local and state awards for his community service. His research interests include pro-social behavior, service-learning, self-presentation theory, and the teaching of psychology. He has published research in each of these areas and co-authored two books on service-learning. The most recent book, A Glorious Revolution for Youth and Communities: Service-Learning and Model Communities, was published in 2010. Email: [email protected]


Article

Theory & Psychology 2015, Vol. 25(5) 581–598
© The Author(s) 2015
Reprints and permissions: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/0959354315601904
tap.sagepub.com

Milgram’s shock experiments and the Nazi perpetrators: A contrarian perspective on the role of obedience pressures during the Holocaust

Allan Fenigstein
Kenyon College

Abstract

In contrast to many scholars who believe that Milgram’s studies of obedience provide an incisive understanding of the Holocaust perpetrators, this article argues that pressures to obey authority had little role in the Holocaust. Unlike Milgram’s participants, most Nazi perpetrators showed no remorse or moral distress over the murders, severely compromising the explanatory necessity of obedience pressures; the excesses of the Nazis’ brutal and wanton cruelty, and the enthusiasm shown in the killing process, are entirely inconsistent with the behavior of the laboratory participants and with the concept of dutiful, but emotionless, obedience; and finally, when Milgram’s participants had the chance to evade giving shock, they frequently seized that opportunity; in contrast, although Nazi killers were often given the opportunity to withdraw from the killing operations, very few chose to do so. These arguments suggest that most of the Nazi perpetrators believed in what they were doing, and would have been willing, perhaps even eager, to kill Jews, even in the absence of orders to do so.

Keywords
cruelty, Holocaust, morality, obedience, perpetrators

Milgram’s (e.g., 1963, 1974) remarkable laboratory studies on obedience to authority have achieved their enduring impact and visibility, not only because of their unexpected findings—which at the time appeared to challenge some of our most basic and cherished notions about the human capacity for evil—but also because of their apparent

Corresponding author: Allan Fenigstein, Department of Psychology, Kenyon College, Gambier, OH 43022, USA. Email: [email protected]

ability to shed light on the behavior of Nazi (and collaborating) murderers who actively participated, on a massive scale, in carrying out the systematic extermination of most of European Jewry.1 From the inception of his studies of obedience, Milgram was motivated by a desire to make sense of the Nazi extermination policy (Blass, 2002; Miller, Collins, & Brief, 1995). He believed that destructive obedience, that is, obeying orders to kill, was the underlying psychological mechanism that explained how otherwise ordinary Germans became capable of destroying the lives of millions of innocent Jews, writing that the Final Solution was “the most extreme instance of abhorrent, immoral acts carried out in the name of obedience [emphasis added]” (Milgram, 1967, p. 3). For Milgram (1974), the focus and abiding legacy of his research program lay in (a) demonstrating the ease with which destructive obedience—that is, obeying orders to seriously injure or kill innocent victims, in spite of powerful moral or emotional resistance to those actions—could be elicited from seemingly normal persons and (b) suggesting that the Holocaust was, in large part, the result of obedience pressures that compelled the perpetrators to act out of a sense of obligation to their superiors. Although other scholars and researchers have accepted, at least in part, Milgram’s analysis of the Holocaust (e.g., Bauman, 1989; Berger, 1983; Blass, 2002; Darley, 1992; Kelman & Hamilton, 1989; Miller, 1986; Miller et al., 1995; Sabini & Silver, 1980), the present paper challenges that position. Specifically, it attempts to identify a number of critical conceptual problems2 that seriously undermine the notion that parallels can be drawn between the behavior of Milgram’s participants and that of the Nazi murderers.
Ultimately, it will be argued, these problems warrant the conclusion that Milgram’s research has little, if anything, to say about the behavior of the perpetrators of the Holocaust.

The Milgram studies

We begin with a brief description of the classic Milgram (1974, p. 35) research paradigm (referred to as the “voice feedback condition”). Milgram (1963) asked ordinary, mentally healthy participants,3 drawn from a broad spectrum of socioeconomic and educational backgrounds, to participate in a study on “learning” in which, as “teachers,” they were to administer punishment—in the form of increasingly severe electrical shocks—to another participant, the “learner,” whenever the learner failed to answer a question correctly. Although each teacher submitted to a low-level, but quite unpleasant, sample shock (in order to heighten the realism of the situation), no other shocks were actually delivered during the experiment. But the teacher did not know that; for the teacher, the situation was extremely realistic and tension-provoking. As the supposed shocks grew increasingly intense, the learner/victim’s prerecorded reactions became more excruciating. Early in the procedure, the victim shouted that the shocks were becoming too painful. That was soon followed by a demand to be let out of the experiment. Next the victim cried out that he could no longer stand the pain. Eventually, he began yelling that he would not provide any more answers and insisted that he be freed. Over the course of the next set of shocks, there were agonized screams and, finally, silence. In trying to decide what to do during the experiment, many participants turned to the experimenter/authority

for help. The experimenter’s response, using a number of different verbal prods, was simple and direct: the experiment must continue and the experimenter would take “full responsibility.” How far did participants go in obeying the experimenter? The now well-known and still startling finding was that a maximum of 65% of the participants were willing to punish another person with an almost lethal dose of electric shock, despite the fact that the victim did nothing to merit such severe punishment, and that the experimenter had relatively little power to enforce his orders.4 Much of the fascination that attends Milgram’s studies derives from the lessons they imply about the nature of evil, that is, about how easily ordinary persons, possessing no hostility or malevolence, but just “doing their jobs,” can carry out terribly destructive, inhumane commands. Another implication that some scholars (e.g., Askenasy, 1978; Charny, 1982; Lifton, 1986; Sabini & Silver, 1980) have drawn is that anyone—not just Nazis and their collaborators and sympathizers—could have participated in the nearly successful annihilation of the Jewish people.

Relevance to the Holocaust

Almost immediately after they were published, the obedience studies were criticized as being too dissimilar from Nazi Germany to warrant generalizations (e.g., Baumrind, 1964). Milgram (1974), as well as others who shared his perspective (e.g., Browning, 1992), also recognized that there were enormous disparities in circumstances and consequences between the artificial laboratory situation of the obedience studies and the all-too-real horrors of Nazi Germany. However, Milgram argued that the plausibility of extrapolating from the laboratory to the real world depends not so much on a detailed point-by-point comparison between the surface features of the two events but rather on the comparability of the underlying explanatory mechanism. Despite the apparent differences in the situational details, Milgram believed that a similar psychological process—essentially, a predilection toward obedience—was centrally involved in both his laboratory studies and the Holocaust.

Comparing the external situational features

As suggested, virtually everyone, including Milgram, has acknowledged enormous differences between the specifics of the laboratory studies done in the early 1960s and the killing fields of Central Europe in the early 1940s; it is a relatively easy task to catalog these very concrete, observable, and sometimes painfully obvious differences.

Cultural context

Milgram’s participants, who included college students as well as persons recruited from all walks of life (Milgram, 1974, p. 15), were involved in psychological research, and were presumably contributing to science, under the auspices of an academic institution dedicated to the welfare and betterment of all humankind. Nazi murderers were participating in what was considered to be a “war against the Jews” (Dawidowicz, 1975), even though

their victims were totally defenseless, under the auspices of a dictatorship that was singularly dedicated to the extermination of its Jewish “enemies.”

Actions performed

Milgram’s participants were pushing buttons that many firmly believed were delivering painful, and potentially dangerous, shocks to the victim (see Milgram, 1974, p. 172). However, participants received repeated assurances from the authority that no permanent physical harm or injury would be inflicted on the other person. Thus, even if the participant thought that the victim might be hurt, there likely was some doubt or uncertainty regarding the outcome of their actions, especially considering what some observers regard as a tacit understanding, in addition to the explicit promises given, that the experimenter would not allow the participant to engage in dangerous or harmful behavior (e.g., Mixon, 1979; Orne & Holland, 1968). Nazi perpetrators knew, without question, that they were murdering their victims.

Clarity vs. ambiguity of the situation

The question of certainty or confusion also extends to an understanding of the larger situation. Ross and Nisbett (1991) have argued that in many ways, much of what was going on in the Milgram study simply did not make any sense to the participants; for example, at one point, they had to continue to interact with a totally silent, nonresponsive learner/victim; and throughout the study, the experimenter seemed absolutely (and curiously) oblivious to the cries of anguish of an innocent research participant. In the face of this chaos and confusion, it may be understandable that participants became indecisive and highly dependent on a calm and confident authority issuing orders. In contrast, given the very clear and well-defined “military” operation before them, the powerful indoctrination preparing them for the task, and the orderliness and efficiency of the killing process, it is unlikely that the Nazi perpetrators experienced any uncertainty about the murderous events taking place.

“Gradualness” of harm-doing

Milgram’s participants initially used very low, harmless levels of shock, and only gradually did the intensity of the shock increase. This may have created a sense of psychological entrapment or dissonance for the participant during the course of the experiment (e.g., Gilbert, 1981). For many of the killers, even if this was their first encounter with a victim, the act of killing came about quickly. If there were any preliminary activities, they often involved the humiliation, degradation, or brutalization of the victim (Dawidowicz, 1975; Goldhagen, 1996).

Relationship to victim

Milgram’s participants thought that they were delivering potentially injurious shock to other experimental participants who were their peers. In Nazi Germany, innocent Jewish

victims (including children) had been systematically devalued to the point where their lives were considered to be “unworthy of life.” Centuries of anti-Semitism, a decade of legalized discrimination, and an intense, unrelenting onslaught of vicious, hateful propaganda (e.g., Herf, 2006) depicting Jews as vile enemies, intent on domination, exploitation, and victimization of the German people, had allowed many Germans to remove Jews from the circle of human obligation, ultimately preparing the perpetrators to accept, and perhaps even welcome, the vilification and eventual destruction of the Jews.

Relationship to authority

The pressures to obey authority in the Milgram study derived not from any power the experimenter had to reward or punish the participant, but rather from the experimenter’s “legitimate” position as a trusted, expert scientist (see Aronson, Brewer, & Carlsmith, 1985). That legitimacy, however, not only conveyed an obligation on the part of the participant toward cooperation and obedience; it also involved responsibility, on the part of the experimenter, for the safety and well-being of the research participants. The dual role of the experimenter/authority, pressuring the participant’s behavior toward the learner in two different directions, may help to explain why the teacher/participants displayed a great deal of resistance, questioning, and hesitation in response to the experimenter’s instructions; there was little indication of any clear, “absolute” obedience (see Hoffman, Myerberg, & Morawski, 2015). In contrast to the Milgram studies, the Nazi killer’s act of murdering the victim was in no way countermanded; as noted by Baumrind (1964), Nazi perpetrators had little reason to believe that their superior officer had any concern for the well-being of the victim. In addition to the legitimate authority of commanding officers, superior officers in a military hierarchy can also expect obedience from their subordinates by virtue of their coercive power. Thus, there is every reason to expect unquestioning, absolute obedience in a military operation such as the Holocaust, and that has been the operating assumption for many scholars (see, e.g., Miller, 1986). Interestingly, however, the extent to which the killing of innocent Jews was absolutely demanded from Nazi soldiers has been seriously questioned (e.g., Browning, 1992; Friedlander, 1989; Goldhagen, 1985). The critical issue for the moment, however, is that the nature of authority, and the subordinate’s relation to that authority, were markedly different in the two situations.

Intensity of authority pressure

Based on post-experimental interviews with laboratory participants, as well as observations made during their participation, it may be argued that participants were faced with an ethically unacceptable level of obedience pressure directing them to physically harm an innocent other (Baumrind, 1964). The intensity of this pressure may stand in ironic, yet significant, contrast to the experience of the perpetrators in Nazi Germany. As noted above (and discussed in more detail below), one of the most remarkable aspects of the authority–subordinate relationship operating among the perpetrators of the Holocaust was the extraordinary lack of pressure emanating from superior officers when asking for (rather than demanding) the participation of subordinates in the slaughter of Jews.


Inter-individual vs. intergroup perspectives

The Milgram studies may be seen as focusing almost exclusively on the attitudes and behaviors of people as individuals; that is, the paradigm is an attempt to understand how any person, regardless of their background, personality, or group identity, would respond to authority pressures to hurt another person who, again, could be anyone. In contrast, the almost successful annihilation of European Jews by the Nazis involved the actions of individuals as group members, and must be fundamentally understood and analyzed as an intergroup phenomenon. The Holocaust, as an event, is not only historically defined in terms of Nazis and Jews, but was experienced in those same terms by the victims and the perpetrators (Tajfel, 1981). As a result, a psychological understanding of the behavior of the Nazi murderers must take on a fundamentally different focus of analysis: the thoughts, feelings, and actions of the perpetrators as a whole may have been much more a matter of their shared group identity as Nazis than a matter of their individual characteristics (e.g., Fenigstein, 1998; Reicher, Haslam, & Smith, 2012).

Can “surface” differences be dismissed?

Milgram acknowledged a multitude of circumstantial and behavioral differences between his participants and the Nazi murderers, but dismissed those differences as largely irrelevant to the issue of generalization (see Mastroianni, 2015; Nicholson, 2015). This dismissal is especially puzzling in view of the fact that Milgram’s own research program showed that different experimental circumstances (such as altering the physical distance between the teacher and the experimenter) had significant effects on the extent to which participants obeyed or defied the experimenter (see, e.g., Blass, 1992; Mandel, 1998; Miller, 1986; Ross & Nisbett, 1991). How, then, can the eight situational differences enumerated above simply be ignored? Milgram’s response was that the important question, as to whether the obedience studies offered any insight into the psychology of the Holocaust perpetrators, lies in the comparison of (what he considered) the more critical, core psychological processes operating in the two situations. Before turning to the question of core processes, however, it may be appropriate to ask whether Milgram’s dismissal of the relevance of “surface” differences is warranted. Given the vast differences in time and place, and the obvious dissimilarities, not only in their specific details, but in the enormity of the events under discussion, why would Milgram assume that the underlying psychological processes involved in the obedience studies were similar to those of the Nazi perpetrators? Because, as Milgram (1974, pp. 175–178) explained, there were recognizable similarities between the two events. In both cases, innocent victims were being hurt; the agents of harm were apparently ordinary persons who were, at the time of the harm-doing, subordinates in a hierarchical relationship; and an authority figure was present or in close proximity.
In effect, Milgram’s (and others’) assumptions regarding a relationship between his research and the Holocaust may very well have been based on an extremely selective consideration of the “surface” features of the two events. Milgram assumed an underlying psychological correspondence because he focused on similar elements of the two

events; however, with regard to the dissimilarities of the two events, he simply dismissed those features as irrelevant to the question of underlying processes. The question, of course, is whether the similarities he emphasized are any more crucial to the question of core psychological mechanisms than the dissimilarities he dismissed as superficial and irrelevant.

Comparing the underlying psychological mechanisms

Despite apparent differences in the situational details of his laboratory studies and the Holocaust, Milgram believed that comparable psychological processes—essentially, a predilection toward obedience—were essential to explaining both events. Milgram’s (1974) argument for a common psychological mechanism actually involved two different levels of analysis. At a micro-psychological level, the specific mechanism that Milgram posits to explain obedient behavior, both in his laboratory participants and in the Holocaust perpetrators, is the agentic state, in which subordinates relinquish a sense of personal responsibility for their behavior and become thoughtless agents of action. At a macro-psychological level, he suggested that the essence of destructive obedience involves a situation in which a person is ordered by a legitimate authority to harm a third person. That, he asserted, is the fundamental situation that confronted both his participants and the Nazi perpetrators. Problematic questions exist with respect to both levels of analysis (see Brannigan, 2013, pp. 12–15).

The micro-psychological level: The agentic state

Milgram argued that ordinary persons, as a result of their history, become well-practiced in adopting the mentality of an agent who, in an essentially mindless fashion, performs an action that is authorized by someone else. The most far-reaching consequence of this submission to authority is an “extraordinary psychological transformation,” in which a sense of conscience or responsibility for one’s actions disappears. All initiative is attributed to the authority, and both the obedient experimental participant and the Nazi perpetrator see themselves simply as instruments of the authority, and in no way morally accountable for their actions. This “agentic” state, for Milgram, is not merely a thin alibi; it represents a fundamental change in the individual’s mode of thinking and self-understanding. This perspective certainly has some appeal, particularly as it relates to the Nazi murderers. The concept of the agentic state is frighteningly evocative, for example, of the litany of “I was just following orders” heard repeatedly at the Nuremberg trials. Although Milgram argued that these defense claims actually represent serious and significant psychological truths, the possibility that these men were simply trying to avoid or mitigate punishment for their crimes must obviously be considered (as the Nuremberg jurists, in fact, often did). Thus, it is difficult to find convincing evidence for the agentic state by examining the testimony (or psychiatric protocols, e.g., Lifton, 1986) of Nazi war criminals. Another, more rigorous, means of assessing the validity of the agentic state is through controlled research that attempts to determine whether experimental participants, who have complied with the demands of an authority, are prepared to simply “give away”

responsibility for their behavior. In general, the research has not offered a great deal of support for the idea of an agentic “shift” in responsibility (e.g., Blass, 1992; Miller, 1986). Milgram (1974) himself, contrary to the predictions of his own theory, found that obedient participants attributed just as much responsibility for their behavior to the experimenter as did participants who defied the experimenter’s demands. Similarly, other studies (e.g., Mantell & Panzarella, 1976; Tilker, 1970) have shown essentially no relationship between the participants’ degree of obedience to the authority and their assignment of responsibility for their behavior. In general, the research findings seriously challenge the explanatory value of an “agentic” state that involves the easy removal or denial of responsibility for one’s own immoral actions. Given both the empirical weakness of this concept, as well as the questionable evidence regarding its existence in the Nazi mind, it would be difficult to argue that Milgram’s conception of the agentic mental state has any relevance to the Holocaust.

The macro-psychological level: Parallel situations

With regard to the macro-psychological argument, although the “obedience to authority” explanation has been questioned, even with respect to the Milgram studies themselves (e.g., Nissani, 1990; Orne & Holland, 1968; Perry, 2012; Reicher et al., 2012), the critical issue in the present paper is whether obedience to the demands of a military authority constitutes an essential element of the behavior of the Nazi murderers. Milgram was not the only scholar to assert that the Nazi perpetrators murdered out of a sense of duty or diligence or the demand to obey. Most famously, Hannah Arendt’s (1963) notion of the “banality of evil,” claiming that Nazi henchmen like Adolf Eichmann were little more than uninspired bureaucrats simply doing what they were told, was seen by Milgram as a powerful confirmation of his argument that mindless, but dutiful, obedience was guiding the behavior of Nazi murderers. An examination of the evidence, however, raises troubling doubts about whether Nazi bureaucrats were ever simply following orders in a dutiful, dispassionate way (e.g., Cesarani, 2004; Lipstadt, 2011; Lozowick, 2002; Vetlesen, 2005). Functionaries like Eichmann not only performed their duties, but showed a good deal of pride and personal satisfaction in carrying out their work. Although a minority may have acted in this fashion (see Nicholson, 2011), most of the participants in Milgram’s studies acted very differently. Unlike Eichmann, they were very much troubled by what they were doing (Milgram, 1974; Perry, 2012). Anyone involved in the experiment who thought they were harming an innocent other experienced great tension and distress while shocking the victim. That is, on the basis of how they acted when engaged in the process of hurting others, there is little reason to believe that the behavior observed in Milgram’s laboratory participants involved the same psychological processes guiding someone like Eichmann.
Their behavior may have differed, however, not because of a lack of correspondence in the underlying processes, but rather because of critical differences in their two situations. Eichmann only had to sit at his desk to participate in mass murder. He rarely experienced, in any direct, personal way, the consequences of his actions (but when he did, touring the concentration camps, even he was sickened). The participants in Milgram’s

Downloaded from tap.sagepub.com at UNIV CALGARY LIBRARY on December 1, 2015 Fenigstein 589 experiment, on the other hand, were given graphic and immediate confirmation of the effect of the shocks they were administering. These circumstantial differences could eas- ily account for the different emotional responses. Perhaps a more appropriate test of corresponding psychological mechanisms would be a comparison of Milgram’s laboratory findings with the behavior of the Reserve Police, whose men confronted the effects of their actions in an extremely direct and explicit way— they literally became [saturated in the blood of their victims] (Browning, 1992). Browning, in evident agreement with Milgram’s analysis, felt that [many of Milgram’s insights (regarding obedience) find graphic confirmation in the behavior and (subsequent) testi- mony of the men of Reserve Police Battalion 101] (p. 174), who carried out shootings and deportations to death camps of thousands of Jewish victims. Relying extensively on Milgram’s research and theoretical analyses, Browning argued that pressures to obey the directives of authority, “even to the point of performing repug- nant actions in violation of ‘universally accepted’ moral norms” (p. 171), were an impor- tant influence in explaining how the “ordinary men” of Reserve Police Battalion 101 became active participants in mass murder. At one point, he even suggests that the mas- sacre these police carried out one day at Józefów where 1500 men, women, and children were systematically slaughtered by gunfire, might have been “a kind of radical that took place in a Polish forest with real killers and victims” (pp. 173–174). It is difficult to imagine a more emphatic endorsement of Milgram’s studies as directly applicable to the Holocaust. But one needs to be careful in accepting this analysis by analogy. There is no question that when faced with orders to commit murder, the police- men carried out those atrocities. 
But it is necessary to ask whether, despite the appearance of obedience, there are alternative, and perhaps more valid, explanations that account for the killings.

An alternative explanation for the killings

More specifically, it may be argued that although orders to kill Jews existed, those orders were not the primary motivating force for the killings. Rather, the orders may have provided an opportunity for the expression of more powerful, emotion-based motives. A central, compelling element of Nazi ideology, and an attitude that prevailed among hundreds of thousands of the people of Nazi Germany, was hatred toward the Jews (Bauer, 2001). From 1933 onward, building on a long history of hatred, vehement and unrelenting anti-Jewish propaganda had vilified and demonized the Jews, depicting them as "other," as racially inferior, and as the embodiment of vileness and filth. Jews were seen as a disease and "misfortune," and were blamed for whatever had gone wrong in Germany: war, moral corruption, economic distress, defeat, and postwar humiliation. Only through their elimination could health be restored to the German body politic (Dawidowicz, 1975; Friedlander, 1989; Gordon, 1984).

This incitement of contempt and hatred for the Jewish enemy, together with the incessant proclamation of German superiority, could easily have provided the primary motive for the killings (e.g., Fenigstein, 1998). Browning (1992), in fact, acknowledged that the men of the Reserve Police were "immersed in a deluge of racist and anti-Semitic propaganda that thoroughly prepared them, ideologically, for the mass murders they were about to perpetrate" (p. 183). If the Nazi perpetrators were "ideologically prepared" to see their Jewish victims as vermin, as less than human, and as hated enemies that threatened their way of life, murdering them could hardly be explained as "just following orders." A more plausible explanation is that many of those who did the killing believed it was just and necessary, and would have been willing to kill Jews, even in the absence of orders to do so (e.g., Baumeister, 1997; Fenigstein, 1998; Goldhagen, 1992, 1996).

Thus, before accepting Milgram's and Browning's argument that the same kind of obedience pressure at work in the laboratory was also working during the killing operations by the Reserve Police, three significant points of departure between Milgram's studies and the events at Józefów need to be examined. All three suggest that although the killings, in some ways, appeared to be a matter of simply following orders, in fact, the perpetrators' behaviors were much more strongly motivated by personal antipathy toward the victims and beliefs in the value of the killings, than by situational obedience pressures.

Where was conscience?

A critical element of the Milgram research was that obedience pressures were necessary to compel destructive actions in the face of powerful moral resistance to those actions. As Milgram (1974, p. 13) has argued, the potency of obedience can only be effectively demonstrated when it is opposed by a powerful force, such as the moral imperative against harming an innocent other, that works in the direction of disobedience; in the absence of moral opposition to an action, obedience is unnecessary as an explanation. Milgram (1974) conceptualized his laboratory situation as a "dilemma posed by the conflict between conscience and authority" (p. 179). In that situation, obedience pressures were needed to overcome significant moral resistance by the participants; without that pressure, most participants would never have engaged in hurtful behavior that they considered to be wrong or unjust.

In the laboratory studies, the extreme tension and distress exhibited by many of the participants offered direct evidence of the moral conflict they were experiencing between the demands of a legitimate authority and the demands of conscience. Many participants, in addition to their emotional ordeal, felt they had acted against their own moral values, voiced disapproval of their actions and denounced them as wrong, and frequently drew attention to the victim's suffering (Milgram, 1974, p. 41). Thus, it is reasonable to infer a significant amount of moral resistance among the experimental participants.

Milgram (1974) assumed that the Nazi perpetrators had also experienced a similar "conflict between conscience and authority." If such a psychological process had been experienced by the men of Reserve Police Battalion 101, it should have been evidenced by similar moral concerns. On the basis of their actions and testimonies, however, such evidence is sorely lacking.
There were a few men in the battalion—12 out of a total of almost 500—who, from the beginning, extricated themselves from the impending mass murder; a few more removed themselves from the killing squads only after they had already committed several murders; and some managed on occasion to avoid killing when an opportunity, such as the absence of direct surveillance, arose. In all, Browning (1992) estimated that perhaps 10 to 20% sought to evade the killings, once the shootings began. This means, of course, that at least 80% of those called upon to shoot helpless victims continued to do so until the last Jew had been killed—and often did so with a sense of relish and bravado that clearly belied the presence of any moral repugnance or distress (e.g., Goldhagen, 1996).

Was the reluctance of the few evaders motivated by moral concerns or empathy with their victims' plight? Browning (1992) admitted that in their testimonies, given 20 to 25 years after the fact (and after considerable opportunity for reflection), "those who quit shooting … overwhelmingly cited sheer physical revulsion as the prime motive, but did not express any ethical principle behind this revulsion" (p. 74). A particularly telling and egregious, but not uncommon, example of the source of the killers' concerns comes from the testimony of one of the evaders, who described the horrifying results of the killing action as follows: "The shooters [emphasis added] were gruesomely besmirched with blood, brains, and bone splinters; it hung on their [emphasis added] clothing" (p. 65).

A similar picture emerges from a "Wehrmacht psychiatrist who had treated large numbers of Einsatzgruppen for psychological disorders," who reported that although 20% of the killers experienced such symptoms, about half of them "associated it with the unpleasantness of what they had to do" (Lifton, 1986, p. 15). Thus, based on the reports of Nazi physicians themselves, although some of the killers may have experienced distress, it was not the result of any sense of guilt or remorse. Browning (1992) hinted at other indications of a sense of moral wrongdoing among the perpetrators, but few of those arguments were compelling.
For example, the men of Reserve Police Battalion 101 were often described as bearing an enormous psychological burden, but the precise nature of that burden remains to be explained; their own testimony suggests that it was not the result of principled moral opposition to the destruction of innocent lives, but rather a response to the sheer horror and grisliness of the mass slaughter. Browning also described a feeling of "shame" that pervaded the room when the men returned to their barracks, which he claimed was related to a sense of moral transgression. But he also found that "by silent consensus … the massacre was simply not discussed" (p. 69) by the men, making overt indications of such shame difficult to identify.

That these former Nazis failed to express any ethical qualms concerning their participation in the unspeakably brutal murders of innocent, unresisting men, women, and children is especially remarkable considering the circumstances under which these testimonies were given. Speaking to representatives of the criminal justice system in a democratic Germany, these men had powerful motives to express remorse, whether genuine or fabricated, in the hope of mitigating their guilt and perhaps minimizing their punishment. In the absence of any such expression, it is difficult to escape the conclusion that these men simply did not feel that they had performed any morally repugnant actions.

The failure of the Reserve Police to voice any moral opposition to the killing of Jews is critical to the present analysis in several ways. First, it represents a crucial difference between Milgram's participants, many of whom were reluctant to participate and experienced moral distress over their destructive actions, and the Nazi perpetrators, whose reluctance to kill, if it existed at all, was not a matter of principled conscience, but rather of physical disgust.
Second, the absence of moral opposition to the killings not only undermines the necessity, and thus the viability, of obedience as an explanation, it also suggests that the perpetrators did not perceive the killings as evil or morally repugnant. Rather, by the time the murders were committed, the victims had already lost their humanity in the eyes of the perpetrators, and the killings had no moral force (Koonz, 2003). Moreover, the killings were not only destroying a subhuman enemy, they were also asserting the Nazis' superiority, which may help explain the willingness and dedication of the perpetrators to the killing process (see Fenigstein, 1998).

Was the killing mechanistic?

The concept of obedience, as formulated by Milgram, and used by Arendt and Browning, suggests an image of the Nazi executioners as dutiful agents mechanically carrying out the murderous commands of the leader, without hate or malevolence toward their victims. However, much of the Nazis' behavior toward their Jewish victims was anything but detached and emotionless. Arendt's (1963) description of Eichmann has been seriously questioned by a number of critics who point out that Eichmann pursued his goal of shipping as many Jews as possible to the concentration camps with an overzealousness and perseverance that was clearly beyond the call of duty (Cesarani, 2004; Lipstadt, 2011; Robinson, 1965). Arendt (1966) herself, in apparent contradiction of her own thesis, offered some horrifying descriptions of the torture and brutality that accompanied the murders committed by the Nazis, clearly acknowledging that there was another face to the Holocaust than that of the dutiful soldier.

Browning (1992), as well, in describing the behavior of Reserve Police Battalion 101, provided many examples of their penchant for wanton cruelty, barbarism, and sadism toward Jews. In one case, for example, totally naked Jews, particularly the old and bearded, symbols of classic Jewry, were forced to crawl in front of their intended graves and sustain beatings with clubs, before being killed. In several instances, the killers themselves spoke proudly of their routine and unnecessary cruelty (Goldhagen, 1996). It should be noted that in the case of the Reserve Police, these perpetrators were not all SS men or ardent supporters of Hitler's regime. They were, for the most part, ordinary Germans who were not, presumably, possessed of any virulent anti-Semitism (Friedlander, 1997) or obvious inclination toward mass murder.
Yet, based on their actions at Józefów and their subsequent testimony, these men were not morally opposed to the Holocaust; when the time came, they proved themselves capable of killing Jews with eagerness, dedication, and zeal. In general, the historical evidence on the spontaneity, initiative, enthusiasm, and pride with which the Nazis degraded, tortured, and killed their victims, is utterly incompatible with the concept of obedience, and simply has no counterpart in the behavior that Milgram observed in his laboratory studies.

Resisting authority

The nature of authority in Milgram's laboratory was not absolute; participants, presumably, could have quit at any time, and the experimenter had no real power to prevent them from doing so. But surprisingly few participants in the baseline condition exercised that option. There were a number of powerful "binding" factors—both social and perceptual—that, Milgram argued, essentially locked those participants into a very narrow view of their role, a view which, in effect, prevented them from recognizing the possibility of escape. That is, although a real choice existed, many participants never fully realized or seriously considered the option to leave the experiment. But when participants were given an opportunity to extricate themselves from the situation or defy the authority, which occurred in many experimental modifications of the "baseline" condition, a significant majority of participants seized that opportunity (Mandel, 1998).

In ironic contrast, the Nazi perpetrators may have been very much aware of the possibility of escape or avoidance of the killing task. Consider, for example: what happened to those men who were unwilling to participate in the mass murder of Jews? It has been widely assumed (e.g., Baumeister, 1997; Miller, 1986) that if an SS man refused to carry out a killing order, he himself would be severely punished or killed. This assumption is largely based on repeated and emphatic assertions of this supposed truth by defendants at postwar trials. In addition, given the brutality and fanaticism of the SS, it seemed reasonable to assume that the externally directed terror of the SS could easily be turned inward.

But these assumptions are largely mistaken. Himmler himself had issued a written order to the effect that only those dedicated to the task should be killing Jews. At some level, Himmler understood that only the most fervent Nazis, those most deeply committed to the cause, could carry out the essential, but difficult task—difficult in the sense of physical, rather than moral, revulsion—of extermination. Failure to carry out the task was not to be punished; it was merely a matter of shame and disgrace for not measuring up to the Nazi ideal. There is not a single instance on record of harsh punishment ever being used, or even being possible, for disobeying a killing order (Browning, 1992; Goldhagen, 1985).
That many perpetrators knew they had a choice of not participating in the killing is made explicit by the testimony of one of the Reserve Police: "It was in no way the case that those who did not want to or could not carry out the shooting of human beings with their own hands could not keep themselves out of this task. No strict control was being carried out here" (Browning, 1992, p. 65). The same point is made by Commander Ohlendorf of Einsatzgruppe D, who testified at Nuremberg that he "had sufficient occasion to see how many men of my Gruppe did not agree to this [killing] order in their inner opinion. Thus I forbade the participation in these executions on the part of some of these men and sent them back to Germany" (Goldhagen, 1985, p. 23).

Again, it should be understood that "inner opinion" was not necessarily a matter of conscience. In one case, the SS officer requesting a transfer made it clear that his incapacity to continue killing resulted not from any principled disapproval of the slaughter, but rather from physical revulsion to the act, rendering him psychologically unfit for the duty. He was deeply ashamed of his inability to "sacrifice himself to the very last for the cause of Germany," but did not want to "disgrace Germany's image" by "presenting the spectacle of one … who has succumbed to cowardice" (Goldhagen, 1985, p. 29). This Nazi did not oppose the extermination of the Jews out of moral concerns. In fact, every indication was that he believed in the justice of the Einsatzgruppe's task; he simply could not endure the emotional strain of killing.

These observations again seriously challenge the argument that the murderers were at the mercy of unyielding obedience pressures. When ordered to kill innocent, defenseless Jews en masse, dispatching them by machine-gun fire into mass graves, the men drafted into both the mobile killing squads of the Einsatzgruppen and the Reserve Police units, for the most part, complied with the order.
But it is difficult to argue that compliance was due to obedience. The order to kill was not absolutely compelling, and most of the men knew that. Those who were unwilling to kill innocent, defenseless humans could avoid doing so; those who did not extricate themselves from the task, therefore, were choosing to participate in the killings. Given that choice, it makes little sense to attribute the behavior of the perpetrators to powerful obedience pressures that supposedly overwhelmed their moral sensibilities; far more compelling is the conclusion that the perpetrators were willing, perhaps even eager, to follow orders because they believed that the order to kill innocent Jews was right and just.

Conclusion

In the face of these arguments, the idea that obedience to authority was an essential element of the perpetrators' behavior is simply untenable. With respect to external appearances, Milgram may have perceived surface similarities in the two events, but as demonstrated earlier, those similarities have no greater weight in the determination of generalizability than equally striking surface differences between the two events. With respect to more basic processes, Milgram argued that the underlying mechanism of "obedience to authority" could inform an understanding of the murderous behavior of the Nazi perpetrators. But the research evidence in support of an agentic state—the core psychological transformation that explains the removal of conscience and responsibility—is extremely tenuous, if not nonexistent.

Most critically damaging to his case, however, is the evidence suggesting that the kind of obedience pressure that is at the heart of Milgram's analysis—that is, authoritarian orders that compelled atrocities in spite of powerful moral and emotional resistance to those actions—simply did not exist. There is little or no evidence of moral opposition or distress by the killers; for example, a sense of guilt or shame or wrongdoing, or concern for the victims. When actions are taken that are not considered to be wrong or immoral, there is no need to invoke external causes, such as obedience pressures, to account for those actions.

There is also considerable evidence that the Nazis' behavior toward the Jewish victims was especially brutal and inhumane. Such excesses cannot be explained by obedience pressure, precisely because they exceeded any orders from above; these cruelties, however, are consistent with a sense of vindictiveness and personal satisfaction that comes from degrading and destroying the lives of a hated, despicable enemy. Finally, and perhaps most significantly, the great majority of perpetrators had choices.
Most of the men who participated in the killings were not compelled or coerced by authority to do so, but rather chose to do so. In summary, the overwhelming differences, in both surface appearances and in the underlying theoretical mechanisms that may be inferred, make it difficult to escape the conclusion that the obedience studies have little or no relevance to the psychology of the perpetrators. While Ross and Nisbett (1991) have observed that Milgram's findings "have become part of our … shared intellectual legacy" (p. 55), as has been argued throughout this paper, there is little reason to believe that obedience was the crucial factor for the perpetrators of the Holocaust. Although destructive obedience may exist as a powerful and frightening potential, that does not mean that when innocent people are killed, it is necessarily the result of obedience.


If we compare the Holocaust with Milgram's research in a less analytical, but more prosaic fashion, the differences are brutally clear. Although we may be upset, saddened, even disappointed by the behavior of Milgram's participants, the terms that are routinely used to describe the horrors of the Holocaust—for example: atrocity, inhumanity, hatefulness, wickedness—are simply preposterous in the context of Milgram's studies. Those terms suggest a psychological state that is almost the antithesis of that observed in the lab studies. Milgram's research participants may have obeyed a malevolent authority, but unlike the Nazi perpetrators, they were not willing participants; they were not responding to malevolent drives within themselves. In contrast, the eagerness and enthusiasm with which so many men participated in the Nazi genocidal program simply cannot be explained as merely a matter of duty and discipline, but must instead take full account of the powerful enmity that was operating toward the victims.

In conclusion, the differences between the laboratory participants and the perpetrators of the Holocaust are too overwhelming to be dismissed. Milgram claimed that the fundamental lesson of his research is an understanding of human weakness, and the frailty of conscience, in the face of malevolent demands by authority to engage in evil that is universally recognized as wrong or immoral. In stark contrast, the fundamental lesson of the Holocaust is an understanding of the human willingness to engage in evil when that evil has been transformed, by social conditioning and state sanction, into something that is right and just, a source of great personal, national, and racial pride, and a matter that has almost nothing to do with conscience, morality, or obedience pressures.

Funding This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

Notes

1. Although the Nazis murdered millions of other innocent victims, the nature of Nazi crimes against the Jews was both qualitatively and quantitatively different from the crimes committed against other victim groups; no other group was targeted for total annihilation, or as zealously brutalized, humiliated, and persecuted. Because the focus of this paper is on Jewish victims of the Holocaust, the use of the term "victim" will imply only Jewish victims. Similarly, the use of the term Nazi is intended to refer to all those who participated in the Nazi extermination program against the Jews, including non-German collaborators, and other agents of Germany who were not Nazis.
2. Methodological problems concerning Milgram's research, for example, whether Milgram's participants really believed they were injuring another person (e.g., Mixon, 1979), or whether there is a plausible alternative to the "obedience to authority" explanation for their destructive behavior, such as "demand characteristics" (e.g., Orne & Holland, 1968), are not at issue in the present analysis.
3. The great majority of Milgram's participants were male, as were the great majority of Nazi perpetrators, and so references to these persons throughout the paper will use the male pronoun forms.
4. Subsequent studies have shown this finding to be reasonably reliable (see reviews by Blass, 1992; Miller, 1986).

Downloaded from tap.sagepub.com at UNIV CALGARY LIBRARY on December 1, 2015 596 Theory & Psychology 25(5)

References

Arendt, H. (1963). Eichmann in Jerusalem: A report on the banality of evil. New York, NY: Viking.
Arendt, H. (1966). Introduction. In B. Naumann, Auschwitz: A report on the proceedings against Robert Karl Ludwig Mulka and others before the court at Frankfurt (J. Steinberg, Trans.). New York, NY: Praeger.
Aronson, E., Brewer, M., & Carlsmith, J. M. (1985). Experimentation in social psychology. In G. Lindzey & E. Aronson (Eds.), The handbook of social psychology (3rd ed., pp. 441–486). New York, NY: Random House.
Askenasy, H. (1978). Are we all Nazis? Secaucus, NJ: Lyle Stuart.
Bauer, Y. (2001). Rethinking the Holocaust. New Haven, CT: Yale University Press.
Bauman, Z. (1989). Modernity and the Holocaust. Ithaca, NY: Cornell University Press.
Baumeister, R. F. (1997). Evil: Inside human violence and cruelty. New York, NY: Freeman.
Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram's "Behavioral studies of obedience." American Psychologist, 19, 421–423.
Berger, L. (1983). A psychological perspective on the Holocaust: Is mass murder part of human behavior? In R. L. Braham (Ed.), Perspectives on the Holocaust. Boston, MA: Kluwer-Nijhoff.
Blass, T. (1992). The social psychology of Stanley Milgram. In L. Berkowitz (Ed.), Advances in experimental social psychology (Vol. 25, pp. 277–329). New York, NY: Academic Press.
Blass, T. (2002). Perpetrator behavior as destructive obedience. In L. S. Newman & R. Erber (Eds.), Understanding genocide (pp. 91–109). New York, NY: Oxford University Press.
Brannigan, A. (2013). Beyond the banality of evil: Criminology and genocide. Oxford, UK: Oxford University Press.
Browning, C. R. (1992). Ordinary men. New York, NY: HarperCollins.
Cesarani, D. (2004). Eichmann: His life and crimes. London, UK: Heinemann.
Charny, I. W. (1982). Genocide: The human cancer. New York, NY: Hearst.
Darley, J. M. (1992). Social organization for the production of evil. Psychological Inquiry, 3, 199–218.
Dawidowicz, L. S. (1975). The war against the Jews. New York, NY: Holt, Rinehart, & Winston.
Fenigstein, A. (1998). Reconceptualizing the obedience of the perpetrators. In D. Shilling (Ed.), Lessons and legacies of the Holocaust (Vol. II). Evanston, IL: Northwestern University Press.
Friedlander, S. (1989). From anti-Semitism to extermination. In F. Furet (Ed.), Unanswered questions: Nazi Germany and the genocide of the Jews. New York, NY: Schocken.
Friedlander, S. (1997). Nazi Germany and the Jews: The years of persecution, 1933–1939 (Vol. 1). New York, NY: HarperCollins.
Gilbert, S. J. (1981). Another look at the Milgram obedience studies: The role of the gradated series of shocks. Personality and Social Psychology Bulletin, 7, 690–695.
Goldhagen, D. (1985). The cowardly executioner: On disobedience in the SS. Patterns of Prejudice, 19, 19–32.
Goldhagen, D. J. (1992, July 13–20). The evil of banality. The New Republic, 49–52.
Goldhagen, D. J. (1996). Hitler's willing executioners. New York, NY: Knopf.
Gordon, S. (1984). Hitler, Germans, and the "Jewish question". Princeton, NJ: Princeton University Press.
Herf, J. (2006). The Jewish enemy. Cambridge, MA: Harvard University Press.
Hoffman, E., Myerberg, N. R., & Morawski, J. G. (2015). Acting otherwise: Resistance, agency, and subjectivities in Milgram's studies of obedience. Theory & Psychology, 25, 670–689.


Kelman, H. C., & Hamilton, V. L. (1989). Crimes of obedience: Toward a social psychology of authority and responsibility. New Haven, CT: Yale University Press.
Koonz, C. (2003). The Nazi conscience. Cambridge, MA: Harvard University Press.
Lifton, R. J. (1986). The Nazi doctors: Medical killing and the psychology of genocide. New York, NY: Basic Books.
Lipstadt, D. E. (2011). The Eichmann trial. New York, NY: Schocken Books.
Lozowick, Y. (2002). Hitler's bureaucrats: The Nazi security police and the banality of evil. London, UK: Continuum.
Mandel, D. R. (1998). The obedience alibi: Milgram's account of the Holocaust reconsidered. Analyse & Kritik, 20(1), 74–79.
Mantell, D. M., & Panzarella, R. (1976). Obedience and responsibility. British Journal of Social and Clinical Psychology, 15, 239–245.
Mastroianni, G. R. (2015). Obedience in perspective: Psychology and the Holocaust. Theory & Psychology, 25, 657–669.
Milgram, S. (1963). Behavioral studies of obedience. Journal of Abnormal and Social Psychology, 67, 371–378.
Milgram, S. (1967). Obedience to criminal orders: The compulsion to do evil. Patterns of Prejudice, 1, 3–7.
Milgram, S. (1974). Obedience to authority: An experimental view. New York, NY: Harper & Row.
Miller, A. G. (1986). The obedience experiments: A case study of controversy in social science. New York, NY: Praeger.
Miller, A. G., Collins, B. E., & Brief, D. E. (1995). Perspectives on obedience to authority: The legacy of the Milgram experiments. Journal of Social Issues, 51, 1–20.
Mixon, D. (1979). Understanding shocking and puzzling conduct. In G. P. Ginsburg (Ed.), Emerging strategies in social psychological research (pp. 155–176). New York, NY: Wiley.
Nicholson, I. (2011). "Shocking" masculinity: Stanley Milgram, "Obedience to Authority," and the crisis of manhood in Cold War America. Isis, 102, 238–268.
Nicholson, I. (2015). The normalization of torment: Producing and managing anguish in Milgram's "obedience" laboratory. Theory & Psychology, 25, 639–656.
Nissani, M. (1990). A cognitive reinterpretation of Milgram's observations on obedience to authority. American Psychologist, 45, 1384–1385.
Orne, M. T., & Holland, C. H. (1968). On the ecological validity of laboratory deceptions. International Journal of Psychiatry, 6, 282–293.
Perry, G. (2012). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. Melbourne, Australia: Scribe.
Reicher, S. D., Haslam, S. A., & Smith, J. R. (2012). Working toward the experimenter: Reconceptualizing obedience within the Milgram paradigm as identification-based followership. Perspectives on Psychological Science, 7, 315–324.
Robinson, J. (1965). And the crooked shall be made straight: The Eichmann trial, the Jewish catastrophe, and Hannah Arendt's narrative. New York, NY: Macmillan.
Ross, L. D., & Nisbett, R. E. (1991). The person and the situation: Perspectives of social psychology. New York, NY: McGraw-Hill.
Sabini, J. P., & Silver, M. (1980). Destroying the innocent with a clear conscience: A sociopsychology of the Holocaust. In J. E. Dimsdale (Ed.), Survivors, victims, and perpetrators: Essays on the Nazi Holocaust (pp. 329–358). Washington, DC: Hemisphere Press.
Tajfel, H. (1981). Human groups and social categories. Cambridge, UK: Cambridge University Press.


Tilker, H. A. (1970). Socially responsible behavior as a function of observer responsibility and victim feedback. Journal of Personality and Social Psychology, 14, 95–100.
Vetlesen, A. J. (2005). Evil and human agency. Cambridge, UK: Cambridge University Press.

Author biography

Allan Fenigstein is a professor of psychology at Kenyon College in Gambier, Ohio. His research interests include self-consciousness, paranoid thought and behavior, self-deception, media violence, aggression, and the psychology of genocide perpetrators.


Article

Designing obedience in the lab: Milgram’s shock simulator and human factors engineering

Theory & Psychology
2015, Vol. 25(5) 599–621
© The Author(s) 2015
Reprints and permissions: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/0959354315605392
tap.sagepub.com

Maya Oppenheimer
Imperial College London
London College of Communication

Abstract

This article probes the design history of Stanley Milgram’s simulated shock generator by comparing drawings and notes from Milgram’s archive in the Sterling Memorial Library at Yale with laboratory equipment and apparatus catalogues from the Archives of the History of American Psychology, University of Akron. By applying contemporaneous human factors engineering principles to the generator’s control panel layout, sequencing, and display optimisation, an argument emerges that the tailor-made device played an influential role in facilitating the behaviour witnessed in the laboratory and generalised as obedience. Such an approach puts forward a new reading of Milgram’s experiment design and his penchant for dramaturgy, and reconsiders his generalisation of obedience to social authority.

Keywords

affordance, Alphonse Chapanis, control panel, design history, experiment design, human factors engineering, Stanley Milgram, modelling, obedience

An undeniable mythology surrounds Milgram’s obedience experiments: replications, textbooks, documentaries, and popular references reiterate it with obsessive frequency, bestowing that shallow familiarity of the over-rehearsed icon. Strangely, the generator, perhaps one of the noisiest and flashiest instruments in the history of psychology, is little considered as a contributing factor to Milgram’s results. Bruno Latour assigns the term spokesperson to the scientist who objectively reports data and communicates meaning to public audiences (Latour, 1987, p. 71). Milgram was particularly skilled, and I suggest overactive, in his role as spokesperson and deserves a title more akin to dramaturge. With a rich mix of compelling variations (some of which were never

Corresponding author: Maya Oppenheimer, Imperial College London, Sherfield Building, South Kensington Campus, London, SW7 2AZ, UK. Email: [email protected]

published), trial runs, publications, interviews, and a documentary comprised of re-enactment footage (see Perry, 2015), Milgram’s authorial hand is omnipresent across these representations that convince and contrive a generalisation towards obedience to authority. The closest material witness to the behaviour made so compelling by Milgram and those who study him, however, is the generator, which upon close examination is rather particular in its operations: Milgram was a keen designer and applied his dramaturgical import to the interface of this human–machine interaction too.

To begin, I accept that Milgram’s simulated shock generator has always been a model. Models are a means of testing and disseminating speculative analogies of how phenomena operate by bridging the interstitial, abstract space between theory and data (Hesse, 1966; Morgan & Morrison, 1999, p. 14). Applying this status to a scientific instrument, particularly of the tailor-made variety, is easily done. Historians of science Sturm and Asch (2005) explain how instruments can take on this pivotal role as both an influence of and mirror for the investigator’s motives:

an important theoretical assumption is literally embodied in or enacted by the use of an instrument, and this is not an assumption that concerns the functioning of the tool as such. In a sense, it is an a priori assumption about the mind that constrains what research questions are considered sensible, and helps determine the directions in which answers to such [questions] have to be sought. (p. 18)

The simulated shock generator, therefore, is not only an explanatory device, but it is also bound up with specific, subjective intentions that are very much acknowledged in design discourse but less so in psychology. Milgram vaguely hinted at this in his interview with Carole Tavris in Psychology Today: “it was an incandescent moment, the fusion of a general idea on obedience with a specific technical procedure. Within a few minutes, dozens of ideas on relevant variables emerged, and the only problem was to get them all down on paper” (Tavris, 1974, p. 80). There is intentionality where the behaviour Milgram sought becomes “characterised in terms of technologies that are culturally dominant at a given time” (Sturm & Asch, 2005, p. 21). This is a modelling as well as a design process that creates an interface between an historical context (the experiment, technology) and human interaction. Milgram (1974) himself affirmed the device “constitutes an important buffer, a precise and impressive instrument that creates a sharp discontinuity between the ease required to depress one of its thirty switches and the strength of impact on the victim.” While the “depression of a switch is precise, scientific, and impersonal” (p. 20), it is also a particular gesture of modernisation that references 1960s American push-button consumerism and Cold War nuclear anxieties—from the everyday Hotpoint washer-dryers to the fearful launch of a nuclear missile.

Further support for focusing on the generator’s design as a significant factor is readily available. It is the only standardising feature across all experimental conditions; it showcases numerous aesthetic features that link it to contemporaneous technology and therefore context; and it is the pivot between numerous boundaries beyond real and simulation including public/professional, subject/experimenter, and authority/self.
Barry Curtis (2011) explains the function of design as “a creative process that produces objects and systems in ways that respond to need and environment, it interpolates


Figure 1. The simulated shock generator on permanent display [Instrument and Apparatus Collection]. Courtesy of the Archives of the History of American Psychology, The Cummings Center for the History of Psychology, The University of Akron.

users and negotiates priorities, and it runs a gamut from rational problem solving to flamboyant exercises in signification” (p. 264). This description of the design process alludes to Milgram’s work on the generator but also to my analysis of its role in facilitating his obedience research.

First, I assess the generator’s design development and subsequent production as a response to the twinned requirements of a laboratory instrument: data collection and a reliable user interface. Milgram designed his experiment procedure with the purposeful goal of testing obedient behaviour, and the generator’s evolution to the final version demonstrates this initiative. Several sketches, at least two prototypes, and evidence from trial runs all point to the intentional and particular mechanical affordances of the final instrument: the device’s interface, for Milgram, quantified behavioural obedience. Second, I apply human factors engineering principles to lend insight into how the generator’s unique design contributes to the overall experiment results. Third, I consider what this specific design closure means with reference to current Milgram studies, especially the suitability of the obedience label of behaviour. Instruments are of course real objects with considered operations, interfaces, and standards, and their existence in a laboratory can be layered (implied, actual, deceptive) or directly functional. But they are never neutral presences.

Designing the simulated shock generator

Figure 1 depicts the shock generator as it appears now at the Center for the History of Psychology, protected beneath Plexiglas and accompanied by the original electrode wristbands that strapped James McDonough into the alleged electric chair. A rather contrived

encounter presents the object to the museum visitor, one conceived by curators for the APA centennial travelling exhibition Psychology: Understanding Ourselves (Marsh, 2000). Ordered to approach the display by stepping only on the black tiles of a chequered floor, the observers are primed by their own obedience before they see the device. Not much is revealed in the text panel, and a nearby History Channel video-loop plays clips of Obedience (Milgram, 1965) and interviews noting the experiment’s fame. Information regarding design development is absent.

Documents of Milgram’s design process such as sketches, notes, descriptions, and receipts provide more insight than many industrial designers leave in their wake. While he alluded to the integrity and technical gravitas of the shock generator in his published work, Milgram did not publish much on the development of the generator nor openly discuss its systems operations even after the experiment was complete. In his first article (1963a) he wrote, “details of the instrument were carefully handled to insure an appearance of authenticity … No subject in the experiment suspected that the instrument was merely a simulated shock generator” (p. 373). Milgram later elaborated, “the occurrence of tension provided striking evidence of the subject’s genuine involvement in the experimental conflict” (1974, p. 171). This is false modesty. Unpublished material in Milgram’s archive exhibits a more enthusiastic view of the generator’s development and potential. Personal correspondence and reports to the National Science Foundation (NSF) are a case in point: a letter addressed to his research assistant, Alan Elms (1995) dated June 27, 1961, boasts “The apparatus is almost done and looks thoroughly professional, just a few small but important pieces remain to be built” (p. 23).
An NSF grant committee update mentions “the new device passed the acid test when two electrical engineers examined the instrument and failed to realise it was a simulated device” (Milgram, 1961a). Speculation on its credibility persists among Milgram scholars, but this relatively quiet landscape provides ample imaginative space to explore a new narrative. Milgram’s penchant for design appealed to a contemporaneous test-market, a point reflected in the sensationalist article for Esquire magazine by Philip Meyer (1971) that contains narratives of Milgram, the tinkerer-scientist,

roaming around the electronic shops until he found some little black switches at Lafayette Radio for a dollar a piece. [Milgram] bought thirty of them… [and] drilled the thirty holes for the thirty switches himself in a Yale machine shop … to create his authentic-looking machine with very scarce resources except for his own imagination. (p. 130)

This anecdote carries a DIY fantasy imperative of electronics and technology that readers, particularly Esquire’s male audience, could relate to. At the time, a vernacular of industrial design was emerging with electrical appliances, power-tools, fix-it electronics manuals (depicting the man as protagonist/hero), vehicle dashboards, and a mechanised workplace contributing to the rising phenomenon of the push-button (Nicholson, 2011). This was brought to a state of longing by some observers such as Monte Calvert (1962) who wrote in The Mechanical Engineer in America, “if there was one thing the average American considered himself, it was a mechanic, and as such he was qualified to judge engineering design and correctness” (p. 266).


Figure 2. Left (a): Milgram’s early sketch of the simulated shock generator from Spring 1960, before NSF funding and the pilot trials. Published in Milgram, 1992, p. 129. Right (b): Milgram’s sketch from SMP 11/46/165, also in Russell, 2011, p. 148. Courtesy of Alexandra Milgram.

In the end, Meyer points out, Milgram was triumphant in realising a “splendid-looking control panel dominated by the run of switches, each labelled with its voltage, and each having its own red light that flashed on when the switch was pulled” (1971, p. 130). Although the scenario of a busy Milgram wandering through New York collecting parts for his experiment is unlikely, Meyer is one of the few interviewers who lingers on the device at all. His repetition of the 30 switches also homes in on a key factor in the device’s development.

Figure 2a is the earliest known sketch of the generator and dates to Spring 1960, at which point Milgram was deep into planning the entire experiment procedure. This rendering is contemporaneous with another sketch (Figure 2b) that combines a rough plan for a control panel betwixt lines of text that set out an early conception of the experiment that placed participants on teams with role-playing nationalities. These sketches prove Milgram’s early incorporation of a simulated shock generator in the procedure, with particular attention paid to planning the control panel with a horizontal row of toggle switches. Other noteworthy features present here include index labels for each switch, what Milgram called “verbal designations,” and a subject response indicator.

If Milgram had wanted to use an instrument that resembled these sketches he would have required a custom-made device from a manufacturing company or he would have had to build one himself. What he needed was certainly atypical compared to commercially available instruments. Thomas Blass (2004) claims Milgram attempted to order a device from a commercial provider but was told his specifications could not be met until December 1960. This was too late for Milgram, who was eager to execute his pilot studies that autumn and apply for further funding for the upcoming year. Causation aside, that Milgram built his generator in-house is very important as it allowed him to manage the

design outcome. To delineate the uniqueness of what Milgram built, I will briefly outline what custom services American instrument companies offered at the time.

Commercial instrument companies were eager to diversify their products to meet the needs of a developing discipline: they branched into a ready-made electronic instrument market with updatable features, modular derivations, and a broad selection of panel components. Clients could order specific parts and assemble them in the laboratory or have company engineers assemble the device according to specification. The Archives of the History of American Psychology (AHAP) cares for such objects, many cruder and more utilitarian in appearance than Milgram’s design, as well as numerous apparatus catalogues that document their components and context of sale. The rhetoric of flexibility and versatility in these catalogues speaks to the accommodating service such companies attempted to pitch: Lee Valley Electronics with their Behaviour Analysis Apparatus range is a good example for shock generator products. Introduced as “an organisation of qualified mechanical and electrical engineers, designers, technicians and machinists,” their product range was said to possess the same varied capabilities as their personnel: “Presentation of an idea or a problem is the stimulus which channels the design and engineering capabilities of our men into our scientific objective—a practical and economical solution.” Each product embodies “craftsmanship which is designed and built into each piece of equipment manufactured” (Lee Valley Electronics Behaviour Analysis Apparatus Catalogue, c. 1960, p. 2). Grason-Stadler’s Behavioral Research (1963; see Figure 3) line was similarly marketed and shows a modular interface component similar to Milgram’s pilot trial generator.
Readymade devices were also available and met the needs of diverse laboratory requirements, including adverse stimuli devices for behavioural research. Again from Grason-Stadler, consider the Shock Generator E1064GS, designed to control the intensity and duration of an electric shock. It also contained an electronic timer and grid scrambler to regulate the delivery of the shocks to different outputs (Figure 4). The E1064GS model exhibits a more complex control panel scheme: two dials set the shock duration in seconds, and the left-hand knob sets numerical values at 0.5, 0.75, 1.0, 1.5, 2, or 3 seconds. This setting is multiplied by the right-hand frequency knob, which controls settings in either tenths of a second, full seconds, or via a remote timer. Programming the volts manually in this way, which is representative of most commercial models, involves a very different level of engagement and capability from the user—one that would certainly complicate Milgram’s tidy data collection methods, a point I will return to shortly.

It is difficult to infer the degree to which Milgram built the shock generator himself (as Meyer, 1971 suggests), but he certainly had a supervisory role. Yale archive folders contain several sketches rendered in his hand, and his experiment notebook lists hours worked and receipts in a smattering of inconsistent detail. Ronald Salmon’s name appears under the subheading “Temporary Services” beneath “List of Everyone Connected with the Project,” dated August 6, 1961. Salmon “aided in constructing and wiring simulated shock generator Model II,” for which he was paid $1.30 per hour to a total of $111.80—not a great sum considering the importance of the device and given other expenses such as Roy Superior’s fee of $50.00 for an illustration of a classroom for the Obedience Apperception Test (Milgram, 1961b). Another name, John Hartogensis, appears below Salmon’s. His role is even more vague: evidently a graduate student in


Figure 3. Catalogue page from Grason-Stadler Company Inc. advertising the stimulus source section, 1963 [Box No. 2, Apparatus Catalogs]. Courtesy of Archives of the History of American Psychology, The Cummings Center for the History of Psychology, The University of Akron.

engineering, he was paid over twice as much as Salmon for 14 hours’ work. The notation simply says “on the shock generator and designed other equipment” (Milgram, 1961b). This may have been either for the prototype built for the pilot studies or for the second, final model used in the main conditions, assisting Salmon. It is unclear to what extent Milgram would have provided the component parts for Salmon to assemble or to what specifications beyond the presented sketches either of them employed. Such


Figure 4. Grason-Stadler shock generator model E1064GS [No 604, Instrument and Apparatus Collection]. Courtesy of Archives of the History of American Psychology, The Cummings Center for the History of Psychology, The University of Akron.

vagueness allows us to infer the human factors perspective that helps isolate specific design choices in the simulated shock generator’s development and their contribution to the resulting behavioural evidence. In other words, to accommodate the experimental conditions, a significant manipulation of the usual choices designers follow was required. It goes without saying that Milgram needed to make his own instrument. Rendering the essential function of the generator as one of simulation is quite obviously counter to the selection of any regular products available, but the resulting features attest to a design process bent on demonstrating panel optimisation, which Milgram interpreted as obedience.

Introducing human factors engineering

Organisations such as the Social Science Research Council (SSRC) and the NSF were involved in shaping the mandate of the social sciences not only through their funding regimes but also by recommending and upholding “the creation of consistent tools” to establish and maintain levels of generalised authority in systems (political, administrative, penal), the efficacy of which could be tested in a laboratory setting (Farish, 2010, p. 105). In other words, administrative bodies and industry had vested interests in harnessing the ultimate capacities of the individual agent as aggregates within the social body. The designation of consistent tools was quite vague and ranged at the time from new laboratory instruments to the development of deception in the experimental environment. During and after World War II, and particularly during the Cold War years, government and military departments benefited from funding research in universities such as the Johns Hopkins University psychology laboratories that collaborated with the Office of Research and Inventions as well as the Office of Naval Research, Harvard, and Yale (where Milgram studied and taught). These projects investigated minutiae from enhanced operations via control panel sequencing and layout to clearer display communications for reduced user error. Behavioural scientists were invaluable in these trials and helped implement effective technologies into civilian life.


Military and industrial technologies have both relied upon the work of experimental psychologists and studies of perception, reaction times, and pattern recognition, but an implied directive of efficiency in design takes precedence here and prioritises objectives such as decreasing operational input, increasing human comfort when operating user–machine systems, and increasing productivity. A branch of study necessarily emerged to translate knowledge and resources from militarised and governmental operations to rationalised system applications in industry, the workplace, and at home. Materials such as aluminium for military aircraft were re-appropriated for commercial buildings just as aptitude tests for officers were re-worked for office employees. The work of adaptation and development became variously known as human factors engineering, applied psycho-physiology, design psychology, and similar derivations (Brennan, 2004).

Industrial designers and human factors engineers were preoccupied with control panels as access points to the larger efficiency of mechanical operations. Meister and Farr (1966, p. 11), writing from the Aerospace medical research laboratories in the mid-1960s, claim, “It is postulated that a logical relationship exists between conditions of system operation … and the characteristics of the panel to be designed.” If the designer fails to perform an analysis relating the needs of the operator to the functional design of the system, the result will be “an inappropriate and ineffective outcome with features inadequate for task accomplishment.” A useful definition of the field concurrent with Milgram’s obedience work exists in Applied Experimental Psychology: Human Factors in Engineering Design, published first in 1948 and reprinted into the 1970s.
Its authors, Alphonse Chapanis, Wendell Garner, and Clifford Morgan (1948/1963), were all contributors to applied psychology research at the Johns Hopkins University laboratories and were widely recognised for their human factors approach to design engineering (Capshew, 1999, pp. 144–146). Their practice filled a gap in the “engineering of machines for human use and engineering of human tasks for operating machines” (1948/1963, p. v). Studies published by human factors pioneer Alphonse Chapanis provided influential research in the alignment of human perception and abilities with control panel design to foster efficient user interfaces. He worked with industrial giants such as Bell Telephone and General Electric as well as universities, and he published various studies and prototype trials. His work influenced innumerable user–machine interfaces from airplane cockpits to everyday appliances such as touchtone telephone pads and stovetop dials. Chapanis’ developments on the relationship between mechanical controls, layout, and user agency provide valuable perspective for critiquing Milgram’s device design. The latter’s operational affordances (what mechanical tasks the user can accomplish given the design) were framed as obedience to social authority as opposed to device compliance or optimisation.

The extent to which Milgram was personally influenced by such studies is difficult to determine. Articles saved in his archive suggest at least a surface familiarity. One example that displays annotations typed by Milgram hails from professors at the University of Texas (affiliated with the United States Air Force School of Aviation Medicine at Randolph Field, Texas) who investigated the effects of stress and participants’ ability to use control interfaces. Milgram’s typed comment, “This appears to be very useful for the obedience study.
There is a good summary of stress inducing techniques” (Milgram, 1970, front cover), indicates some degree of consultation, although indeterminate in


Figure 5. Chapanis’ visualisation for a user–machine system in Man-machine Engineering, 1965, p. 20. (Please refer to author’s note).

application. Incorporation of human factors, however, is fruitful and indeed plausible given Milgram’s sustained interest in new technologies and inventing devices.

Milgram’s participants did not know that the ideal way to use the machine was to defy its design and not use it at all, despite the easy functional controls and encouragement from the environment. This is clearly counter-intuitive for a participant pool of citizens increasingly accustomed to technology designed to optimise daily tasks, a process sculpted by principles in human factors engineering. Dashboards, appliances, and electronics, for example, were modernised via push-button technologies that simplified their functions and took over operational responsibilities and even comprehension. Milgram’s generator, by comparison, framed a specific task (shock application) that made it easy for participants to project responsibility not only upon the experimenter and the situation but also upon the device itself.

My application of human factors research in this paper relies heavily upon the writing of Chapanis and begins with a visualisation for user–machine system requirements published in his 1965 text, Man-machine Engineering (Figure 5). It is particularly fruitful in bringing human factors into conversation with Milgram’s own design plans. Behaviour does not occur in a vacuum but is the outcome of situational and social forces. This is something both human factors engineers and experimental psychologists hold in common.

The system pictured in Figure 5 consists of inputs and outputs from the working environment that affect the user–machine relationship. Human factors

engineering assesses these case-specific environmental variables, which are often sensory in nature (temperature, noise, acceleration), and balances them with a suitable design that enables the user/agent to work effectively in an optimal condition (low stress, safe). There is a sequential relationship here, logically beginning with the controls, machine operations, and displays that cycle back to the user, who then inputs adjusted or continued applications to the machine.

Synergies between this visualisation and Milgram’s apparatus quickly emerge. Forces in the working environment relate to dependent variables and social factors in the laboratory: the sponsorship of Yale, the rigged delegation of roles, the sample shock, the learner’s recorded responses, the experimenter’s scripted prods, the grey coat, the appurtenances of the laboratory that have no function beyond artifice. The simulated shock generator is the gateway for the participant to become an agent in the experimental environment by comprehending and initiating the controls (volt switches), enabling the machine operations (shock administration), and subsequently confirming and processing their action via display feedback (meters, lights, noise).

Applying Chapanis’ model to Milgram’s own offers two advantages: it helps outline how Milgram designed the behavioural setting, and it also questions the appropriateness of his obedience label. The participant-operator is acting in a system designed, as all effective devices are, for user–machine compatibility and optimisation. Chapanis (1965) explains: “One essential task of the human factors engineer is to design machine displays, controls, and working environments so that they are most compatible with man’s natural abilities” (p. 21). Intuitive association of a device with its function is also known as optimisation or mapping (Norman, 2013, p. 23).

Stage I: Factors in control design

Control panel organisation is the important, initial stage of contact between the user and the machine. No single feature stands alone in detail at this stage; rather, design aspects are a summative presence and should be well and simply organised to coordinate the overall goal of the system’s operational programme (Meister & Farr, 1966). Chapanis’ model of the user–machine system visually compresses the design process: Milgram did not choreograph all of the device elements at once. Rather, he conducted a thorough pilot test process to trouble-shoot the system and improve its operations. His methodology reflects a human factors engineer’s preoccupation with “decisions about the functions that will be performed by the different parts of the system … keep[ing] in mind that the separate parts must cooperate effectively” (Chapanis, 1965, p. 18).

There are a number of control features Milgram clearly favoured, starting in the early stages of development. These are traceable via sketches developed for grant reports and mock-ups and include the two already presented in Figure 2. Particularly noticeable of these is the horizontal, left–right arrangement of switches, present again in the following two illustrations, one dated to Spring 1960 and not yet published and the second, the “Specifications of Simulated Shock Generator” sketch, also from Milgram’s archive and dating to April 1961.

Milgram decided early on to label the voltage switches with textual as well as numerical indicators. Sketches suggest a lengthy period of experimentation lasting until the spring of 1961: the number of switches, their voltage, and the manner of labelling vary


Figure 6. Left (a): Milgram’s sketch of apparatus similar in format to that in Figure 3 to accompany a report for the Higgins Committee at Yale, Spring 1960. There are several similar versions of this sketch with subtle variations characterising meticulous attention to control panel design. Right (b): “Specifications of Simulated Shock Generator”, April 1961, likely a draft sketch for an NSF progress report. Figure 6a, Milgram (1960a); Figure 6b, Milgram (1961c). Courtesy of Alexandra Milgram.

throughout. Switches are fewer in his earlier drawings and hover around 9 to 11 intervals. Each numerical value also has a text label, ranging from Safe, Very Mild Tingle, and Lethal in Figure 6a to Very Light Shock, Strong Shock, and Extreme Pain, as seen in Figure 6b.

Milgram’s experimentation reflects thinking common to the design stage of human factors engineers, who are concerned with the visual problems of the device and the controls an operator would encounter. Chapanis (1965, p. 36) mentions several key questions that arise here: What is the best kind of dial to use? How should dials be designed? What size and design of lettering should be used? What colours or lights are best designed for signalling? These priorities, according to Meister and Farr (1966), contribute to a conceptual order wherein a user learns the names and functions of the controls and associates them with an organisational whole. A user then infers the “holistic procedure of operation” (p. 11) and forms an association between the controls and the task, allowing them to perform the procedural act presented via the controls.

An innocuous piece of scrap paper from the same file as Milgram’s sketches documents further planning regarding the names and function of the controls (Figure 7; Milgram, 1960b). A range of word groupings is set in columns to describe the level of shock inferred: tingle, slight; mild, limited; moderate, moderately strong; very strong, potent; intense, danger: severe shock.
Below these word couplings is a grid with two circled numbers per label, laid out as shown in Figure 7. These notes not only suggest further experimentation with language (some labels making it into the final design) but also the control map for the first prototype model of the shock generator, Model I, which broke away from the horizontal layout of switches otherwise constant in the planning sketches. This attention to labelling is an effort to reinforce an authentic appearance but, as Milgram claimed, was also meant to introduce an opportunity for reflexive consideration and user awareness.

Downloaded from tap.sagepub.com at UNIV CALGARY LIBRARY on December 1, 2015 Oppenheimer 611

Tingle              Mild      moderate            very strong   Intense
Slight [scribble]   Limited   moderately strong   Potent        Danger: severe shock.
1   3   5   7   9   11

Tingle         Mild      Moderate            Strong        potent        Intense
Slight shock   Limited   Moderately strong   Very strong   very potent   Danger: Severe shock
2   4   6   8   10   12   — repeat 10x

Figure 7. Milgram’s plans for control panel (Milgram, 1960b). Courtesy of Alexandra Milgram.

Milgram’s students built the first generator prototype for the series of pilot runs administered by his Psychology of Small Groups class in 1960, with the support of a small grant from the Higgins Fund of Yale University. Results of this run did not yield enough data variation for subsequent analysis. Milgram suspected that the student participants (“Yalies”) either completed the procedure because of their competitive natures or because they suspected the apparatus was a fake. Elms (1995) recalls that the pilot runs were disjointed, the device was inferior, and the stooges were poor actors. Milgram’s opinion of the generator prototype was not so critical; he provided captions for the images shown in Figure 8 in his article for Yale Scientific Magazine, claiming the pilot device “worked well, but an improved version was designed for subsequent experiments” (Milgram, 1963b, p. 7). As for the departure from the consistent rendering of the horizontal control configuration, it is likely this was a matter of limited financial and mechanical resources. Recall the earlier presentation of Grason-Stadler’s modular control panel formats in Figure 3: Milgram may have acquired this control layout from the department or catalogue resources for practical, budgetary reasons, with the intention of adhering to his original design in due course and with further funds. There is not much information regarding the construction or cost of the prototype, and what exists could easily be a reference to the final device, Model II. Such a lack of tangible evidence does not mean a conversation should be missed here, since the pilot


Figure 8. Images from the pilot study, Autumn 1960, Yale. Top (a): the prototype model; Bottom (b): a student administers shocks. Published in Yale Scientific Magazine (Milgram, 1964); photos taken from prints in Milgram (1963b, p. 7). Courtesy of Alexandra Milgram.

device is a very fruitful stage for the coming-into-being of Milgram’s behavioural model. Some features from the 12-switch prototype were carried over into Model II: separate switches for each volt interval (although originally as push buttons rather than up/down toggle switches), a left–right orientation, comprehensive labels, numbers, and light indicators per switch, a brand name in the upper left-hand corner (Dyson Instrument Company, which he retained in name also), and a needle-indicator to measure implied voltage. Several new features, however, were introduced based on this prototype’s performance. While early sketches reveal a number of different volt-switch orientations, Milgram decided the final switch should be depressed a number of times to signal the participant’s full obedience. An early note in Figure 7 stipulates this should happen 10 times, which is a cumbersome repetition. By contrast, the final arrangement added more incremental


Figure 9. Diagram from Chapanis’ Man-machine Engineering (1965, p. 107) laying out optimal control selections with respect to command response. Industry models use dials to regulate shocks whilst Milgram separates 30 individual toggle command switches—a very inefficient layout. (Please refer to the author’s note.)

switches (30 in total), and the final switch, “X X X”, should be depressed three times only, one for each X. Milgram therefore broadened his potential for data points by adding more switches; he also achieved what is known in human factors as control sequencing or, more specifically, escalation: the ritualisation and simplification of control presentation to enforce a specific mechanised task. Chapanis recommended control sequencing for efficient panel design, but how this manifested was Milgram’s own derivation. It becomes very significant that Milgram finely differentiated the various shock levels and added so many switches, particularly up/down toggles. As evidenced in Figure 9, such switches were the industry choice for a system response command requiring on/off functionality only, with a bank of no more than four discrete settings. Controls should match limb and task (Chapanis, 1965, p. 101; Chapanis et al., 1948/1963, p. 25; Javitz, 1952). In fact, more than 24 settings should be represented with a knob, dial, or thumbwheel (the bottom row of Chapanis’s visualisation in Figure 9), but Milgram claimed the same reasoning as for the verbal designations: separate toggles aided data accumulation and provided a repetitive act that would give participants pause to consider the increasing strengths of the shocks. In human factors research, however, escalation of this nature reinforces the interaction, which suggests that Milgram not only designed an instrument to aid his experimental reality but to prop and sustain the behaviour under investigation and, most importantly, even to encourage the action that supports its analysis. 
Furthermore, Milgram’s design reduces human error commonly encountered in control misuse. Since the panel had such limited interactive capabilities, users would have sensed limited

agency, little choice. Likely, if a participant were required to set a number of dials (as with the device in Figure 4) to administer the shock, results would have become complicated due to interference, confusion, and error. Relevant here is Condition 11, wherein Milgram allows participants to choose the level of shock administered rather than work methodically up the scale. This yielded the lowest participation rate: 2.5%. As a result he did not consider it obedience, since participants could choose their level of punishment (Milgram, 1974, p. 60, footnote). The final generator design would not pass a human factors checklist due to its inefficient and redundant control features. What Milgram built was a custom-made instrument for the thorough and articulated display of how a participant would carry through a task despite knowing that it was harmful. DC power supplies and capacitors were commonly used in contemporaneous research involving shock administration to human participants, not an involved prop like Model II (Ekman, Frankenhauser, Levander, & Mellis, 1964; Frankenhauser, Froberg, & Mellis, 1965; Izard & Livsey, 1964). Even the controversially similar experimental procedure of Arnold Buss used a real generator that required the complicit participant to disconnect the electrodes (Buss, 1961). Typically these devices were used in laboratory experiments to test the effects of punishment on the participant, meaning the trained experimenter would set and determine the shocks as part of the independent variable. Given this use-scenario, the controls could exhibit some intricacy since experienced investigators or technicians typically used them. Other control panel features commonly found in commercial models were incorporated by Milgram for aesthetic purposes only; these included contrasting labelling, brand identity and model number, indicator lights, and activation switches.

Stage II machine operations

Up until this point in the user–machine interface development, Milgram’s generator could capably deliver shocks. The infrastructure and controls certainly made it possible. This, however, is where the verisimilitude to a generator ends and artifice begins: Milgram’s design becomes a model of a device itself, and the manipulation of its controls sets in motion a string of questionable operations. Deception in laboratory methodology continues in contemporary psychology practice (Pettit, 2013): instruments are often an integral part of these investigations, and they either mask or measure the true focus of an experiment. The key point here is one of ethics: psychologists have a responsibility to weigh the costs and benefits of any harm their participants might experience (physically or emotionally), and if the extent of harm is part of the apparatus (as it was with Milgram), it must be simulated somehow. Devices were often used to help disguise the deception and were seldom the simulation itself, save one particular niche where this is ubiquitous: systems and personnel testing. The individual in behavioural research as it pertains to human factors became a malleable resource for assessment, measurement, and training. According to Branden Hookway (2004), who writes about the design development of mechanised spaces, this emphasis on testing the individual demonstrates a specific historical trend:

The feedback loop between human beings and technologies (and also between groups and systems) grows ever more intimate. In addition, as the language of systems became at once


Figure 10. Interior of the simulated shock generator showing the wiring and hollow body [Instrument and Apparatus Collection]. Courtesy of Archives of the History of American Psychology, The Cummings Center for the History of Psychology, The University of Akron.

more abstract and generalisable, and more specifically applicable to problems in the world, those processes once deemed irretrievably embedded within natural, social, or material interactions became accessible to a rationalisation that seeks not so much to denature them as to reanimate them within a controlled environment. (p. 44)

Hookway’s feedback loop resonates with design critiques of Milgram’s research, particularly in terms of what operations were afforded by the laboratory control panel and whether this is generalisable behaviour in a controlled environment or interface optimisation. Hookway links efficient military spaces like cockpits to the industrious post-war constructs of everyday office design and production systems. Following Hookway’s precedent, comparing his published man-machine model for a WWII aircraft cockpit (2004, p. 23) with Chapanis’ own visualisation in Figure 5 further suggests Milgram’s generator advantageously streamlines operations. Safety provisions, as well as efficiency and comfort factors, do not translate appropriately across these models. I have already addressed the latter two: Milgram designed the generator to be inefficient so as to extend—arguably to enforce—user interaction and to yield data. That it was uncomfortable to administer was not a physical conflict in the first instance but a psychological-come-physiological struggle under adverse conditions, which Milgram generalised as blind obedience to the experimenter and not to the increasingly nefarious device. Safety provisions existed insofar as the shocks were simulated, a detail users should not decipher. How would a mechanism such as an abort button or mute switch—a domesticated eject button—affect Milgram’s results? An ON/OFF switch was included for decorative authenticity: Gina Perry (2012) cites an example of a female participant rising to turn off the device, presumably at its power source, but the provision of an abort switch would

diversify the mechanical operations available to the user, offering an appropriate user agency in the technical as well as the social procedure. Some scientists read an essential separation between the design of a laboratory experiment and the design of any other system that measures machine development. David Meister (1985) takes this stance in Behavioural Analysis and Measurement Methods, where he explains the hectic pace of development, onset of obsolescence, and high cost of developing simulation equipment as forbidding factors for such highly resourced testing. The incompatibility of these frames of reference is logistical, making Meister’s grievance important but not debilitating: I am not conflating the two contexts but cross-pollinating procedures to point out how Milgram usurps systems design to ensure that the machine operation (simulation) is effective for his motives.

Stage III device displays

A device display in man-machine systems refers to those features that provide information that “an operator cannot or does not get directly through his senses” (Chapanis et al., 1963, p. 118). Displays (dials, counters, meters, lights) are perceptual challenges for designers and are highly tested within applied psychology according to two main concerns: (a) what does the operator need to know? and (b) how can this information be effectively communicated? Chapanis et al. (1963) outline three functions of communication displays to keep in mind when satisfying these concerns. First, displays can offer information in an either/or fashion, which indicates whether a component is operational, engaged, or malfunctioning. Second, qualitative indicators give information best with reference to range rather than precise measurement. Finally, quantitative readings facilitate precise numerical readings that communicate information related to machine operations (Chapanis et al., 1963, p. 120). These determinants were of course the outcome of detailed trials that investigated how a good display should present “information in a form that can easily be converted into correct decisions and appropriate actions … with a particular environment and system in mind” (Chapanis, 1965, p. 36; Sleight, 1948). Findings from these experiments were extremely detailed, and necessarily so, since this research influenced the design of cockpits, factory generators, and hospital equipment, to name a few. Milgram’s use of display dials, like his toggle switches, is unnecessarily abundant. That said, their presence does abide by the main points of human factors display rationale: functionally related controls are clustered together, displays are logically located and in proportion to the controls (known as the control-display ratio), and interstitial spaces are plain, making the overall impression one of functionality and breadth. 
Two shock supply units from the AHAP collection offer a fitting comparison of good practice (Figure 11). They are functional stimulus generators used by scientists in behavioural laboratories and donated to the instrument collection at the University of Akron. Light indicators signalling a machine operation are not uncommon on commercial models as well as laboratory devices. They signal power activation, administration, or malfunctioning. Ralph Gerbrands’ Shock Box (Figure 11b) for operant conditioning on


Figure 11. Left (a): Lafayette Shock Supply Unit; Right (b): Ralph Gerbrands Co., Operational Conditioning Shock Box [Nos 302, 510, Instrument and Apparatus Collection]. Courtesy of Archives of the History of American Psychology, The Cummings Center for the History of Psychology, The University of Akron.

rats and small animals demonstrates a possible layout: the pilot light seen on the lower panel next to the power switch indicates the device’s readiness, while the amber shock light seen on the top portion of the panel illuminates when the red shock button is depressed. Placing the lights near the related controls is an important feature of effective panel design, a principle Milgram carried out in the extreme with an indicator light for each toggle. Their colouring and arrangement held further ultimate effect: Chapanis et al.’s manual (1963) stipulates warning lights should be red and aligned along the central line of sight on the panel, with steady lights being more effective than flashing lights (pp. 185–187). Lafayette Instruments’ shock supply unit (Figure 11a) demonstrates a simple and clean control panel: instead of light indicators, the designer placed a voltmeter to communicate the strength of current delivered to the participant. A needle travels a curved scale from left to right in logical sequence using legible numbering systems and graduation marks. Milgram combined all possible display features on his control panel from across both examples in Figure 11. Like a macabre lightshow, the indicator lights are plentiful and centred in a horizontal row; there is a voltage energiser light, a General Electric panel meter, pulse frequency and attenuator dials, and over-abundant labelling of them all. All these features offer simulated feedback that enforces the machine operation of deception. The participant-operator, however, is not required to read any of the information cued by the voltmeter, nor are they meant to touch the dials. 
They simply follow the toggles, one by one. Such over-abundant features reinforce the participants’ interaction with the machine in a gradual, sequential manner, but also serve to limit their agency in its operation: what Milgram later refers to in his book as the “foot in the door” phenomenon.


Conclusion: Designing obedience

Some of Milgram’s critics point out that the simulated shock generator is not the sort commonly found in psychology laboratories; this point is made to raise ethical or theoretical reservations about Milgram’s overall procedure. Don Mixon (1989), for example, takes issue with the situational dilemma facilitated via the apparatus and disagrees with Milgram’s generalisation of malevolent or immoral obedience. The conflict at hand was not so easily discernible to participants, and the generator was a considerable source of confusion and projection. Nestar Russell (2011), writing more recently on the origins of Milgram’s experimental apparatus, sees a failure to properly generalise the data and explain why most of the participants obeyed within the framework. Applying human factors to the discussion of Milgram’s procedure is a useful lens that helps penetrate Milgram’s experimental framework and questions whether the obedience label, or indeed an assessment of behaviour—as opposed to action or job, compliance, or aptitude to a designed environment—is appropriate. Guided by Chapanis’ visualisation of human factors principles, I demonstrated how Milgram made strategic use of the control panel, machine operations, and display features to design an instrument that invited users to pursue an escalating, reinforced technical procedure. I set out specific points where Milgram adhered to (control sequencing, display feedback, simulation) and diverged from (number of controls and nature of feedback) accepted human factors instruction, in order to facilitate a specific user–machine interaction. Prioritising the generator as a material presence with agency reveals the extent to which Milgram engineered the behaviour documented at Yale and Bridgeport. The conditions resemble demonstrations rather than experiments and reveal the dramaturgical hand Milgram also extends to his strategically framed publications. 
Recalling Curtis’ explanation of the function of design as “a creative process that produces objects and systems in ways that respond to need and environment” (2011, p. 264), it appears Milgram has something to remind us about the role of designed objects in our social and intellectual world, and it is not about blind behavioural obedience to authority but rather a larger manipulation of our deference to structures of investigation and objective knowledge formation. I suggest the generator is a material representation of the conflicted motives and histories that we continue to cantilever around Milgram’s experiment. We can make good use of the generator as a critical tool or entry point, and not just as a relic of experimental science silenced behind a Plexiglas enclosure. As the generator continues to appear in references to Milgram’s experiment—both within and beyond the laboratory—it becomes instantly recognisable, a visual representation of his work. It is important to consider how and why this is so, and how the generator enriches and challenges our deference to Milgram’s research, which sustains such varied and prolonged interest long after contradictions have emerged (and may continue to emerge).

Author’s Note

Attempts to clear reproduction rights for the Chapanis diagram (Figure 9) were made without reply; the publisher is out of business, and no rights could be determined after asking AHAP, which holds Chapanis’ archive. The author was told that fair use is arguable.


Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: Funding support for this research was generously provided by the Design History Society via their Student Travel Award (2013).

References

Blass, T. (2004). The man who shocked the world: The life and legacy of Stanley Milgram. New York, NY: Basic Books.
Brennan, A. (2004). Forecast. In B. Colomina, A. Brennan, & J. Kim (Eds.), Cold war hothouses: Inventing postwar culture from cockpit to playboy (pp. 55–90). New York, NY: Princeton Architectural Press.
Buss, A. (1961). The psychology of aggression. London, UK: John Wiley and Sons.
Calvert, M. (1962). The mechanical engineer in America 1830–1910. Baltimore, MD: Johns Hopkins Press.
Capshew, J. H. (1999). Psychologists on the march: Science, practice and professional identity in America 1929–1969. Cambridge, UK: Cambridge University Press.
Chapanis, A. (1965). Man-machine engineering. London, UK: Tavistock.
Chapanis, A., Garner, W. R., & Morgan, C. T. (1963). Applied experimental psychology: Human factors in engineering design. London, UK: John Wiley & Sons. (Original work published 1948)
Curtis, B. (2011). Dinosaur design. In L. Atzmon (Ed.), Visual rhetoric and the eloquence of design (pp. 245–276). London, UK: Parlour Press.
Ekman, G., Frankenhauser, M., Levander, S., & Mellis, I. (1964, May). Scales of unpleasantness of electrical stimulation. Scandinavian Journal of Psychology, 5, 257–261.
Elms, A. (1995). Obedience in retrospect. Journal of Social Issues, 51, 21–23.
Farish, M. (2010). The contours of America’s cold war. Minneapolis: University of Minnesota Press.
Frankenhauser, M., Froberg, J., & Mellis, I. (1965). Subjective and physiological reactions induced by electrical shocks of varying intensity (Report No. 182). Psychological Laboratories, The University of Stockholm.
Grason-Stadler Behavioral Research Equipment Catalog. (1963). Archives of the History of American Psychology (Box No. 2, Apparatus Catalogs). The Cummings Center for the History of Psychology, University of Akron, OH.
Hesse, M. (1966). Models and analogies in science. Notre Dame, IN: University of Notre Dame.
Hookway, B. (2004). Cockpit. In B. Colomina, A. Brennan, & J. Kim (Eds.), Cold war hothouses: Inventing postwar culture from cockpit to playboy (pp. 22–54). New York, NY: Princeton Architectural Press.
Izard, C. E., & Livsey, W. J. (1964, February). The effects of experimenter attitudes and feelings on response to painful stimulation (Unpublished Technical Report No. 21, Contract No. 2149(03)). Vanderbilt University, Nashville, TN.
Javitz, A. E. (1952). Introduction to human engineering in product design. Electrical Manufacturing, 49, 90–95.
Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Cambridge, MA: Harvard University Press.
Lee Valley Electronics Behaviour Analysis Apparatus Catalogue. (c. 1960s). Archives of the History of American Psychology (Box No. 2, Apparatus Catalogs). The Cummings Center for the History of Psychology, University of Akron, OH.
Marsh, C. (2000). A science museum exhibit on Milgram’s obedience research: History, description, and visitors’ reactions. In T. Blass (Ed.), Obedience to authority: Current perspectives on the Milgram paradigm (pp. 145–159). London, UK: Taylor & Francis.


Meister, D. (1985). Behavioural analysis and measurement methods. New York, NY: Wiley-Interscience.
Meister, D., & Farr, D. E. (1966, September). The methodology of control panel design: Checkout and hazard control techniques (Unpublished report, Project 8119). Performance Requirements Branch, Human Engineering Division of the Behavioural Sciences Laboratory, Aerospace Medical Division, Wright-Patterson Air Force Base, OH.
Meyer, P. (1971, February). If Hitler asked you to electrocute a stranger, would you? … Probably. Esquire, 72, 73, 128, 130, 132.
Milgram, S. (1960a). [Untitled drawing]. Stanley Milgram Papers (Box 46, Folder 165). Yale University, New Haven, CT.
Milgram, S. (1960b). [Untitled drawing II]. Stanley Milgram Papers (Box 46, Folder 165). Yale University, New Haven, CT.
Milgram, S. (1961a). Correspondence. Stanley Milgram Papers (Box 43, Folder 127). Yale University, New Haven, CT.
Milgram, S. (1961b). Obedience notebook. Stanley Milgram Papers (Box 46, Folder 163). Yale University, New Haven, CT.
Milgram, S. (1961c). Specifications for simulated shock generator [Drawing]. Stanley Milgram Papers (Box 46, Folder 165). Yale University, New Haven, CT.
Milgram, S. (1963a). Behavioural study of obedience. Journal of Abnormal and Social Psychology, 67, 371–378.
Milgram, S. (1963b). Draft of “Technique and first findings of a laboratory study of obedience to authority”. Stanley Milgram Papers (Box 55, Folder 17). Yale University, New Haven, CT.
Milgram, S. (1964). Technique and first findings of a laboratory study of obedience to authority. Yale Scientific Magazine, 39, 9–11, 14.
Milgram, S. (Producer & Director). (1965c). Obedience [DVD]. United States: Penn State University Audio-visual.
Milgram, S. (1970). Writings of others S–Z. Stanley Milgram Papers (Box 47, Folder 187). Yale University, New Haven, CT.
Milgram, S. (1974). Obedience to authority: An experimental view. New York, NY: Harper & Row.
Milgram, S. (1992). Some conditions of obedience and disobedience to authority. In J. Sabini & M. Silver (Eds.), Individual in a social world: Essays and experiments (2nd ed., pp. 136–161). London, UK: McGraw-Hill. (Original work published 1977)
Mixon, D. (1989). Obedience and civilisation: Authorised crime and the normality of evil. London, UK: Pluto Press.
Morgan, M. S., & Morrison, M. (1999). Models as mediating instruments. In M. Morgan & M. Morrison (Eds.), Models as mediators: Perspectives on natural and social science (pp. 10–37). Cambridge, UK: Cambridge University Press.
Nicholson, I. (2011). “Shocking” masculinity: Stanley Milgram, “Obedience to Authority,” and the “Crisis of Manhood” in cold war America. Isis, 102, 238–268.
Norman, D. (2013). Design of everyday things. New York, NY: Basic Books.
Perry, G. (2012). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. Brunswick, Australia: Scribe.
Perry, G. (2015). Seeing is believing: The role of the film Obedience in shaping perceptions of Milgram’s obedience to authority experiments. Theory & Psychology, 25, 622–638.
Pettit, M. (2013). The science of deception: Psychology and commerce in America. Chicago, IL: University of Chicago Press.
Russell, N. J. C. (2011). Milgram’s obedience to authority experiments: Origins and early evolution. British Journal of Social Psychology, 50, 140–162.


Sleight, R. B. (1948). The effect of instrument dial shape on legibility. Journal of Applied Psychology, 28, 170–188.
Sturm, T., & Ash, M. G. (2005). The role of instruments. History of Psychology, 8, 3–34.
Tavris, C. (1974, June). The frozen world of the familiar stranger. Psychology Today, 70–80.

Author biography

Maya Oppenheimer is a design writer, educator, and researcher based in London and holds a PhD in Humanities and Cultural Studies from the London Consortium. Her work currently explores object-centred transactions across art, science, and design, particularly laboratory instrumentaria and methodologies of experiment and deception. Her teaching reflects this interest: she is a visiting tutor in the School of Design at the Royal College of Art, lecturer in Visual Culture, and Senior Lecturer in Critical & Theoretical Studies at Imperial College London and London College of Communication. Maya is also an Executive Trustee of the Design History Society and has published work in multidisciplinary forums.


Article

Seeing is believing: The role of the film Obedience in shaping perceptions of Milgram’s Obedience to Authority experiments

Theory & Psychology 2015, Vol. 25(5) 622–638
© The Author(s) 2015
Reprints and permissions: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/0959354315604235
tap.sagepub.com

Gina Perry University of Melbourne

Abstract

Stanley Milgram’s film Obedience is widely used in teaching about the Obedience to Authority studies. It is frequently a student’s first introduction to Milgram’s research and has been a powerful force in establishing the scientific authority of the experiments. This article contextualizes the filming, selection of footage, and final editing of the film against growing ethical and methodological criticisms of Milgram’s research. I argue that Milgram’s film should be viewed as a response and reply to the criticisms expressed by the National Science Foundation when it refused funding for further experiments. Obedience, the film, originally conceived as a record for future researchers, was transformed into a visual document aimed at disarming critics and establishing the universality and profundity of Milgram’s findings. Milgram aimed in the film to reconcile the quantitative and the qualitative aspects of the experiments through a scientific narration and footage of participants in action. A close reading reveals that while the film is scientifically unconvincing, and an unreliable account of Milgram’s research, it succeeds spectacularly as arresting and compelling drama.

Keywords

history of psychology, obedience to authority, psychology films, social psychology, Stanley Milgram

Of course, research participants can resist the power of the experimenter. However this is not always recorded for posterity: Because experimenters have control of the write-up, they have the power to write history. (Spears & Smith, 2001, p. 320)

Corresponding author: Gina Perry, School of Culture and Communication, Faculty of Arts, University of Melbourne, Parkville, VIC 3010, Australia. Email: [email protected]


Stanley Milgram’s obedience research is as well-known for its dramatic results as it is for its theatrical execution. Residents of New Haven who volunteered for Milgram’s experiment found themselves part of an elaborate hoax aimed at covertly studying obedient behavior in the laboratory. Milgram collected an enormous and varied amount of quantitative and qualitative data from the 780 participants who took part in the research. Yet the best known representation of Milgram’s participants is from the film Obedience (Milgram, 1965c). During the final days of his research, Milgram recorded a group of participants on film for the purpose of informing researchers intending to replicate the study. However, between the filming in 1962 and the editing in 1965, NSF funding for the research was withdrawn and the experiments became the subject of considerable controversy. The film, initially accessible only to college faculty, has escaped the confines of academia and reappeared in numerous television documentaries. Its various incarnations have been viewed over 2 million times on YouTube. In this paper I will argue that viewers of the film are themselves subject to an elaborate deception. The original footage was edited to produce a dramatic document aimed at the general public that vindicated Milgram’s methods and confirmed the validity and generalizability of his findings. The film was carefully crafted to give the appearance of verisimilitude and to convince viewers that the events depicted on the screen were representative of his research. I will argue that the film, while failing in its original intention to provide a blueprint for other researchers interested in replicating the study, succeeds spectacularly as propaganda. 
While the role of the film Obedience (Milgram, 1965c) in generating publicity for Milgram’s obedience experiments has been acknowledged, historians have tended to focus on journalistic and textbook accounts of the experiments in disseminating and establishing the authority of the research findings (Laurent, 1987). The ambiguity of participants’ performances in the film and its subversion of Milgram’s accounts of blind obedience and his theory of the agentic state have already been explored elsewhere (Perry, 2013). DeVos (2010) examines the film as part of a discussion of post-war psychologization and McCarthy (2008) positions the film as part of the post-war tradition in which “real life drama” became “art with a larger, liberal-reformist social purpose” (McCarthy, 2008, p. 26). This article frames the film Obedience (Milgram, 1965c) as a reply to critics and as an attempt by Milgram to bridge the gap between the qualitative and the quantitative information he collected about participants in the course of the obedience studies.

Quantitative focus

A rudimentary survey of psychology textbooks reveals a common depiction of the Obedience to Authority (OTA) findings as unequivocal evidence, in statistical form, of the human capacity for evil. The statistics, it would seem, speak for themselves. Stanley Milgram’s obedience research, which appears to demonstrate that the majority of people can be swayed by the commands of an authority to torture another person, bases its claims on the statistical results Milgram achieved in the course of his studies at Yale. The finding most often associated with Milgram’s research is that 65% of participants will continue to maximum voltage on the shock machine despite the learner’s protests, cries, and complaints. Standard accounts of Milgram’s research commonly report that 65% of people will obey an authority’s commands to inflict seemingly painful electric shocks on another person.

Hard data certainly lends the weight of authority to any conclusions we may like to draw from the obedience research. However, contrary to common depictions of his findings, quantitative conclusions from Milgram’s research are widely misunderstood. This is partly to do with the piecemeal and sporadic nature of their publication and the fact that Milgram’s first journal article, which attracted considerable media attention, reported an obedience rate of 65%. Subsequent variations were reported in different journal articles over a 10-year period and some experimental variations were not reported at all. The first complete list of all experiments and results was published 51 years after the experiments were completed (Perry, 2013). In fact, taken as a whole, less than half (43.6%) of Milgram’s participants went to maximum voltage and 56.4% of participants disobeyed (Haslam, Loughnan, & Perry, 2014).

This quantitative focus is not surprising. Milgram was part of a psychological tradition in which ideal experimentalists were seen to be objective and unbiased, concerned with empiricism and the pursuit of factual truths. Danziger (1990) argues that by the turn of the 20th century the “mystique of the laboratory and the mystique of numbers” (Danziger, 1990, p. 185) and the use of jargon in lieu of everyday language were part of a process of psychology distancing itself from common knowledge and being accepted as “expert.” Morawski (2005) notes how unwelcome the reflexive self became in a discipline where subjectivity had quickly become both “dangerous” and a potential contaminant (Morawski, 2005, p. 80).
Rhetoricians such as Gross argue that far from being “the route to certain knowledge” (1990, p. 3), science is a field in which its practitioners must argue persuasively for their knowledge claims to be accepted. In the social sciences, there is growing recognition that the field has adopted its own “persuasive rhetoric” (Billig, 1994, p. 309) and that the success of a discipline can be attributed in part to scientists’ ability to convince themselves and others that their research and the knowledge generated from it is valid. Being able to write persuasively and in an appropriate form and style increases the likelihood of publication, circulation, and validation of research findings. While Gross and Billig are describing rhetoric in written texts, I will argue in this paper that a close analysis of Milgram’s film Obedience (1965c) reveals a similar constructive process aimed at persuading the viewer of the validity and generalizability of Milgram’s findings. Gross notes that the standard style of scientific writing typically reads as if it is “nothing less than the description of reality” (Gross, 1990, p. 17). The film Obedience adopts a style that reinforces the impression that viewers are in effect watching the experiment “live” behind the two-way mirror.

Qualitative data extensive but largely unpublished

Qualitative data about Milgram’s participants has been published in a similarly sporadic and piecemeal fashion. However, far less of the qualitative than the quantitative information that Milgram gathered from his participants has made its way into print. And yet Milgram spent more research funding gathering, processing, transcribing, and analyzing participants’ feedback after the experiments were over than he did on the conduct of the experiments themselves. The bulk of this material, however, remains unpublished. Yet Milgram had an enormous amount of qualitative data about his participants to draw on. This included background data and audio recordings of each participant, post-experimental and follow-up interviews with individuals and groups of participants, comments from participants supplied with completed follow-up questionnaires, and notes on phone calls and letters from participants after the experiments were over.

Milgram did publish some qualitative information about his participants in journal articles and in his 1974 book, Obedience to Authority. Specifically, he reported selectively on participants’ responses to some questions on the follow-up questionnaire, as well as drawing on conversations with participants and his own observations to depict typical participant reactions. For example, in his first journal article Milgram described a businessman reduced to a “twitching stuttering wreck” (1963b, p. 377), presumably to illustrate the stress participants underwent during the experiment and to offer evidence of their belief in the experimental illusion. In his 1974 book, Milgram presented a series of brief portraits of both obedient and defiant participants: “firm and resolute” medical technician Gretchen Brandt who calmly resists (Milgram, 1974, p. 84), “fluent and garrulous” housewife and youth worker Elinor Rosenbloom (Milgram, 1974, p. 80), “brutish” and “mesomorphic” Bruno Batta the welder (Milgram, 1974, p. 45). These participants are described in terms that suggest something about how their personality and background have contributed to their behavior in the lab, a message at odds with Milgram’s explanation that it was situation rather than personality that affected obedience rates.
Despite the wealth of qualitative data Milgram collected, his participants appear most typically in his publications as little more than an extended anecdote. Taken together, these brief glimpses of “typical” participants, with their snatches of dialogue and depiction of body language and appearance, serve to reinforce the quantitative conclusion and illustrate a larger point in Milgram’s theorizing.1

Correspondence between Milgram and his funding agency sheds some light on how and why this focus on the participants’ perceptions and reactions to the research became an issue. It was the National Science Foundation that provided the impetus and the funding for Milgram to gather and analyze the qualitative data about his participants. In July 1961 the National Science Foundation approved a grant of $24,700 to fund Milgram’s initial round of obedience experiments. Six months later, with just over 16 of the 24 variations complete, Milgram submitted a second application in time for the February 1962 closing date. This fresh application was for $36,000 to conduct more experiments exploring links between obedience and aggression, conformity, organizational structures, and constructive obedience (Milgram, 1962a). Within days Milgram had had a reply from Robert Hall, Program Director for Sociology and Social Psychology, saying the NSF would let him know of their decision “in early May” (Milgram, 1962b). In April, Hall and two other members of the panel arranged to visit Milgram at Yale and watch the experiments in action (Milgram, 1962d). The NSF rejected his application for funding to conduct more research. Instead of the $36,000 he had requested, the NSF gave Milgram $3700 to cover the costs of completing the final two experiments in Bridgeport by the end of May 1962 (Milgram, 1962c).

Milgram likely sensed during their visit that the NSF was unlikely to fund further research. Within a fortnight of the panel’s visit, he was seeking quotes for the cost of making a 20-minute film and on May 22 authorized Ed English from Yale’s Audio Visual Centre to buy 6 x 1200 feet rolls of 16mm film, enough for 3 hrs 20 mins of filming (Milgram, 1962f). The NSF had been wary about the ethics of the research and the effect of the experiment on people taking part, and there had been some delay on the first installment of Milgram’s initial grant as a result (Perry, 2013).

In the letter rejecting Milgram’s second application, Hall outlined the reactions of the 20-member panel to Milgram’s completed research and explained their grounds for refusing to fund any more experiments. Hall wrote that he was providing detailed feedback to “help guide SM in the writing up of his research” (Milgram, 1963c). Hall’s letter (Milgram, 1963c), written in the period before the research was public, is extraordinarily prescient, foreshadowing the next half-century of ethical and methodological criticism of the research.2 Hall described the weaknesses the panel had identified that caused them to refuse Milgram’s request for further experiments:

Many (though not all) reviewers were uncomfortable about the effects of the experiment on the subjects. Some felt the study was worthwhile despite this; others felt that no matter what the scientific merit of the research, or the value of the results, the experiments should not be done. Many emphasized the need for careful de-hoaxing and reassurance of the subjects before they left the laboratory. A second most frequent criticism concerned the lack of theoretical guidance for the research. Many reviewers felt that your research demonstrates without explaining. Although the effects you were getting were startling, these reviewers felt that you could not say anything about the reasons for the behavior or the psychological mechanisms associated with it. They placed a low priority on scientific research concerned with parametric variations in a standard experimental situation, partly because of doubts about the generality of the results and partly because you seemed little concerned with using all kinds of available evidence to place the results in some broader explanatory framework. (as cited in Milgram, 1963c)

Hall continued by describing some of the specific criticisms from members of the panel, who noted that Milgram made no real use of related research in the conceptualization of his study. He went on to provide direct quotes from members of the panel about Milgram privileging quantitative over qualitative data:

Principal investigator makes too much of his operational dependent variable (the last lever depressed by S) simply because it yields a “precise numerical value.” There are several motives and restraints operating in the situation aside from compliance or obedience, but these are not considered. The concentration on one precise measure seems to distract his attention from much other information. (as cited in Milgram, 1963c)

Milgram had ignored the participants’ point of view. Hall urged him to take a leaf from Asch’s book and interview participants about how they regarded the experiment. Without this evidence, Hall pointed out, Milgram had no basis on which to generalize beyond the lab:

Interpretation of the observed phenomena depends upon how subjects define the experimental situation: what kind of a situation does S find himself in, what are his expectations, what does he regard as its limits, how does he view its requirements? How does he conceptualize the setting, the experimenter, the stooge? Without this knowledge, the principal investigator cannot say what “obedience” means in this setting. He ignores the subjects’ phenomenology.

The principal investigator should get post-experimental interviews to find out how Ss perceived various features of the experimental situation. Asch did conduct such interviews and they provided invaluable data, facilitating his interpretation. In fact, without such data, it would have been impossible to assess whether his results were generalizable to real-life situations or were artefacts of a laboratory situation. (as cited in Milgram, 1963c)

Finally, Hall argued that Milgram had provided no proof that participants believed the experiment was real:

How do we know if the situation is really credible to the Ss? It seems likely that some of them suspect there was a “catch.” This may be operating below the level of awareness of the subjects. (as cited in Milgram, 1963c)

By the next funding round in April 1963, Milgram was able to report to the NSF that he had administered a questionnaire to all participants as well as follow-up group interviews with psychiatrist Dr. Paul Errera. Milgram applied for a grant to process the qualitative material he had gathered, reassuring the NSF in his application that he was not conducting any further experiments (Milgram, 1963a). He applied for $26,400 for the coding, transcribing of recordings, and conduct of content analysis of participant questionnaire data, all of which would presumably generate answers to the questions raised in Hall’s letter about participants’ point of view or “phenomenology,” and the lack of theory to account for the participants’ behavior. Milgram noted in a letter to the NSF that transcribing recordings and conducting content analysis of participants’ comments on the questionnaire consumed an enormous amount of time and energy. His application this time was successful.

As a result of this funding, Milgram had the opportunity to process the huge amount of qualitative information he had about his participants. The second batch of NSF funding allowed Milgram to conduct a content analysis of participants’ comments on the questionnaires, a re-analysis of obedience rates by degree of suspicion expressed by participants, and full transcripts of audio recordings of group interviews conducted by Dr. Paul Errera. Neither the full content analyses nor Errera’s interview material was ever published.

Milgram’s qualitative analysis was complete but unpublished when, in October 1963, his first journal article about the research was published (Milgram, 1963b), receiving an extraordinary degree of media attention, all of it focusing on the dramatic statistical result in which 65% of people had obeyed. The findings spoke to contemporary preoccupations raised by the Eichmann trial about the mechanisms by which seemingly ordinary people can become agents of destruction.
The results were dramatic, arresting, and seemed to offer unequivocal evidence of the human capacity for evil. As Nicholson (2011) has noted, Milgram’s self-doubt vanished from his private papers once the experiment was published. Despite the evidence he had gathered about participants’ skepticism, trauma, and distress, and doubts about the ethics of his research that he confided in notebooks, over time Milgram misremembered events in favor of an ethically unproblematic view of his own research. For example, Milgram told interviewer Maury Silver in 1972:

Overall the reaction we get from the participants, is that it was a worthwhile experience. There were some subjects who were angry at the experiment. But as Paul Harara [sic], the interviewing psychologist said, they were chronically angry with their environment. But I think on the whole the experiment was a positive and instructive experience for subjects. And there is a difference between such experiments and others which are reprehensible. Again, if a person were in an experiment in which his little pinkie were cut off, he would emerge from the laboratory angered, irate, and immediately press charges and the experiment would be ground to a halt. And it would be justified, because people do react against a source of mistreatment. The fact that this did not occur, in the obedience experiment, is very significant. (Silver, 1972)

In fact more than one participant did feel irate enough to take action and Milgram was well aware of this.3

Seeing is believing

When it came to convincing people of the scientific status of his research, Milgram’s intuition was that what he wrote about it was never going to be as convincing as seeing it first-hand. Milgram knew the experiment was powerful theater even if it was flawed as science. This intuition about the dramatic potential of the experiments was borne out by his observation of the powerful effect the experiments had on observers who watched with him:

Several men of intelligence, having observed the experiments, felt that the procedures bared for them profound and disturbing truths of human nature … three young Yale professors, after witnessing an evening session, declared that the experience was a brilliant revelation of human nature, and left the laboratory in a state of exhilaration. Similar reactions were forthcoming from other observers. Whether all of this ballyhoo points to significant science or merely effective theater is an open question. I am inclined to accept the latter interpretation. One reason is that almost all witnesses say to their friends: “You have to see it to understand it,” or “You can’t imagine what happens unless you see it yourself; words simply won’t do.” This is precisely the kind of talk one would expect to hear in connection with a play or some other artistic performance. In genuine science a mathematical or verbal description of the phenomenon is good enough. But the truth or significance of music, or a theatrical performance, or a painting, depends on direct confrontation and experiencing of the event. So the drawing power of the experiments stem in part from their artistic, non-scientific component. This makes them more interesting; it does not necessarily make them more valuable for a developing science of man. (Milgram, 1962e)

A film was an ideal vehicle for recreating “the direct confrontation and experiencing” of the experiments as well as establishing the value of the research “for a developing science of man.” And Milgram had a ready model in Allen Funt’s Candid Camera (Funt, 1948–1992). Milgram was influenced by and admiring of Funt’s work, which revealed the intricacies of social interaction at the same time that it adopted a socially progressive agenda of “teaching responsible citizenship” (McCarthy, 2008, p. 22).

Milgram had a long interest in film-making and at one point toyed with the idea of it as an alternative career. Soon after starting at Yale he wrote to a friend,

perhaps I should not be here, but in Greece shooting films under a Mediteranean [sic] sun, hopping about in a small boat from one Aegean Isle to the next. In fact, when in Paris last April, I nearly sold my car to buy movie equipment … but went back to Harvard instead. Fool! (Parker, 2000, p. 125)

His film of the experiments signaled not only his desire to record his experiments for posterity, but a shift in his conception of himself as artist. As McCarthy points out, Milgram would later take courses in film production and add “filmmaker” to his resume (McCarthy, 2008).

The first mention of a film about his experiments appears in a letter from Milgram to the NSF in April 1963, in his application for funds to analyze and transcribe data with a view to developing a theory that would explain his results. He reported that a “short sound film” had been made and that this would be used “as an aid to other investigators who wish to replicate the experimental procedure” (Milgram, 1963c). By July 1965, over two years later, he wrote to Hall at the NSF in a budget update that he had used some of the NSF money to “process and edit the raw footage” of the film and described the film as a template for other researchers who had already expressed interest. Consequently, he asked permission to make five prints (Milgram, 1965a).

Milgram’s film began life as a “do-it-yourself” guide for other researchers. Cameraman Ed English recalled that, like the other films he made at Yale around the same time, the film of the obedience experiment was to act as a straightforward visual record (E. English, personal communication, August 22, 2013). It was filmed in May 1962, over the final weekend of Milgram’s 10-month program. However, the footage languished until October 1964 when Milgram arranged for a former Harvard filmmaker, Chris Johnson, to edit and prepare it for release. The two-and-a-half-year period between the original recording and the final stage of editing of the film was a dramatic one for Milgram. During that period, Milgram had experienced complaints by participants to Yale, refusal of further funding from the NSF, an APA review of his membership, and published criticism of the ethics of his research.
Events during this time shaped and altered his perspective on the raw footage and reshaped his purpose in editorial selection and promotion of the film. While the Obedience film (Milgram, 1965c) appears to be a “slice of life” through the keyhole, behind the scenes Milgram put much time and effort into selecting the cast, writing the script, and editing the film to tell a particular story (Perry, 2013). The film was clearly intended for an audience far broader than his professional colleagues. The result is a slick depiction, and a further deception, this time practiced on the general viewer. Ostensibly the film aims to educate viewers about the perils of obeying orders from an authority. The “didactic and prophylactic objective” (DeVos, 2009, p. 223) is a clear subtext of the film. But it also functions to reinforce the scientific credibility of the research, the credentials of the experimenter, and the universality of the results.

The documentary can be mistaken for a straightforward recording of what happened in Milgram’s lab. The black and white footage and the hidden camera give the viewer the sense of watching the experiment as-it-happened in Candid Camera style. McCarthy positions Obedience (Milgram, 1965c) as a film that aims to expose the dynamics of human behavior and in particular the relationship between “the state and its citizens” (McCarthy, 2008, p. 30).

While viewers are aware that the film has been edited, the air of verisimilitude is reinforced by the low-budget, no-frills production values. There is no music and seemingly no special effects. The final cut, the one known today, is a far cry from being an instructional document for researchers keen to duplicate Milgram’s research. Instead of being simply an audio-visual record of his procedures, the film is a docu-drama with a strong narrative that builds to a climax in which a main character struggles to overcome obstacles that are repeatedly placed in his path. The narrative is structured to capture the viewer’s attention and build a sense of intrigue and immediacy. The narration begins in the present tense and the viewer is not informed of the deception until midway through the film, when all is revealed to both participant and viewer (DeVos, 2009). The simultaneous revelation of the hoax to both viewer and participant establishes that the experiment was conducted not through “cruelty and sadism, but rather in the interest of knowledge” (McCarthy, 2008, p. 34).

Milgram wanted the film to be arresting and compelling. His folder on the film contains notes on Hitchcock’s Strangers on a Train (Hitchcock, 1951), as well as notes on how suited each man filmed was for the final cut. Seven of the 11 men filmed made it into the final version. In his notes on their performance, Milgram rated them on how animated and convincing they were, how much tension they showed, and whether they were anti-authority or showed “complete abdication” (Milgram, n.d.).
Those who were left out of the film included the skeptical, the unconvincing, and those whose appearance might raise troubling issues about experimental ethics. One man who did not make the final cut says on camera, “I didn’t believe the experiment was real. The groans and moans were not real” (Milgram, 1965c). The presence in the outtakes of a pair of friends, one of whom is still agitated and talks of his distress, suggests that Milgram filmed a version of the Relationship condition for the film, but later abandoned the idea of including it (Perry, 2013).

In contrast, the man Milgram would call “Fred Prozi” was immediately cast. In his notes, Milgram described the footage of Prozi as “brilliant” not just once but three times because of his “complete abdication and excellent tension.” Milgram wrote: “He should be used in the final film as a demonstration of our obedient subjects” (Milgram, n.d.).

The edited footage was indeed compelling. In July 1965, filmmaker Chris Johnson reported to Milgram on the audience reactions when he first screened it:

I ran it in Holt, Rinehart and Winston’s small screening room. A number of other employees from the Foreign Language Dept. were also present … no one was fooled by the “punishment—learning” front, and all found it incredible that subjects could have been deceived. They thought the subjects’ protests phoney … They also felt their lunch hour had been ruined and their faith in mankind shaken. Yet for a week now several of the women have been pestering me to rerun the film on a Saturday for them and their husbands. (Milgram, 1965b)

But the film was not simply an edited version of the experiment. The deception of participants is not the only sleight-of-hand at work in the film. The film includes a debriefing in which the hoax is revealed. Again, this is depicted as “typical.” In point of fact, 70% of participants were not dehoaxed before they left the lab, and Milgram was only occasionally involved in the debriefing that did take place (Perry, 2013). Secondly, the voice-over describes the onscreen experiment as involving 40 participants, but this is not possible given that the film was shot over a single weekend. We see only four of the seven participants featured in the film complete the experiment. Of these, three break off and one proceeds to 450 volts, providing visual evidence of a 25% obedience rate. To confuse matters further, Milgram says in the voice-over that the experiment we are watching is “Condition 2” and that 50% of participants obeyed. But Condition 2, the Voice Feedback condition, unlike the variation depicted in the film, does not involve the Learner mentioning a heart condition, and resulted in an obedience rate of 62.5%. The data gathered from filmed participants, a variation Milgram called Condition 25, were excluded from Milgram’s summary of results, suggesting that Milgram regarded the data as unreliable, invalid, or both.

So are we watching the “real” experiment or an approximation? If the former, why were the data gathered from this group of participants not included in Milgram’s listing of variations? If they were unreliable or deemed to be somehow artificial, then where does that leave the viewer?

During the film the experiment is initially depicted as highly standardized. The first three participants are intercut to demonstrate the various stages of the experimental set-up. But a clear mismatch for the wary viewer is in the footage of the first man to break off. He first appears early in the film where the teacher and learner are shown drawing slips of paper. He now reappears—having shed his suit jacket—in his white short-sleeved shirt and tie, just after the narrator tells us that “the early stages of the test pass uneventfully.” Then we cut to this man who, at the sound of the Learner’s grunt at 75 volts, points out to Mr. Williams (the Experimenter) that the Learner “did some kind of yelling in there.” From this point on he begins to challenge Williams, asking how far he is expected to go, saying he is “skeptical about electricity,” asking and then insisting that Williams “go and check” on the Learner, and then finally saying he won’t go any further until Williams checks that the Learner is OK. What is striking in this vignette is that the man has to reiterate his explicit refusal five times before Williams concedes the experiment is over, not the “standard” four.

Defiant Participant 2 also subverts the standard depiction of the experiment. We see him in his plaid shirt and large glasses about to give the Learner 75 volts. He addresses the Learner, “75 volts Jim.” What’s left out of the film and not explained is that this participant is a friend and neighbor of the Learner4 and has clearly recognized him as Jim McDonough rather than the “Mr. Wallace” that Williams introduces him as. Again, this man has to refuse explicitly and repeatedly, and in the footage we see Williams using seven prods to urge him to continue.

The participant’s relationship to and recognition of McDonough is not raised by Milgram either, when he asks the man why he was laughing. Instead Milgram tells us in a solemn voice-over that this laughter was widespread and “a puzzling sign of tension.” Milgram offers no explanation for the laughter, simply noting dispassionately its “regular occurrence … 14 of 40 subjects showed definite signs of nervous laughter and smiling.” Just which 40 participants Milgram is referring to is not clear, but once again the viewer is led to believe that the men depicted on screen are part of a larger group.

Milgram frames the laughter as a sign of tension and “strain” but offers no further expla- nation. “In the post experimental interviews subjects took to point out they were not sadistic types and the laughter did not mean they enjoyed attacking the Learner” (Milgram, 1965c). We are shown still photographs of different participants grinning and guffawing. The first-time viewer of the film is still at this point not aware of the hoax, and the film has a nightmarish quality. This is not just because the participants so far have agreed to participate, not just because their laughter makes us uneasy, but also because the Experimenter’s persistence and lack of emotion are inexplicable and disturbing (DeVos, 2009). Milgram’s narration is disinterested and scientific, his disengagement from the events on the screen is intended to reinforce his objectivity and the sense that he has simply set up a camera and pressed “record.” This mismatch between what the narrator knows and what we, the viewers, know means that the revelation that follows, in which Milgram is shown telling Defiant Participant 2 that the Learner was not being shocked, is a relief and a release as well as an added source of tension for the viewers. We are 15 minutes into the film before the Learner reappears unharmed and has a “friendly reconciliation” with the Teacher. The narrator points out that “we had a special obligation to protect the welfare and dignity of the persons who took part in the study.” But just in case the viewer might think such a process is inadequate, Milgram tells us that participants were “sent a detailed report” while onscreen a hand slowly turns page after page of a typed report and that an assess- ment of the debriefing procedures “points to their overall effectiveness.” Now all is revealed to the viewer. We see Mr. 
McDonough, the Learner, setting up a tape recorder to broadcast his cries, shots of a timer, an event recorder—the technology employed to provide “an objective record.” But before we have time to fully digest this new information, Defiant Subject 3 is introduced. We see him first at 195 volts, so he has gone further than any other participant so far. We then jump forward to him still administering the test at 330 volts before he refuses to continue four times in the face of Mr. Williams’ repeated prods, and the experiment is terminated. Foreshadowing the final participant’s appearance on screen, Milgram the narrator pits common sense against experimental evidence. He asks a participant to guess how many people will go on to the final switch on the machine. Then we are told that 40 psychiatrists got it wrong too. Instead of the less than 1% they predicted, “50% of subjects obeyed the Experimenter’s commands fully in the experiment depicted in this film.” Enter “Fred Prozi” (Milgram, 1974, p. 77), the only participant in the film who goes to maximum voltage. Prozi dominates the narrative. He appears onscreen for more time than all the other participants combined. Prozi’s segment begins when he is about to give the ninth shock. For 13 minutes, or almost a third of the film, we watch Prozi’s excruciating attempts to convince the experimenter to stop. His tension is both physical and verbal: he gets up, walks around, pleads with Williams, and calls out to the Learner. It is in Prozi’s performance that we are confronted with the moral ambiguity that Milgram seemed to overlook. While the voice-over invites us to take a scientific and dispassionate view of the events as they unfold on the screen, it is during Prozi’s segment of the film that the boundary between objective and subjective, morality and immorality, victim and torturer is blurred. The casting of Prozi as star of the film demonstrates


Milgram’s awareness of the dramatic potential of Prozi’s performance. However, this focus on demonstrating a powerful narrative and his choice of Prozi to play the lead role create a disconnect between the vision and the voice-over, the viewer’s perspective and Milgram’s. Prozi’s appearance in the film demonstrates the ambiguity of the situation, the conflicting cues for participants and the stress that they endured, their anguish and the struggle for some resolution. But the tension in the film results from Prozi’s performance. It is here that Milgram loses control of the narrative. It is difficult for viewers not to feel sympathy as they watch Prozi’s desperate attempts to make the Experimenter see reason. Watching Prozi in the film, we glimpse a more complex picture of the experience of the event from the participant’s point of view. Ultimately, the narration directs our attention to Prozi’s hand on the switch and the definition of him as “obedient,” but we have witnessed the range of strategies Prozi employs to convince the Experimenter to call the experiment off. In a letter to the NSF just four months before he shot the film (Milgram, 1962a), Milgram informed them that he had discovered that the degree of obedience “reflects the experience and social history of the participants,” and that he had found a “striking relationship” between level of education and obedience: the more educated were less obedient. In stark contrast to his later situationist theory, Milgram concluded in this application that

it is clear that persons become less obedient as their education level is increased. Seventy per cent of those who had not gone to high school obeyed fully, while this is true of only 30 percent of the persons who had completed graduate or professional schools.

In his 1974 book, Milgram described Prozi as 50 years old and unemployed. “He has a good-natured, if slightly dissolute, appearance. He employs working-class grammar and strikes one as a rather ordinary fellow.” Despite his subsequent argument that it is the power of the situation that dictates obedience, Milgram’s film individualizes obedience (DeVos, 2009) and links obedience with people of a particular class. Prozi, the single obedient participant depicted in the film, is less articulate, less well educated, and clearly more working class than the other participants. When we compare Mr. Williams’ interactions with the other men depicted in the film, we have to ask why Williams pressures Prozi to continue so many more times than he does the others. Although the voice-over makes reference to the power of the situation, Williams’ behavior suggests that Milgram’s class theory of obedience and his hypothesis that the working class were more likely to go to maximum voltage may have shaped Williams’ responses to participants. This may explain why Williams abandons the “four standard prods” and insists 27 times onscreen that Prozi continue. The purpose of the film underwent a shift in emphasis after it was shot. By 1965 the film had become a means of teaching students about the obedience research, rather than a record for other investigators of how to run a similar study. Milgram road-tested the film with faculty at six universities and, when the film was ready for distribution, wrote to them asking for comments and “permission to quote you” in promotion of the film. And at least one of them obliged. “One of the finest films in the area of psychology,” Dr. Roy


Feldman is quoted as saying in the 1967 New York University Film Library catalogue (Milgram, 1969). The film was being marketed heavily as a teaching tool. “The film is a powerful stimulant to discussion in social psychology, education, sociology and political science” (Milgram, 1969). When Penn State University took over distribution two years later, in 1969, the film was described as a “classic.” “Compelling and dramatic, this is the only authentic film footage of Milgram’s famous experiment” (Milgram, 1969). Feldman’s quote was omitted, but the range of fields to which the film was applicable had been extended to include Holocaust studies. In a 1969 UCLA course guide the film was described as “a filmed experiment showing how ordinary individuals can be induced to commit moral transgressions, provides a disturbing insight into the roots of authoritarian behavior” (Milgram, 1969). At the time of filming Milgram asked participants’ permission to show the film to other psychologists. On screen Prozi hesitates, but agrees. However, television producers recognized the potential in the material, and Milgram was approached repeatedly for permission to broadcast it (McCarthy, 2008). He initially turned down such requests, honoring the agreements he had made with those who had been filmed. But in the late 1960s, he granted permission to broadcast the film on Italian and German television. In 1974, Milgram allowed CBS’s 60 Minutes to screen part of it to coincide with the publication of his book. As McCarthy notes, this had the effect of highlighting obedience as the norm. In the ensuing decades, footage from the film has appeared in documentaries including Alex Gibney’s The Human Behavior Experiments (Gibney, 2006) and the BBC’s The Brain: A Secret History (Mosley, 2010).
The film has escaped from the constraints of academia and is now freely available on YouTube, where it continues to garner attention and ongoing comment on social media from students “experiencing” the experiment for the first time. Milgram’s film Obedience (1965c), a staple of university psychology programs since the late 1960s, introduces students to Milgram’s research at the same time that it presents a false and misleading picture of both the conduct of the experiment and the results. The power of the film is that we see individual people undergoing the experiment, in contrast to reading about the percentages of a faceless and nameless collective. Milgram’s instincts about the drawing power of his research and, in particular, the experience of watching real participants undergoing the experiment on film, were correct. The NSF panel of 20 professional peers viewed the obedience research as methodologically flawed, ethically troubling, statistically problematic, and theoretically impoverished, offering no explanation or insight into the behavior Milgram had recorded and ignoring the participants’ point of view; the same criticisms can be applied to the film Obedience. We are encouraged by the voice-over to view Fred Prozi as an obedient participant because of his hand on the 450 volt switch. We are invited to ignore his pleadings, bargaining, and attempts to intervene on behalf of the Learner. Similarly, we are invited to turn a blind eye to the unstandardized nature of Williams’ prods. The skeptical participants who questioned the reality of the experiment were simply left out. Milgram knew something that the 20-member NSF panel did not—the dramatic power of the experiments to quell misgivings, override doubts, and deliver a strong and seemingly profound message. By the end of the film, obedience has not been explained, but

that hardly matters. The film is ultimately unsatisfying from a research perspective. It demonstrates the experiment-in-action but fails to explain the behavior of the people involved. The intellectual poverty of the film is eclipsed by the dramatic footage. Despite the rejection of a request for further funding on methodological and ethical grounds, Harvard’s rejection of his application for a tenured position, and the growing ethical criticisms of his research, Milgram produced a film that was a publicity coup. By allowing us to think we are peeking through the keyhole to watch a sensational experiment-in-action, we are invited to recognize both the perils of blind obedience and the power of psychology to reveal disturbing truths about human nature.

Conclusion

In this article I have argued that Stanley Milgram’s film Obedience (1965c) was shaped and edited in response to early and detailed critiques of the obedience experiments. The film attempts to address the ethical, methodological, and theoretical criticisms of the obedience experiments by Milgram’s professional peers and the National Science Foundation. In the two-year period between the filming in 1962 and the final edit of the film in 1964, the NSF had refused further funding of the obedience experiments, complaints had been made to both Yale and the APA about the ethics of the studies, and Harvard had rejected Milgram’s application for a tenured position. I have argued that the film’s original purpose as a visual record altered during this time and that Milgram employed rhetorical strategies in the final version of the film that aimed to persuade viewers of the experiments’ scientific validity, their ethical integrity, and the importance of their findings. Using the same sleight of hand that characterized the experiments themselves, viewers are led to believe they are watching an experiment in action that is representative of the obedience experiments as a whole. In contrast to what we know from archival evidence (Perry, 2013), Milgram is shown debriefing participants before they leave the lab, and footage of participants who were skeptical or suspicious of the experimental set-up, and of those involved in more ethically problematic variations of the experiment, is not included in the final film. Far from its original intention as a document of the research, the final product is a carefully scripted and edited docu-drama aimed at persuading viewers of the universality and significance of the studies’ findings. However, a close reading of the film reveals that the visual narrative subverts the message of scientific scrupulousness that Milgram was attempting to convey.
There is a disconnect between what the voice-over is directing us to accept and what we can see in the visual narrative. The footage of participants engaged in the experiment, and in particular the single obedient participant, Fred Prozi, demonstrates the dramatic power of the experiment at the same time that it reveals an experimental situation that is highly unstandardized, ambiguous, and ethically troubling for the viewer. Milgram attempted to marry two elements—the quantitative and the statistical in the objective scientific voice-over and the qualitative subjective experience of the participant in the visual footage. I have argued for the importance of the film in establishing the authority of the study and have highlighted the irony of a “scientific” experiment being reliant on a highly constructed visual narrative for its success. Milgram’s obedience experiments may have failed as science, but his film is a triumph of propaganda.


Acknowledgements

I would like to acknowledge the helpful feedback and comments provided by the editors on earlier versions of this article. Special thanks to Ian Nicholson for pointing out the irony of the science of the experiment being established by a dramatic film.

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) received no financial support for the research, authorship, and/or publication of this article.

Notes

1. Subsequent re-analysis of unpublished qualitative data has yielded insights into debriefing practices and delayed dehoaxing, unreported variations of the experiments, evidence of participant suffering, uncontrolled and coercive experimental procedures, and evidence of participant skepticism (Perry, 2013).
2. Milgram’s insistence that Diana Baumrind’s 1964 criticism of the ethics of his research was a “complete surprise” is disingenuous, implying as it does that her objections were both unexpected and unusual. Apart from the NSF’s explicit feedback on just this issue, former participants had complained to Yale, and the APA had conducted a review of his suitability for membership, all well before Baumrind’s article was published (Perry, 2013).
3. In December 1961, Aaron Aronow, a New Haven city alderman and participant in condition 9, “Group Pressure to Obey,” telephoned and wrote indignantly to Milgram that, “as an alderman of New Haven with a sense of responsibility for the welfare of its citizens,” he had no choice but “to report this matter to Yale University” (Aronow, 1961). Aronow failed to halt the obedience research, but he did succeed in turning the University’s attention to research that Milgram had until then kept relatively low-profile. Milgram had already rebuffed inquisitive participants wanting to know more about the study in which they’d taken part, and had been secretive with colleagues (Perry, 2013). Aronow’s son recalls: “He was so enraged that he used his political clout to arrange an appointment with Kingman Brewster who was then the Provost of Yale. He told Brewster that Yale was harboring lunatics and sadists in the Psychology Department and he demanded that the study be halted.” Aronow’s intervention alerted the university to the actions of the assistant professor and the potential impact of his experiments on the local community.
At Aronow’s funeral 49 years later, in 2010, his nephew recalled in his eulogy how proud the family was of the stand Aronow had taken (J. Lampner, personal communication, July 31, 2013).
4. Three years later this same man would perform CPR on McDonough after he suffered a heart attack at his home (Perry, 2013).

References

Aronow, A. (1961, December 19). [Letter to Stanley Milgram]. Stanley Milgram Papers (Box 46, Folder 169). Yale University Archives, New Haven, CT.
Baumrind, D. (1964). Some thoughts on the ethics of research: After reading Milgram’s “Behavioral Study of Obedience.” American Psychologist, 19, 421–423.
Billig, M. (1994). Repopulating the depopulated pages of social psychology. Theory & Psychology, 4, 307–335. doi:10.1177/0959354394043001


Danziger, K. (1990). Constructing the subject: Historical origins of psychological research. Cambridge, UK: Cambridge University Press.
DeVos, J. (2009). “Now that you know, how do you feel?” The Milgram experiment and psychologization. Annual Review of Critical Psychology, 7, 223–246.
DeVos, J. (2010). From Milgram to Zimbardo: The double birth of postwar psychology/psychologization. History of the Human Sciences, 23(5), 156–175.
Funt, A. (Director & Executive producer). (1948–1992). Candid camera [Television series]. Los Angeles, CA: Allen Funt Productions.
Gibney, A. (Director). (2006). The human behavior experiments [Television documentary]. United States: Fearful Symmetry Production.
Gross, A. G. (1990). The rhetoric of science. Cambridge, MA: Harvard University Press.
Haslam, N., Loughnan, S., & Perry, G. (2014). Meta-Milgram: An empirical synthesis of the obedience experiments. PLoS ONE, 9, e93927.
Hitchcock, A. (Producer & Director). (1951). Strangers on a train [Motion picture]. United States: Warner Brothers.
Laurent, J. (1987). Milgram’s shocking experiments: A case in the social construction of science. Indian Journal of History of Science, 22, 247–272.
McCarthy, A. (2008). Stanley Milgram, Allen Funt, and me: Postwar social science and the “first wave” of reality TV. In S. Murray & L. Ouelette (Eds.), Reality TV: Remaking television culture (pp. 19–39). New York, NY: New York University Press.
Milgram, S. (1962a, January 25). [Application to National Science Foundation]. Stanley Milgram Papers (Box 43, Folder 127). Yale University Archives, New Haven, CT.
Milgram, S. (1962b, February 2). [Letter from National Science Foundation]. Stanley Milgram Papers (Box 43, Folder 129). Yale University Archives, New Haven, CT.
Milgram, S. (1962c, March 15). [Letter from National Science Foundation]. Stanley Milgram Papers (Box 43, Folder 129). Yale University Archives, New Haven, CT.
Milgram, S. (1962d, April 13). [Letter from the National Science Foundation]. Stanley Milgram Papers (Box 43, Folder 128). Yale University Archives, New Haven, CT.
Milgram, S. (1962e). Evaluation of obedience research: Science or art? Stanley Milgram Papers (Box 46, Folder 16). Unpublished manuscript. Yale University, New Haven, CT.
Milgram, S. (1962f). [Film costs]. Stanley Milgram Papers (Box 85, Folder 448). Yale University Archives, New Haven, CT.
Milgram, S. (1963a, April 29). [Application to National Science Foundation]. Stanley Milgram Papers (Box 45, Folder 160). Yale University Archives, New Haven, CT.
Milgram, S. (1963b). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67, 371–378.
Milgram, S. (1963c, November 13). [Letter from National Science Foundation]. Stanley Milgram Papers (Box 43, Folder 128). Yale University Archives, New Haven, CT.
Milgram, S. (1965a, July 10). [Letter to National Science Foundation]. Stanley Milgram Papers (Box 43, Folder 129). Yale University Archives, New Haven, CT.
Milgram, S. (1965b). [Letter from Chris Johnson]. Stanley Milgram Papers (Box 75, Folder 435). Yale University, New Haven, CT.
Milgram, S. (Producer & Director). (1965c). Obedience [DVD]. United States: Penn State University Audio-visual.
Milgram, S. (1969). [Publicity]. Stanley Milgram Papers (Box 85, Folder 448). Yale University, New Haven, CT.
Milgram, S. (1974). Obedience to authority: An experimental view. London, UK: Tavistock.
Milgram, S. (n.d.). [Notes]. Stanley Milgram Papers (Box 76, Folder 440). Yale University Archives, New Haven, CT.


Morawski, J. (2005). Reflexivity and the psychologist. History of the Human Sciences, 18, 77–105.
Mosley, M. (2010). The brain: A secret history [Television documentary]. UK: BBC Four.
Nicholson, I. (2011). “Torture at Yale”: Experimental subjects, laboratory torment and the “rehabilitation” of Milgram’s “Obedience to Authority.” Theory & Psychology, 21, 737–761. doi:10.1177/0959354311420199
Parker, I. (2000). Obedience. Granta, 71, 99–125.
Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. New York, NY: The New Press.
Silver, M. (1972). Interview with Stanley Milgram. Stanley Milgram Papers (Box 23, Folder 382). Unpublished manuscript. Yale University Archives, New Haven, CT.
Spears, R., & Smith, H. J. (2001). Experiments as politics. Political Psychology, 22, 309–330.

Author biography

Gina Perry is a PhD candidate in the Schools of Historical and Philosophical Studies and Culture and Communications at the University of Melbourne and is author of Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments. She has been a freelance journalist and broadcaster and has written and produced an award-winning radio documentary on Milgram’s obedience experiments, incorporating original audio recordings as well as contemporary interviews with former subjects. Her research interests are in the parallels and intersections between social psychology and journalism, rhetorical aspects of science writing, and the psychology of social psychological research.


Article

The normalization of torment: Producing and managing anguish in Milgram’s “Obedience” laboratory

Theory & Psychology 2015, Vol. 25(5) 639–656 © The Author(s) 2015 Reprints and permissions: sagepub.co.uk/journalsPermissions.nav DOI: 10.1177/0959354315605393 tap.sagepub.com

Ian Nicholson St. Thomas University

Abstract

Milgram framed his “Obedience” experiments as an inquiry into the Holocaust, posing state-directed mass murder as a “conflict between conscience and authority.” However, recent research into atrocities suggests that “moral conflict” is often absent; murder is frequently undertaken willingly in a spirit of idealism and “normalcy.” The question is not, as Milgram suggested, why people obey orders they find morally objectionable, but rather how it becomes “normal” and “ok” to torture or kill defenseless people. I examine this question through a reinterpretation of the Obedience study. Instead of focusing on the confused and entrapped participants, people who were tricked into “immoral” action, I study the scientists themselves—individuals who applied enhanced stress techniques to innocent people repeatedly and enthusiastically, fully aware of what they were doing. Inverting Milgram’s Holocaust analogy, I suggest that recent scholarship on Nazi doctors can provide insights into the various ways that torment became “normalized” for Milgram and his assistants.

Keywords

enhanced interrogation techniques, ethics of research, Nazi doctors

Stanley Milgram’s Obedience experiments are an academic and cultural blockbuster. At once entertaining and supposedly informative, they are studies that psychology and the public cannot get enough of; the appetite for things “Milgram” seems insatiable. In addition to the obedience work featuring prominently in undergraduate psychology textbooks, news broadcasts, and documentaries, Milgram is the subject of four recent special issues: The Psychologist (Reicher & Haslam, 2011), Theoretical and Applied Ethics (Herrera, 2013),

Corresponding author: Ian Nicholson, Department of Psychology, St. Thomas University, 51 Dineen Dr., Fredericton, NB, E3B 5G3, Canada. Email: [email protected]

the Journal of Social Issues (Reicher, Haslam, & Miller, 2014), and the current special issue of Theory & Psychology (Brannigan, Nicholson, & Cherry, 2015). What is notable about much of this voluminous commentary is not simply the extent of the scholarship, but also the nature of the interest. As I have discussed elsewhere (Nicholson, 2011b), most of the recent literature on the Obedience experiments has been largely flattering and uncritical in character (see Blass, 2009; Burger, 2009; Miller, Collins, & Brief, 1995). A recent special issue of The Psychologist is a case in point. Throughout this issue, Milgram is presented in saintly terms. He is described as having an “epic vision for social psychology” (Reicher & Haslam, 2011, p. 650) and as a “guide” who can help psychology “go back to that heroic era of great field studies” (Reicher & Haslam, 2011, p. 652). In contrast, many of the discussions of the obedience research in the 1960s, 70s, and 80s featured a number of searching ethical and epistemological critiques (see Baumrind, 1964; Helm & Morelli, 1979; Mixon, 1972; Patten, 1977a, 1977b; Schuller, 1982). The efforts to render the Obedience study as a kind of psychological “classic” that is timeless and beyond significant criticism have been extensive and in a certain sense “successful.” Milgram has been effectively “rehabilitated” within psychology, to the point that the paradigm has been declared by one of its enthusiasts to be “alive and well after all these years” (Burger, 2011, p. 654). However, for all the success of the disciplinary “makeover,” challenging questions of the sort raised by Diana Baumrind (1964) in her famous critique of the Obedience research remain.
These questions include the propriety of using “enhanced” stress techniques on innocent, unsuspecting people and the validity of using staged “quasi-theatrical” laboratory performances for understanding complex historical events like the Holocaust. In the last five years, several scholars have built on Baumrind’s formative critique, drawing extensively on resources that she never had access to: Milgram’s unpublished archival records (see Gibson, 2013; Nicholson, 2011a, 2011b; Perry, 2013). One of the most important revelations of this new, archive-based Obedience scholarship is the discovery that Milgram was not always forthcoming with the truth. We now know that Milgram misrepresented several important facets of his research, including (a) the extent and nature of his debriefing procedures, (b) the risk posed by the experiment, (c) the harm done to his participants, (d) the role of standardization in the study, and (e) his private views on the ethics and meaning of the research. Looking behind the “archival curtain” makes it clear that the received view of the Obedience study is not the “reality” of what happened but an idealized version constructed by Milgram for rhetorical purposes and those of professional self-advancement (Nicholson, 2011b; Perry, 2013). In light of these archival investigations, it is apparent that distinguished critics such as Baumrind (1964) and Kohlberg (1974) had more reason than they supposed to question the ethics and validity of the Obedience research. Indeed, one is tempted to consign the Obedience research to the past, where other curiosities of 1960s-era psychological excess lie buried (see Nicholson, 2007; Raz, 2013). However, this temptation is not one that I will indulge in the present study. My intention here is not to “bury” the Obedience study but to examine it from a different perspective.
Instead of taking the behavior of Milgram’s participants as the “puzzle” to be explained, I will scrutinize the conduct and ideological mindset of Milgram and that of his research team—the individuals who knowingly and repeatedly “went all the way” with their participants.


This alteration in focus is inspired by the disjuncture that emerges when comparing the behavior of the majority of Milgram’s participants to that of individuals involved in “real-life” scenarios of unlawful killing and torment. Milgram admitted that his participants were operating in a context of confusion and uncertainty. They were led to believe they were participating in something that was benign, expressly told that what they were doing would involve “no permanent tissue damage,” only to then find themselves in a confusing world of collapsing expectations in which experimental cues increasingly collided with their intellectual understanding of and emotional reaction to the situation. As is now well known, this scenario produced extraordinary tension and feelings of anxiety and self-doubt for many participants. In contrast, what stands out in the conduct and statements both of American military police at Abu Ghraib and of prominent Nazis such as Auschwitz Commandant Rudolph Höss or SS Lieutenant Colonel Adolph Eichmann is their serenity in the face of a clear and unambiguous understanding of the brutality they had inflicted. For example, Höss (1992) was quite matter-of-fact about his role as the “greatest destroyer of human beings” (p. 189) and, while he admitted to being misled by Nazi propaganda, given the circumstances and commitments of wartime military service he did not think he had done anything wrong, concluding his autobiography with the statement that he “had a heart and was not evil” (p. 186). Eichmann was equally unabashed when considering his role as one of the principal “production managers” of the Holocaust: “I was no ordinary recipient of orders. If I had been I would have been a fool. Instead, I was part of the thought process. I was an idealist” (quoted in Brannigan, 2013). American military police showed a similar equanimity while torturing prisoners and later when charged with abusing detainees (Alkadry & Witt, 2009).
The now infamous photos of American military police abusing prisoners show the soldiers unperturbed and in some cases smiling. Private Lynndie England, one of the military police court-martialed for prisoner abuse, was, like several of her colleagues, completely mystified by the charges, telling ABC News that “we don’t feel like we were doing things that we weren’t supposed to do” (cited in Alkadry & Witt, 2009, p. 139). Unlike the participants in Milgram’s study, Lynndie England, Adolph Eichmann, and Rudolph Höss were not tricked into torture or mass murder. There was no deception concerning the benign nature of the undertaking, nor was their brutal conduct dependent on lies that the victims were “suffering no permanent tissue damage” (Milgram, 1963, p. 374). These “real life” perpetrators knew exactly what they were doing and many were willing participants (Brannigan, 2013; Goldhagen, 1996). The key question that such cases pose is not the one Milgram put forward (“why do people obey orders they find morally objectionable?”), but rather: how does it become “normal” and routine to abuse, torment, or, in some cases, kill defenseless people? The confused, entrapped, and frequently reluctant Obedience study participants are of limited use in answering this important question. However, as people who knowingly and repeatedly applied “enhanced” stress techniques to innocent subjects, Milgram himself and his research team are relevant to the issue of how torture becomes “normalized.” We now know that Milgram created a functioning theatre of pain that reduced many able, self-possessed participants to trembling wrecks, several of whom conveyed their anguish to Milgram shortly after their participation (Nicholson, 2011b; Perry, 2013). These disturbing historical revelations are unsettling for our sense

of Milgram as an ethical researcher, but they also raise a wider-ranging and possibly more important question: how was the Obedience study sustained in the face of such visceral pain and distress? The question of how to sustain torment was alluded to by Milgram (1963) himself in an analysis of why many participants went along with the directives of the experimenter. Milgram noted that his participants were expressly told that they were participating in something benign and causing no physical harm. The situation was highly ambiguous, involving a perceptual collision between a calm experimenter and a highly agitated “victim.” Finally, the participants were given no time for reflection. They had to think through this confusing situation and decide on a course of action “on the spot.” What is significant about these observations is that they refer to the maintenance of torment “in the experimental moment,” but they do not address the more important “real life” scenarios that Milgram claimed to be interested in—the Nazi who murders not once in a state of confusion but repeatedly in the full knowledge of what he was doing (Goldhagen, 1996). Ironically, it is this scenario that Milgram’s role as experimenter typified—namely that of a person who torments others repeatedly, fully aware of the pain they are inflicting. After the first few trials Milgram and his colleagues were in no doubt concerning the intensity of the stress in the experiment and the potential for physical breakdown. Moreover, unlike the participants who experienced the trauma of the study once with “little time for reflection” (Milgram, 1963, p. 378), Milgram and his team visited the anguish over and over again, luring hundreds of innocent people into the lab in the full knowledge that many of these individuals would experience convulsions, trembling, and other extreme stress-related conditions.
This deliberate and systematic application of enhanced stress techniques to vulnerable people raises the very question that so frequently comes to mind when considering grim contexts of torment like that of Abu Ghraib: How did it become "normal" and "ok" for a small group of scientists to subject innocent American citizens, as a matter of routine, to such extraordinary levels of abuse, experiences which by Milgram's (1963) own admission "reached extremes that are rarely seen in sociopsychological laboratory studies" (p. 375)? Although there is an extensive literature on the ethics of the obedience research, the issue of the "normalization of torment" among the experimenters (i.e., Milgram and his team) has received relatively little consideration. For example, well-known Milgram scholars such as Blass (2004), Elms (1995, 2009, 2014), and Miller (1986, 2009) have undertaken extensive commentary on various aspects of the obedience research, but always within the logic of the social psychology experiment: "these are the parameters of the experiment, here are the participants, what are they doing and why?" What is lacking in these accounts is any sort of critical regard for the experimenters themselves (a capacity for disciplinary self-examination which, rather ironically, Milgram himself possessed). Why did Milgram and his associates continue with the experiment day after day, knowing the anguish that they were inducing? What sort of mentality enables scientists to inflict such extreme levels of psychological trauma on their fellow citizens as a matter of course? This paper will examine how torment was "normalized" and sustained among the Obedience experimenters themselves. I will explain this by inverting the analogy that Milgram famously and repeatedly applied to his innocent and unsuspecting participant-victims: that of the Holocaust. From the initial publication of the obedience research in


1963 through to his 1974 book, Milgram insisted that he had deciphered the Holocaust by way of the psychology laboratory. While he conceded that there were important differences between the two situations, he argued that he had successfully reproduced the essence of authority, such that he could compare the two and ultimately conclude that ordinary American citizens who had volunteered in good faith for what they thought was a benign psychology experiment were in fact no better morally than the SS who staffed Nazi death camps. "I used to wonder whether there were sufficient moral imbeciles in the United States to man a system of death camps," Milgram remarked in a 1974 interview. "After doing my experiments, I am convinced I could recruit the necessary personnel in any medium-sized town" (as cited in Nobile, 1974). Equating American citizens in a psychology experiment with members of the Nazi killing machine was and remains a contentious point, and in my own thinking on the topic I find myself in agreement with Fenigstein (1998), who noted that "the terms that are routinely used to describe the horrors of the Holocaust – e.g. atrocity, inhumanity, hatefulness, wickedness – are simply preposterous in the context of Milgram's studies" (p. 71). However, for the purpose of this article, I will indulge Milgram, suspend my sense of historical specificity, and grant that the situations are at least somewhat comparable. If that is so, it follows that the comparison works in both directions and that the Holocaust can provide insights into the Obedience experiment itself, a context where supposedly responsible, cultured scientists visited anguish as a matter of routine on hundreds of innocent people. In so doing, it is very important to emphasize that this is an analogy of Milgram's own making.
He was the person who framed his study as an inquiry into the Nazi killing machine and, to his credit, Milgram was keenly aware of the moral parallel between his own behavior and that of Nazi officials, even going so far as to compare the conduct of his research team with that of SS Lieutenant Colonel Adolph Eichmann—a point to which I will return later. In this paper, I will draw from the now sizable historiography on those individuals most analogous to Milgram's role in the obedience research—not that of a guard or soldier, but of the doctors and scientists involved in the systematic brutalization of innocent people. My intention here is not to equate the two situations, but to follow Milgram's (1974) logic that the "differences in scale, numbers, and political context may turn out to be unimportant as long as certain essential features are retained" (p. xii). Crucial to this comparison is the idea that both parties in this analysis—Nazi doctors, and Milgram and his research team—are acting in ways that they consider to be ethical and responsible. In other words, neither of these historical events is best understood as "evil deviance" or a manifestation of "unethical science." In the case of Nazi physicians, there was an extensive and lively debate on medical ethics throughout the Nazi era (see Harrington, 1996). A "culture of complaint" existed within the Nazi medical profession, and physicians were not shy about expressing their concerns when they felt that policies and procedures were wrong or being carried out in too indiscriminate a manner (Proctor, 2000, p. 343). In short, ethics were not absent in Nazi medicine; rather, through an elaborate array of ritual, metaphor, and myth, they were transformed into a framework that reflected the broader values of the regime. A core element in Nazi medical ethics was the idea of "national race hygiene," which physicians, as guardians of the body, felt ethically committed to protect.


Although contemporary commentators frequently emphasize the relative absence of ethical standards for Cold War era researchers (e.g., Reicher et al., 2014), Milgram was subject to a 1959 APA code of ethics that required him to "respect the integrity and protect the welfare of the persons … with whom he is working" (American Psychological Association, 1959, p. 280). Well aware of his ethical responsibilities, Milgram developed his own mechanisms of normalization, many of which drew on a kind of "tough guy" ethic shared by several of his colleagues in American psychology. This ethic held that participants in psychology experiments were "resilient selves" capable of handling a significant amount of "enhanced" stress without any appreciable harm or long-term effect (Stark, 2010). This ethical framework transformed a context of visceral suffering and anguish that was obvious to many of the participants into one that was somehow "therapeutic" for both the individual participant and society as a whole. As we shall see, the mechanisms of normalization employed by Milgram bore a resemblance—at least in some instances—to the complex frameworks of justification used by Nazi physicians.

Recreating the Holocaust in the laboratory

The level of anguish and torment in the obedience study is now something that is easily missed, or at least underappreciated. Through a combination of repetition and the strategic exclusion of damaging detail, Milgram's use of enhanced stress techniques on unsuspecting people has become "normalized" for psychologists and the public alike, something that is legitimate and permissible—even admirable. Milgram played an important role in rendering the study into something that was "surprising" but also legitimate and consistent with the social expectations of psychological research practices. In his book-length treatment of the study, Milgram (1974) sanitized the experiment, omitting graphic details that might unsettle the reader. He did not include disturbing accounts from witnesses as to what had transpired, nor did he include feedback from the many participants who were traumatized and in some cases physically endangered by the experiment. Most contemporary social psychology textbooks have followed Milgram's lead and have either minimized or, in some cases, completely ignored the torment that participants were subjected to (Nicholson, 2011b; Stam, Lubek, & Radtke, 1998). Similarly, most of the recent scholarly commentary on Milgram omits specific details of the anguish participants experienced in favor of a brief mention of "ethics" while reassuring current psychologists that the experience was "overwhelmingly positive and that there is little evidence of any harm" (Reicher & Haslam, 2011, p. 652). To gain an appreciation of the pitiless character of the study, one must go back to the original published account of the experiment. In the initial presentation of the study, Milgram actually boasted of the torturous character of the experiment, noting that he had produced stress reactions among his participants that "reached extremes rarely seen in sociopsychological laboratory studies" (Milgram, 1963, p. 375).
In another early paper, Milgram (1965) reported that he had induced "full-blown, uncontrollable seizures" in 15 people and that he had tormented another person to the point where he had "a seizure so violently convulsive that it was necessary to call a halt to the experiment" (p. 68). Russell (2014) has noted that Milgram even went so far as to apply his enhanced stress techniques to people who had ties of friendship and family. In a previously

unpublished series of trials known as the "relationship condition," he asked participants to bring a friend to the study to see how amenable people with close personal ties would be to his ruse (Russell, 2014, p. 195). Some of the participants brought family members, including one person who brought his son. Milgram's papers contain the transcript of an obviously anguished father being repeatedly bullied by the experimenter into "shocking" his own son. Lest anyone doubt the gut-wrenching intensity of the proceedings, Milgram (1963) quoted at length one observer who commented on the "striking reactions of tension and emotional strain":

I observed a mature and initially poised businessman enter the laboratory smiling and confident. Within 20 minutes he was reduced to a twitching, stuttering wreck, who was rapidly approaching a point of nervous collapse. He constantly pulled an earlobe, and twisted his hands. At one point he pushed his fist into his forehead and muttered “Oh God, let’s stop it.” And yet he continued to respond to every word of the experimenter, and obeyed to the end. (cited in Milgram, 1963, p. 377)

This description is consistent with the recollections of several participants that Milgram himself gathered after the study was completed. One participant remarked: "I wouldn't want to do another experiment like that again for any amount of money. I'm still sorry I went to do it. It took me a couple of weeks before I was able to forget about it. I don't think it is right to put someone through such a nervous tension." Another commented that "I couldn't remember ever being quite as upset as I was during the experiment" (Reaction of subjects, 1962), while a third stated that "I felt real remorse and when I came out—when the experiment was all over, I got home and told my family I had just gone through the most trying thing that I had ever subjected myself to" (Subjects' conversation, 1963). It is important to emphasize the industrial scale of all this suffering. The study had more than 780 participants, and in his report Milgram (1963) indicated that extreme stress reactions were "characteristic rather than exceptional … to the experiment" (p. 375). Consistently producing this level of anguish requires a very elaborate and sophisticated physical infrastructure, and a number of historians have documented Milgram's extraordinary skill as a laboratory impresario (Gibson, 2013; Nicholson, 2011a; Perry, 2013; Russell, 2010). However, the production of torment on such a large scale also requires an equally elaborate "moral" infrastructure, an ideological framework that justifies the anguish and placates any feelings of guilt that arise among the tormentors. As we shall see, Milgram devoted considerable attention to developing just such a structure.

"Why do we feel justified?": The value of "therapeutic torment"

The book-length version of the Obedience study received extensive media coverage, and it provoked a wide range of responses from the public, many of whom wrote to Milgram. While some writers accepted Milgram's framing of the study, often quite enthusiastically, others challenged the ethics of the project, seeing little moral difference between participants who "shocked" people in the name of science and Milgram himself, who psychologically shocked people in the name of science:


The fact was that not only two thirds of the subjects went on to the end … but you did too. You and the men engaged in running the experiments. Did any of your associates rebel and refuse to go on? If they did not, then the public has an enormously better track record than laboratory workers because a profound psychological shock is if anything in an advanced culture more serious an affair than a physical one. (Paashaus, 1974)

This reader's curiosity about the mentality of the researchers themselves was understandable for, in his published work, Milgram donned the guise of "scientific objectivity," carefully shielding the ways in which his own motivations and ideals maintained the study and its subsequent presentation to the public. Nazism served as a useful screen in this regard. Aware of the media value of the Holocaust analogy, Milgram publicly and repeatedly emphasized the parallel in his published work, highlighting the characterization of the study as the "Eichmann experiment." It is "an apt term," Milgram continued, "for the subject's situation [is] something akin to the position occupied by the infamous Nazi bureaucrat" (Milgram, 1974, p. 178). In private, however, before he became famous, Milgram was much more candid and keenly aware of the degree to which the Eichmann analogy applied to himself. In his notebook, retrieved from the archives, he formulated many of the very criticisms that would dog him in later years, and he drew a parallel between his own conduct in the experiment and that of his obedient participants:

I—and many others—know that the naïve subject is deeply distressed, and that the tension caused him is almost nerve shattering in some instances. Yet we do not stop the experiment because of this. And no observer has ever thought to interrupt the experiment, although we know a man was suffering deeply … The question to ask then is this: why do we feel justified in carrying through the experiment, and why is this any different from the justifications that the obedient subjects feel? (Milgram, 1962a).

It is quite understandable that Milgram should have felt so uneasy. The type of systematic torment that he was undertaking was unpleasant work, and it did tug at his conscience. In an undated note he remarked that "at times I have concluded that, although the experiment can be justified, there are still elements in it that are ethically questionable, that it is not nice to lure people into the laboratory and ensnare them into a situation that is stressful and unpleasant to them" (Milgram, n.d.a). More revealing still, on another occasion Milgram expressed a sense of ethical disquiet by applying the Eichmann analogy to his own conduct, drawing a parallel between the logistical challenges of facilitating torment in the laboratory and the logistical difficulties that confronted the SS in carrying out the "Final Solution." Writing to his research assistant, Alan Elms, Milgram stated that one of Elms' tasks would be to "think of ways to deliver more people to the laboratory": "This is a very important practical aspect of the research. I will admit it bears some resemblance to Mr. Eichmann's position" (Milgram, 1961a). Equating himself and his research team with Adolph Eichmann speaks to Milgram's sense of the ethical dubiousness of what he was doing. His framing of the study had created an awkward equivalency—if his confused and entrapped "obedient" participants were Nazis, was he not also a Nazi for inflicting anguish on his unsuspecting participants and for continuing to lure innocent people into the lab in the full knowledge

that many of them would suffer one of the most stressful and potentially life-threatening experiences of their entire lives? Indeed, as the very creator of this dark, Kafkaesque space, did he not bear a greater weight of moral responsibility? How was one to justify luring well-meaning people into a laboratory, lying to them, subjecting them to an ordeal of extraordinary anguish and then, as a kind of psychological coup de grace, stating that those who did not see through his elaborately staged ruse were "moral imbeciles" (Milgram, 1961b)? At this juncture, it is useful to consider in more detail the Nazi analogy that Milgram applied to himself by examining the historiography on SS concentration camp physicians. Lifton (1986) described the ideological framework that enabled Nazi doctors to participate in industrialized killing while simultaneously maintaining a "doctorly" image of themselves as responsible, "ethical" healers and scientists. A central feature of this framework was the Nazi embrace of a seemingly life-affirming "modern" vision centered around "scientific" ideals of hygiene and purity. The Nazis were convinced that a wide range of corrosive agents were weakening the German Volkskörper. It was this concern that prompted Nazi policy makers to envision the regime as "applied biology" and to invest heavily in what were then world-leading public education campaigns and laws directed against tobacco, cancer, and asbestos (Proctor, 2000, p. 341). Human genetics was part of this wider vision to medically cleanse Germany of toxins. The remarks of one Nazi doctor illustrate how these seemingly scientific commitments could license killing as something "therapeutic" and in keeping with a physician's Hippocratic oath: "Of course I am a doctor and I want to preserve life. And out of respect for human life, I would remove a gangrenous appendix from a diseased body.
The Jew is the gangrenous appendix in the body of mankind" (cited in Lifton, 1986, p. 16). The plausibility of this imagery was reinforced by a seemingly concrete anti-Semitic race science, which supposedly proved that the deterioration of the Aryan race could be halted by the killing of all Jews. There is, of course, a vast difference between psychological torture and physical killing, and there are differences in the professional commitments of experimental psychologists and physicians. Nevertheless, there was a parallel to be drawn, one that Milgram himself recognized. Both scenarios involve professions supposedly committed to human welfare enmeshed in a program of systematic brutalization. Milgram developed his own moral infrastructure to assuage the feelings of discomfort that arose from what he described as a "nerve shattering" environment (cited in Perry, 2013, p. 327). A central plank in this framework was the notion of what I will term "therapeutic torment"—the idea of laboratory brutalization as a "positive character-building" experience for those being tormented. So construed, participants were not innocent victims of dishonest, irresponsible science but winners of a rare opportunity to be mistreated for their own good. Milgram laid out the logic of therapeutic torment in the aforementioned letter to Elms. There should be "no misconceptions of what we do with our daily quota," he remarked, invoking the Eichmann idiom. "We give them a chance to resist the commands of a malevolent authority and assert their alliance with morality" (Milgram, 1961a). Casting the experiment as "moral therapy" was a clever rationalization, allowing Milgram to at least partially displace feelings of guilt as a perpetrator of torment with a sense of himself as a benevolent, father-like figure who was traumatizing people for their own good. As the Obedience study grew in fame and notoriety following its public debut

in 1963, Milgram developed the "therapeutic torment" theme in considerable detail. In Obedience to Authority (1974) he wrote at length about the "positive side to participation," reporting that "subjects indicated that they had learned a good deal, and many felt gratified to have taken part" (p. 196). It is important to note that Milgram had some basis for this assertion. In the survey that he conducted following the completion of the study, 83% of participants indicated that they were either glad or very glad to have taken part. The figures brought great comfort to Milgram, and he came to view them as the "chief moral warrant for the continuation of the experiment" (as cited in Abse, 1973, p. 40). However, retrospective consent is a very flimsy foundation on which to base the application of enhanced stress on unsuspecting people (Benjamin & Simpson, 2009). This approach shows disrespect for individual autonomy by drawing people into the study who, after the fact, indicated that they would have preferred not to have been involved. Moreover, as Benjamin and Simpson (2009) have noted, the judgment of lay participants after the fact is often influenced by their desire to recover a measure of dignity and self-respect out of a gruesome ordeal. Milgram clearly sensed the need to provide a compelling narrative to confer meaning onto his participants' suffering and to allay lingering feelings of resentment. In the written "debrief" that Milgram sent to all the participants, he instructed them to think of the abuse and devaluing that they had endured as a patriotic undertaking, providing insight into what might happen in the event of a nuclear war: "Consider for example the possible day when a man in another country is told by a superior to drop a hydrogen bomb on the United States.
Will he participate in this act of destruction?" (Milgram, n.d.b). In the context of the Cold War and the Cuban Missile Crisis, such framing made it difficult for all but the most skeptical and self-possessed to question Milgram, even though many spoke at length about what an awful experience it had been (see Nicholson, 2011a). Although Milgram eventually convinced himself that the majority of his participants found the torment beneficial, he also knew that some of his participants were badly traumatized by the mistreatment and did not buy his "therapeutic" rationalization. In such cases, "therapeutic torment" was given a wider, society-based interpretation. Conceding the toxic character of the experience for the individual, Milgram argued that torment was nevertheless justified because society as a whole would benefit from the torture of a few hundred innocent people. The unrelenting display of suffering revealed "the capacity for man to abandon his humanity, indeed, the inevitability that he does so, as he merges his unique personality into larger institutional structures" (Milgram, 1974, p. 188). Such knowledge could embolden some to defy abusive administrative authority and pursue a more ethical course of action. "We need to be aware of the problem of indiscriminate submission to authority," Milgram (1992) wrote. "I have tried to foster that awareness with my work" (p. xxxiii). As ethical criticisms of the Obedience study mounted, Milgram drew heavily on this wider sense of therapeutic torment, even crediting the refusal of one young American to fight in the Vietnam War to his role as a participant in the study (Milgram, 1974, p. 200).

Ritual, scientific idealism, and the authorization of anguish

Tormenting people in the name of world peace was one of Milgram's better justifications, but its significance is undermined by the fact that he did not actually believe it

while the suffering was ongoing. What sustained him in the face of so much routinized anguish was not so much the therapeutic value that he would emphasize so strongly later, but a more straightforward sense of intellectual curiosity, a kind of scientific zealotry that evidently trumped basic considerations of decency and respect for others. In his notebook, Milgram (1962c) spoke to this issue directly:

It would be pleasant to remark that the experiments were undertaken with a view toward their possible benefit to humanity; that knowledge of social man, in this instance, was sought for its possible application to the betterment of social life. “Surely” some may hold “the social good that stems from an understanding of human behavior compensates for, and thus helps justify, the abuses which were necessary for the conduct of the experiment.” A fine argument: but the author does not buy it.

In a moment of remarkable candor, Milgram admitted that he was driven primarily by a sense of wonder at the drama he had produced. Disturbing though it may seem, it was thrilling to lure a cross-section of small-town America into this disorienting, bizarre environment and watch as many of them collapsed under the strain of the ordeal. To his credit, Milgram saw the sadistic element in all this quite clearly, and he wrote derisively of those, including himself, "who sit by 'enjoying the show,'" and did nothing while the naïve experimental participant "was suffering deeply" (Milgram, 1962d; see also Nicholson, 2011b). Unsettled by the pleasure of such power, he couldn't bring himself to stop; there was too much at stake for him professionally, and it was all so very interesting:

Moreover, considered as a personal motive of the author—the possible benefits that might rebound to humanity—withered to insignificance alongside the strident demands of intellectual curiosity. (Milgram, 1962c)

It is impossible to know if Milgram felt this way throughout the experiment, but it is clear from this quote that there was a period during the study when he didn't believe in the humanitarian value of what he was doing. While the sense of dramatic curiosity—would this next person be able to withstand the psychological torture?—was a compelling motivation, by itself it is difficult to imagine how any sensitive person could repeatedly visit so much anguish on so many innocent people. Here again, Milgram's Nazi analogy is instructive. The bulk of the industrialized violence of the Holocaust was not random or disordered but embedded in a highly controlled context of ritual—a set of "enactments, materializations, [and] realizations" that, as Geertz has noted, help with the "overcoming of ambivalence as well as of ambiguity" (cited in Lifton, 1986, p. 432). Nazi death rituals were central to maintaining the psychological state of the SS involved. At the heart of this process were the selections, a medicalized ritual in which physicians would decide who would live and who would die, thereby conferring a perverse scientific imprimatur on the ghastly proceedings. Lifton (1986) has noted that selections were not simply a duty of these physicians, but a ritual connecting an ostensibly "medical" task to mythic ideals of racial purity and national survival. Selections were a "cultural performance [that] tended to absorb anxieties and doubts and fuse individual actions with prevailing Nazi concepts" (p. 432). The ritual also served as a

test of Nazi masculinity and hardness which, when passed, solidified the physician's relationship to the group (see also Kuhne, 2008). We have already seen that Milgram was troubled by doubts and anxieties concerning the intellectual significance and ethical dubiousness of the obedience study. Ritual served as a powerful propellant in this situation, just as it does in other contexts of torment. The obedience study was a scientific ritual writ large. Set in the sacred space of the laboratory, it was heavily scripted and the product of hours of preparation and rehearsals. Milgram employed numerous variations on the study and carefully recorded his data. Nothing was left to chance, and as assistant Alan Elms (1995) noted, Milgram was "the most well organized researcher I have ever encountered" (p. 24). The precise, exacting protocols of the study helped keep the attention of the researchers on technical procedures and away from human suffering. Ironically, the tendency for technique and protocol to obscure anguish was something that Milgram (1965) noted himself in the behavior of his participants. He wrote very perceptively of a "denial and narrowing of the cognitive field so that the victim is put out of mind" (p. 63). However, Milgram struggled to see how this very phenomenon insulated him from the anguish he had orchestrated for his own victims—the participants—something later noted by his colleague and eyewitness of the experiment, Lawrence Kohlberg:

I witnessed [the participants’] pain as a scientific observer through the mentality of the one-way vision mirror. That Milgram’s conclusions apply to myself I would be the last to deny. I could more dispassionately observe the suffering of the subjects through a one-way vision mirror, just as the subjects could continue to shock their assumed victims when separated by a screen from them. In this sense, Milgram’s belief in social-science “objectivity” operated as a false screen from the moral and personal understanding of the realities of the situation he created and allowed him to engage in a morally dubious experiment. (1974, p. 43)

As Kohlberg noted, the procedures and design of the study facilitated the routinization of torment. However, what elevated these actions to the level of ritual was their connection to a mythic ideal. Milgram and his team were not grim-faced brutes deceiving and tormenting purely for pleasure, but scientists seeking something transcendent and redeeming. On this point, the ideological context of Nazi medicine is a useful point of reference. Historian Anne Harrington (1996) has outlined the important role played by "visions of salvation and reform" in the working lives of Nazi physicians (p. 199). She referred to the case of SS Colonel Joachim Mrugowsky, chief of the Institute of Hygiene in the Waffen SS. Among his many crimes documented at Nuremberg in 1946 were the execution of Russian prisoners by poisoned bullets and a "research" program that involved injecting lethal tubercular bacilli into healthy people. As barbaric as these actions may seem, Mrugowsky did not see himself as a "sadistic" person, nor did he view his conduct as an over-zealous application of scientific objectivity—"science run amok." For Mrugowsky, these actions were an expression of an enlightened holistic biomedical perspective which sought to harness medical science to the highest forms of German culture. As noted previously, there was torment aplenty in the Obedience study, but beyond this experimentally induced "banality of anguish" lay something pure and noble—at least in the minds of those inflicting the extreme stress. Where SS men like Mrugowsky

were inspired by visions of a pure and "authentic" German Volk (Harrington, 1996), Milgram was beguiled by the promise of a pathway to the "real"—an understanding of the social world as it truly was. He was part of a post-war tradition in American social psychology that envisioned the psychology laboratory as being akin to a chemistry laboratory, a space where one could "condense the elements present when obedience occurs in the larger world such that the essential ingredients are brought into a relatively narrow arena where they are open to scientific scrutiny" (Milgram, 1962b). As Danziger (2000) has noted, the artificial conditions of the social psychological experiment were not seen as a weakness but as a way of attaining "empirical purity" (p. 343). Artifice allowed one to "demonstrate the unadulterated effect of singular manipulable variables" (p. 343). Convinced that the laboratory was an epistemologically privileged place, Milgram envisioned social psychology in the grandest of terms. Its "special mission" was to deliver humanity from darkness into light: "Social psychology at its best leads to a mature adjustment of our illusions, a revision of the fictions we harbor about human nature" (Milgram, n.d.c). What made this "mature adjustment" so exhilarating for Milgram was the prospect of acquiring "truth" about the social world, knowledge about human nature equal in sophistication and precision to our understanding of the physical world. On offer was an ordered, disciplined understanding of the social, a periodic table of psychological forces that would allow us, finally, to really know human nature.
“Ultimately, social psychology would like to have a compelling theory of situations,” Milgram (1965) wrote, “which will, first, present a typology of situations; and then point to the manner in which definable properties of situations are transformed into psychological forces in the individual” (p. 74). With this vision of something high and ennobling firmly in mind, the repeated application of “enhanced” stress techniques to hundreds of innocent people did not seem quite so squalid, at least not in broad outline. The purity of the vision rendered all the wretchedness and anguish into a more aesthetically pleasing and psychologically palatable form. The pain and suffering of the participant victims was valuable “data” that brought us that much closer to attaining ultimate social truth.

Conclusion

Historical analogies are always fraught—anything involving Nazism especially so. However, given the extraordinary deference afforded to the study and the still-common invocation of the Obedience paradigm as an “explanation” of all or part of the Holocaust, Milgram’s own role in the experiment bears far closer scrutiny than it has received to date. It should go without saying that Milgram is no more a “Nazi” than his innocent and entrapped participants were. However, it should also be clear by now that the production of torment on the scale undertaken during the Obedience experiments was no “routine affair,” or a set of “simple manipulations,” as Milgram (1974) would later disingenuously claim (p. 175). The production and management of anguish was a highly involved matter, and it was as much an ideological undertaking for the experimenters themselves as it was a technical process to be applied to others. To be successful, Milgram needed to be able to render the anguish of innocents as something normal and routine, something that was in fact “good for them” and for society as a whole.


The archival record clearly shows that Milgram had strong reservations about what he was doing while the Obedience study was running, but once removed from the immediacy of the moment, the contrition, regret, and misgivings that were sprinkled throughout his private notebooks disappeared from his written work. Publicly, the Milgram of post-Obedience study fame insisted that he had done nothing wrong. He remarked that he was “totally astonished” by the ethical criticisms of his work and by the “absence of any assumption of good will and good faith” (as cited in Evans, 1980, p. 4). Quick to condemn his confused and entrapped “obedient” participants as nascent Nazis, he never publicly examined his own willingness to “go all the way,” nor did he consider how scientistic idealism and the rituals of the psychology lab could render the indiscriminate application of extreme, life-threatening stress as “normal,” “necessary,” and “respectable.” Viewing the Obedience experimenters against the backdrop of Nazi medicine may strike some readers as jarring. An entire discipline, and indeed an entire society, has been largely shielded from the sordid details of the study and desensitized to the spectacle of an Ivy League psychologist tormenting unsuspecting American citizens. Until now, the received view has been that “ordinary” Americans who trusted the experimenter are the nascent Nazis, or “moral imbeciles” as Milgram (1961b) described them. The scientists who engineered the entire process were “innocent observers” simply trying to understand how this “Nazi-like” behavior arises. To suggest otherwise—to consider the Obedience paradigm as an ethically ambiguous space that frequently implicated the scientists themselves in the kind of routinized, callous behavior that they would subsequently condemn in others—is considered by some to be an affront to psychology.
Keen to protect the field’s reputation, Milgram’s defenders have brought forth a range of rhetorical devices, all of which are designed to shut down lines of inquiry that run counter to the received view. Critics of the Obedience research have their credentials questioned and are accused of being “angry” and engaging in “Milgram bashing” (see Elms, 2014; Tavris, 2013), a charge which is especially ironic since the most substantial criticisms of the Obedience research come from Milgram’s own unpublished notes. The unwillingness of many in psychology to look honestly and critically at all of the “players” in the Obedience experiment—participants and experimenters alike—is understandable. As Miller (2004) has noted, the Obedience paradigm confers considerable cachet on the field of social psychology while giving the discipline a role in discussions of the Holocaust, undoubtedly one of the most important events in human history. Any field would be reluctant to let go of such a well-known example of disciplinary prowess. It is just so much easier to indulge the old myths while loudly proclaiming the need for yet more Milgram-style laboratory artifice. Nevertheless, to the degree that psychology is a scholarly discipline and not a religion, it is important to confront new evidence—archival or otherwise—and where necessary to revise our understanding, even when it involves a figure of such renown. The preceding critical examination of how the torment of innocents was “normalized” for Milgram and his research associates is not intended as a refutation of the study’s intellectual utility, but as a challenge to the way it is typically read. My argument has been that the study’s impact and contemporary significance are greatest when it is inverted in such a way that the people who designed the torment regime and knowingly and actively participated in it—that is, Milgram and his collaborators—become the object of

interest. I think this focus is more relevant than the perspective that Milgram presented, which stressed the behavior of ordinary people who were lured into a supposedly benign environment, lied to, and then subjected to highly confusing and contradictory stimuli—a distressed “victim” and a calm experimenter telling them that everything was OK. This context is quite unlike any scenario where the Obedience study is so frequently deployed—Guantanamo, Abu Ghraib, or the Nazi death camps. The guards, physicians, and military psychologists in these facilities were not tricked into torture and killing; they were very much aware, and generally approving, of the torment they were inflicting (see Fenigstein, 2015; Mastroianni, 2015). To the degree that there is any parallel at all between these scenarios and a social psychology laboratory, the connection is with the thoughts and behavior of Milgram and his associates, who knew of the anguish they were visiting on innocent people but continued regardless, enmeshed in contexts of scientific ritual, research protocols, and a conviction that the torment was justified by a “higher” moral purpose.

Acknowledgements

I would like to thank Augustine Brannigan and Fran Cherry for their helpful comments on this manuscript.

Funding

The author(s) received no financial support for the research, authorship, and/or publication of this article.

References

Abse, D. (1973). The dogs of Pavlov. London, UK: Vallentine, Mitchell.
Alkadry, M., & Witt, M. (2009). Abu Ghraib and the normalization of torture. Public Integrity, 11, 135–153.
American Psychological Association. (1959). Ethical standards of psychologists. American Psychologist, 14, 279–282.
Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram’s “behavioral study of obedience”. American Psychologist, 19, 421–423.
Benjamin, L., & Simpson, J. (2009). The power of the situation: The impact of Milgram’s obedience studies on personality and social psychology. American Psychologist, 64, 12–19.
Blass, T. (2004). The man who shocked the world: The life and legacy of Stanley Milgram. New York, NY: Basic Books.
Blass, T. (2009). From New Haven to Santa Clara: A historical perspective on the Milgram obedience experiments. American Psychologist, 64, 37–45.
Brannigan, A. (2013). Beyond the banality of evil. Oxford, UK: Oxford University Press.
Brannigan, A., Nicholson, I., & Cherry, F. (Eds.). (2015). Unplugging the Milgram machine [Special issue]. Theory & Psychology, 25(5).
Burger, J. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64(1), 1–11.
Burger, J. (2011). Alive and well after all these years. The Psychologist, 24(9), 654–657.
Danziger, K. (2000). Making social psychology experimental: A conceptual history. Journal of the History of the Behavioral Sciences, 36, 329–347.
Elms, A. (1995). Obedience in retrospect. Journal of Social Issues, 51(3), 21–31.


Elms, A. (2009). Obedience lite. American Psychologist, 64, 32–36.
Elms, A. C. (2014). Contra Milgram [Review of the book Behind the shock machine, by G. Perry]. PsycCRITIQUES, 59(11). Retrieved from http://dx.doi.org/10.1037/a0036233
Evans, R. (1980). The making of social psychology. New York, NY: Gardner Press.
Fenigstein, A. (1998). Were obedience pressures a factor in the Holocaust? Analyse & Kritik, 20, 54–73.
Fenigstein, A. (2015). Milgram’s shock experiments and the Nazi perpetrators: A contrarian perspective on the role of obedience pressures during the Holocaust. Theory & Psychology, 25, 581–598. doi:10.1177/0959354315601904
Gibson, S. (2013). Milgram’s obedience experiments: A rhetorical analysis. British Journal of Social Psychology, 52, 290–309.
Goldhagen, D. (1996). Hitler’s willing executioners. New York, NY: Knopf.
Harrington, A. (1996). Unmasking suffering’s masks: Reflections on old and new memories of Nazi medicine. Daedalus, 125, 181–205.
Helm, C., & Morelli, M. (1979). Stanley Milgram and the obedience experiment. Political Theory, 7(3), 321–345.
Herrera, C. (2013). Stanley Milgram and the ethics of social science research. Theoretical and Applied Ethics, 2(2), vii–viii.
Höss, R. (1992). Death dealer: The memoirs of the SS Kommandant at Auschwitz. New York, NY: Da Capo Press.
Kohlberg, L. (1974, March 24). More authority. New York Times Book Review, 42–43.
Kuhne, T. (2008). Male bonding and shame culture: Hitler’s soldiers and the moral basis of genocidal warfare. In O. Jensen & C.-C. Szejnmann (Eds.), Ordinary people as mass murderers (pp. 55–77). New York, NY: Palgrave.
Lifton, R. (1986). The Nazi doctors. New York, NY: Basic Books.
Mastroianni, G. R. (2015). Obedience in perspective: Psychology and the Holocaust. Theory & Psychology, 25, 657–669.
Milgram, S. (1961a, June 27). Letter to Alan Elms. Stanley Milgram Papers (Series II, Box 43, Folder 127). Yale University Archives, New Haven, CT.
Milgram, S. (1961b, September 21). Letter to Henry Riecken. Stanley Milgram Papers (Series II, Box 43, Folder 127). Yale University Archives, New Haven, CT.
Milgram, S. (1962a). Extending the field of observation. Stanley Milgram Papers (Series II, Box 163, Folder 46). Yale University Archives, New Haven, CT.
Milgram, S. (1962b, August). Note on methods. Stanley Milgram Papers (Series II, Box 46, Folder 174). Yale University Archives, New Haven, CT.
Milgram, S. (1962c, August). Note to self – Ethics of experimentation. Stanley Milgram Papers (Series II, Box 46, Folder 173). Yale University Archives, New Haven, CT.
Milgram, S. (1962d). Obedience notebook. Stanley Milgram Papers (Series II, Box 46, Folder 163). Yale University Archives, New Haven, CT.
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal & Social Psychology, 67, 371–378.
Milgram, S. (1965). Some conditions of obedience and disobedience to authority. Human Relations, 18, 57–76.
Milgram, S. (1974). Obedience to authority: An experimental view. London, UK: Tavistock.
Milgram, S. (1992). The individual in a social world: Essays and experiments (2nd ed.). New York, NY: McGraw-Hill.
Milgram, S. (n.d.a). An experimenter’s dilemma, unpublished note. Stanley Milgram Papers (Series III, Box 62, Folder 126). Yale University Archives, New Haven, CT.


Milgram, S. (n.d.b). Report to memory project subjects. Stanley Milgram Papers (Series II, Box 163, Folder 46). Yale University Archives, New Haven, CT.
Milgram, S. (n.d.c). Unpublished note. Stanley Milgram Papers (Series III, Box 59, Folder 81). Yale University Archives, New Haven, CT.
Miller, A. G. (1986). The obedience experiments: A case study of controversy in social science. New York, NY: Praeger.
Miller, A. (2004). What can the Milgram obedience experiments tell us about the Holocaust? In A. Miller (Ed.), Social psychology of good and evil (pp. 193–239). New York, NY: Guilford.
Miller, A. (2009). Reflections on “Replicating Milgram”. American Psychologist, 64, 20–27.
Miller, A., Collins, B., & Brief, D. (1995). Perspectives on Obedience to Authority: The legacy of the Milgram experiments. Journal of Social Issues, 51, 1–19.
Mixon, D. (1972). Instead of deception. Journal for the Theory of Social Behavior, 2, 145–174.
Nicholson, I. (2007). Baring the soul: Paul Bindrim, Abraham Maslow, and “nude psychotherapy”. Journal of the History of the Behavioral Sciences, 43(4), 337–359.
Nicholson, I. (2011a). “Shocking” masculinity: Stanley Milgram, “Obedience to Authority,” and the crisis of manhood in Cold War America. ISIS, 102, 238–268.
Nicholson, I. (2011b). “Torture at Yale”: Experimental subjects, laboratory torment, and the “rehabilitation” of Milgram’s “obedience to authority”. Theory & Psychology, 21, 737–761. doi:10.1177/0959354311420199
Nobile, P. (1974, March 10). Uncommon conversations: You and your Auschwitz quotient [Radio interview with Stanley Milgram]. Stanley Milgram Papers (Series II, Box 43, Folder 128). Yale University Archives, New Haven, CT.
Paashaus, J. (1974, July 7). Letter to Stanley Milgram. Stanley Milgram Papers (Series III, Box 61, Folder 113). Yale University Archives, New Haven, CT.
Patten, S. (1977a). The case that Milgram makes. Philosophical Review, 86, 350–364.
Patten, S. (1977b). Milgram’s shocking experiments. Philosophy, 52, 425–440.
Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. New York, NY: The New Press.
Proctor, R. (2000). Nazi science and Nazi medical ethics: Some myths and misconceptions. Perspectives in Biology and Medicine, 43, 335–346.
Raz, M. (2013). Alone again: John Zubek and the troubled history of sensory deprivation research. Journal of the History of the Behavioral Sciences, 49, 379–395.
Reaction of subjects. (1962). Stanley Milgram Papers (Series II, Box 44). Yale University Archives, New Haven, CT.
Reicher, S., & Haslam, A. (2011). The shock of the old. The Psychologist, 24(9), 650–652.
Reicher, S., Haslam, A., & Miller, A. (2014). What makes a person a perpetrator? The intellectual, moral, and methodological arguments for revisiting Milgram’s research on the influence of authority. Journal of Social Issues, 70, 393–408.
Russell, N. (2010). Milgram’s obedience to authority experiments: Origins and early evolution. British Journal of Social Psychology, 49, 1–23.
Russell, N. (2014). Stanley Milgram’s obedience to authority “relationship condition”: Some methodological and theoretical implications. Social Sciences, 3, 194–214.
Schuller, H. (1982). Ethical problems in psychological research. New York, NY: Academic Press.
Stam, H., Lubek, I., & Radtke, H. L. (1998). Repopulating social psychology texts: Disembodied “subjects” and embodied subjectivity. In B. Bayer & J. Shotter (Eds.), Reconstructing the psychological subject: Bodies, practices and technologies (pp. 153–186). London, UK: Sage.


Stark, L. (2010). The sciences of ethics: Deception, the resilient self, and the APA Code of Ethics, 1966–1973. Journal of the History of the Behavioral Sciences, 46, 337–370.
Subjects’ conversation. (1963, February 28). Stanley Milgram Papers (Series II, Box 44). Yale University Archives, New Haven, CT.
Tavris, C. (2013, September 6). Book review [Review of the book Behind the shock machine: The untold story of the notorious Milgram psychology experiments, by G. Perry]. The Wall Street Journal. Retrieved from http://online.wsj.com/news/articles/SB10001424127887323324904579040672110673420

Author biography

Ian Nicholson is Professor of Psychology at St. Thomas University, Fredericton, New Brunswick, and Editor of the Journal of the History of the Behavioral Sciences.


Article

Theory & Psychology 2015, Vol. 25(5) 657–669
© The Author(s) 2015
DOI: 10.1177/0959354315608963

Obedience in perspective: Psychology and the Holocaust

George R. Mastroianni
US Air Force Academy

Abstract

Stanley Milgram’s explanation of the Holocaust in terms of the mechanism of obedience is too narrow. While obedience was one mechanism that contributed to the outcome, the murder of Jews and others was the work of people from a broad swath of German society, from economists who planned mass starvation to ordinary soldiers in the Wehrmacht, often acting without duress or apparent pressures to conform. Psychologists should not ask “why?” the Holocaust occurred, but “how?” Much behavior of perpetrators, bystanders, victims, and instigators can be understood as the consequence of normal mechanisms of perception, learning, socialization, and development. What made genocide possible was not the transitory conditions created in a lab in a few hours but a complex of mechanisms that are the product of generations of human experience and of elaborate rational, emotional, and logical justifications. Understanding this requires a future psychology more complex than the narrow situationist focus on obedience.

Keywords genocide, Holocaust, Milgram, obedience, situationism

Stanley Milgram’s studies of obedience, conducted in 1961 at Yale University (Milgram, 1963, 1974), are discussed in every introductory psychology and social psychology textbook, and are almost always explicitly linked to understanding perpetrator behavior in the Holocaust. For most people, especially those who only encounter psychology through textbooks, what psychology has to say about the Holocaust is what psychology has to say about obedience.

Corresponding author: George R. Mastroianni, US Air Force Academy, Department of Behavioral Sciences and Leadership, DFBL, USAFA, CO 80840, USA. Email: [email protected]


Textbooks are not the best place to look for late-breaking disciplinary news, however, and the status of the Milgram studies as the accepted psychological explanation for the Holocaust is changing (Cesarani, 2004; Lipstadt, 2011). While the validity of the Milgram studies as a model of perpetrator behavior has been challenged in the past, there is now a growing consensus that at most a subset of perpetrators appear to resemble the participants in Milgram’s studies. Indeed, Thomas Blass concludes that:

Milgram’s approach does not provide a wholly adequate account of the Holocaust. Both the laboratory evidence and the historical details of the destruction of European Jewry raise questions about the degree of fit between Milgram’s conceptual model of obedience to authority and the actuality of the Holocaust. Clearly, there was more to the genocidal Nazi program than the dispassionate obedience of the average citizen who participated in the murder of his fellow citizens who were Jewish out of a sense of duty not malice. At the same time, it could not have succeeded to the degree that it did without the passive or active complicity of Everyman. While Milgram’s approach may well account for their dutiful destructiveness, it falls short when it comes to explaining the more zealous hate-driven cruelties that also defined the Holocaust. (2002, pp. 103–104)

This growing recognition that there is more work to be done in understanding the Holocaust from the viewpoint of psychology is partly the result of developments in other fields, most notably history (Browning, 1992; Goldhagen, 1996; Hilberg, 1961/2003). Growing historical distance from the Third Reich has brought new generations of scholars to the task of understanding the events of the Nazi years. These more recent scholars have added a great deal to the picture that emerged from the immediate post-war years. The refrain “I was only obeying orders” heard so often at Nuremberg had shaped a view of the Holocaust that was hierarchical and bureaucratic (Gilbert, 1947, 1950). Attention was focused initially on the 21 who sat in the dock at the first Tribunal, and on the organizations, such as the NSDAP and the SS, that were the immediate instruments of incitement against the regime’s victims and of the implementation of its eventually exterminatory policies, respectively.

The picture of the Holocaust that has emerged from the scholarship of the last few decades implicates a much wider panorama of German society. Individuals and institutions thought to have had little or nothing to do with the Final Solution and the events leading up to it are now known to have been deeply involved. The Wehrmacht was extensively engaged in the killings in the East after the invasion of the Soviet Union (Bartov, 2001). The supporting cast for the destructive drama that unfolded in German society included some motivated by ideology to support anti-Jewish measures, others by ambition, greed, or even more base motives. Certainly some were obeying orders.

Paradigm shifting

Stanley Milgram once stated that:

on the basis of having observed a thousand people in the experiment and having my own intuition shaped and informed by these experiments, that if a system of death camps were set up in the United States of the sort we had seen in Nazi Germany, one would find sufficient personnel for those camps in any medium-sized American town. (Milgram, 1979, 2nd quote)


Philip Zimbardo’s Stanford Prison Study (Haney, Banks, & Zimbardo, 1973) seemingly confirmed in another way the idea that ordinary people could easily be transformed into evildoers by situations. The rhetorical power of the Milgram and Zimbardo studies lies in part in the apparent ease and rapidity with which ordinary people can be transformed into brutes and even killers. The entire experience of the Milgram experiment for individual research participants began and ended in a few hours. The Zimbardo study lasted only six days. The seeming ordinariness of Milgram’s participants and the random assignment of Zimbardo’s participants as prisoners or guards, coupled with the almost instantaneous transformation of these ordinary folk effected by the experiment, appeared at a stroke to moot discussion of ideology, or of long-term historical, political, economic, or cultural factors, in producing such behavior.

Just as B. F. Skinner had claimed to eliminate the necessity for tiresome discussions of internal motivations and rationales for explaining human behavior by appealing to nothing more than the explanatory power of reinforcement histories, so too the situationists sought to eliminate the necessity for discussions of ideology, morality, and belief with the explanatory power of the situation. On the situationist account, it simply does not take years of exposure to pernicious propaganda or authoritarian child-rearing and educational practices or ugly beliefs about other people to get ordinary citizens to abuse or even kill other citizens: it just takes a few minutes and the right (wrong) situation. In fact, Milgram’s claim that a system of Nazi-like death camps could be readily staffed by contemporary American citizens raises an important question: If it is so easy to harness the power of destructive obedience latent in all of us, why haven’t we seen more examples outside social psychology laboratories?
There appears to be no shortage of people with evil intent and access to introductory psychology texts in contemporary society. Even recent replications of the Milgram study produce compliance rates very similar to those found by Milgram (Burger, 2009), so it is not the case that general awareness of these mechanisms has reduced their potential power. And yet it is only in rare and highly specialized circumstances, such as the atrocities at My Lai or the abuse of detainees at Abu Ghraib, that the putative power of situational forces to lead to destructive behavior is invoked. Even in these rare instances, the obedience and/or conformity explanation has not fared well as a legal strategy. Given that defense attorneys are practical people, one must conclude either that obedience and conformity do not play as great a role in everyday life as the situationists might have us think, or that the obedience defense is not especially plausible to judges and juries.

Philip Zimbardo’s energetic and persistent identification of the abuses visited upon Iraqi detainees in the fall of 2003 by a small group of American soldiers as real-life instances of the kind of behavior occurring in his Stanford Prison Study is a case in point (Zimbardo, 2007). Zimbardo claimed that the soldiers who committed the abuses were exemplary, outstanding soldiers who were transformed by a corrosive situation and induced to treat detainees cruelly. He also claimed that the abuses were the result of the “migration” of enhanced interrogation techniques, specifically from Guantanamo Bay, that were applied to the detainees in the now-famous photographs that received worldwide publicity in the wake of the abuses. Just as Milgram’s focus on obedience does not square with the historical truth about the Holocaust, neither does Zimbardo’s focus on role-specific behavior square with the reality of what happened at Abu Ghraib. The soldiers who committed the abuses at Abu

Downloaded from tap.sagepub.com at UNIV CALGARY LIBRARY on December 1, 2015 660 Theory & Psychology 25(5)

Ghraib were not randomly assigned to their roles as guards, as were Zimbardo’s research participants, and there is ample evidence that the most serious abuses were committed by individuals with significant histories of deviant and aggressive behavior. Moreover, the specific abuses that were prosecuted did not take place in the context of interrogation. Virtually none of the Iraqi detainees depicted in the infamous photographs were ever interrogated, as they were not suspected terrorists, but instead were common criminals or innocent Iraqis who had been caught up in massive sweeps by American troops (Mastroianni, 2013). Philip Zimbardo himself offered expert testimony at the sentencing hearing of one of the Abu Ghraib defendants (Graveline & Clemens, 2010), and Stjepan Mestrovic (2007) offered similar situationist-based testimony at the trial of another of the defendants. In neither case did the situationist defense appear to be very effective on behalf of the defendants. This is in marked contrast to the reception Zimbardo’s interpretation has received among psychologists and the general public, where it has been widely accepted: it is to be found virtually unquestioned in most psychology textbooks. Why do judges and juries see things so differently from psychology textbook authors and the lay public? Perhaps because judges and juries are exposed to the facts of these cases more thoroughly and completely than textbook authors and journalists explain them.

The fact pattern associated with Abu Ghraib simply does not support a narrowly situationist explanation of these events. Like it or not, there were dispositional variables that played a role: some of the perpetrators had a history of deviant and aggressive behavior, and had these people not been present, events almost certainly would have unfolded very differently.
Some of the individuals present behaved admirably and attempted to report the abuses as they happened, so clearly the situation was not so powerful as to be anything close to universally compelling. This is not to say that contingent situational factors played no role in what happened. The failure of unit leaders at every level to establish and sustain a proper leadership climate conducive to good behavior among all members of the unit was identified as contributory by every investigation conducted on Abu Ghraib. One of the lessons to be drawn from Abu Ghraib, and one that I emphasize in my own teaching on leadership, is that good leadership establishes conditions that make it easier for soldiers to do the right thing rather than the wrong thing. At Abu Ghraib, it was all too easy to do the wrong thing on the night shift in the hard site: poor leadership opened wide a door through which some soldiers chose to walk. Good leadership leaves fewer doors ajar, and also promotes and enables better outcomes when soldiers are confronted with difficult choices.

A lack of detailed knowledge about events such as Abu Ghraib or the Holocaust may explain why so many people readily accept the situationist explanation of these events. Many people, including many psychologists, have only the most superficial acquaintance with the details of what actually happened during the Holocaust. This can be explained by three things. First, the public understanding of the Holocaust was shaped by an incomplete and sometimes inaccurate account emerging from the Nuremberg Tribunals following the defeat of Germany in World War II and the trial of Adolf Eichmann in Jerusalem in 1961 (Arendt, 1963). Second, historical scholarship has continued to expand and refine our understanding of the Holocaust until the present day, and many people have simply not kept up with these new findings. Third, Stanley Milgram

told psychologists that historical context should be ignored. In his reply to Diana Baumrind’s (1964) critical analysis of the obedience studies, Milgram argued that:

Baumrind mistakes the background metaphor for the precise subject of the investigation. The German event was cited to point up a serious problem in the human situation: the potentially destructive effect of obedience. But the best way to tackle the problem of obedience, from a scientific standpoint, is in no way restricted by “what happened exactly” in Germany. What happened exactly can never be duplicated in the laboratory or anywhere else. The real task is to learn more about the general problem of destructive obedience using a workable approach. Hopefully, such inquiry will stimulate insights and yield general propositions that can be applied to a wide variety of situations. (1964, p. 851)

Of course, it does matter “what happened exactly” in Germany, because if what happened exactly in Germany does not resemble what happened exactly in Milgram’s laboratory, then the insights emerging from the latter may or may not be relevant to the former, any more than they may be relevant to any other historical event. In fact, it is precisely our new knowledge of what happened exactly in Germany and occupied Europe and Russia that has eroded support for Milgram’s identification of destructive obedience as a primary explanatory tool for the behavior of Holocaust perpetrators. Newman and Erber [following Blass] aver that:

Milgram might have shed light on an interesting aspect of human behavior, but the phenomenon he studied might have little to do with what happened to the victims of the Holocaust or with the behavioral dynamics involved in any episode of genocide. Indeed, the idea that all, most or even many of the acts of cruelty perpetrated during the Holocaust were carried out by people who were grimly following orders is remarkably easy to disprove. (2002, p. 335)

None of the foregoing is meant to suggest that laboratory experiments must exactly replicate the natural phenomena we wish to study in order for them to be valid or interesting. It is important, however, to be careful about generalizing the findings from simplified situations constructed in a laboratory to more complicated real-world interactions. Stanley Milgram ran two dozen variations of his obedience experiment. In one variation, all or nearly all of the participants complied with the experimenter’s instructions. In others, none or almost none did. That range of compliance perhaps more faithfully reflects the historical record than the obsessive focus on the 62.5% who complied in the voice-feedback condition most commonly cited in discussions of the Holocaust (Perry, 2013).

However slow textbooks may be to recognize the fact, the paradigm is shifting when it comes to psychological thinking about the Holocaust and genocide. While obedience has not been abandoned as an explanatory mechanism for the behavior of Holocaust perpetrators, a process is underway which promises to displace it from center stage and assign it a more modest supporting role.

Toward a more complete psychology of the Holocaust

Psychological theorizing about the Holocaust has built on Milgram’s situationist approach but remains mainly located in social psychology. James Waller (2002), Roy


Baumeister (1999), and Ervin Staub (2011) have offered theoretical accounts of the psychology of genocide. These three theories usefully expand the psychology of the Holocaust somewhat beyond the narrow situationist approaches of Milgram and Zimbardo.

James Waller’s 2002 book, Becoming Evil: How Ordinary People Commit Genocide and Mass Killing, offers a psychological model intended to explain “how ordinary people commit extraordinary evil.” Waller further limits the scope of his inquiry to the rank-and-file killers:

I am not interested in the higher echelons of leadership who structured the ideology, policy, and initiatives behind a particular genocide or mass killing. Nor am I interested in the middle-echelon perpetrators, the faceless bureaucrats who made implementation of those initiatives possible. (p. 14)

Limiting the explanatory target in this way narrows the kinds of explanations one is likely to find. Waller’s focus on low-level perpetrators seemingly favors the identification of mechanisms of coercion as important while downplaying the roles of ideology and belief likely to animate the actions of the instigators, leaders, and bureaucrats who played such an important role in the Holocaust. The explanation of extraordinary human evil outlined by Waller includes four primary elements: (a) our ancestral shadow, (b) the identities of the perpetrators, (c) a culture of cruelty, and (d) social death of the victims. While Waller does allow for cultural and educational histories that might predispose some to genocidal behavior more than others, his theory relies on biological and social mechanisms that tend to universalize the potential for genocidal behavior and downplay contextual historical factors. Waller argues that situational forces are so powerful that “any deed that perpetrators of extraordinary evil have ever done, however atrocious, is possible for any of us to do – under particular situational pressures” (2002, p. 228). Missing from this account is an understanding of why particular situational pressures might not have the same result in different historical, cultural, economic, or political contexts.

The universalizing impulse central to the kind of model-making undertaken by Waller glosses over the fact that situational pressures simply are not as powerful as situationists would have us believe: it is not the case that it is possible for any of us to perform any atrocious deed ever done by perpetrators of extraordinary evil. Most Italian Jews survived German occupation, while most Dutch Jews perished under German occupation. Were the “situations” created by Nazi occupation that different in the two countries?
Or were the cultural, historical, and ideological differences between these two societies more important than the situational similarities?

Roy Baumeister’s psychological explanation of genocide is based on four “roots of evil.” These are idealism, threatened egotism, instrumentalism, and sadism. Of these, Baumeister sees idealism and threatened egotism as the primary factors relevant to the explanation of the Holocaust. The architects of the Holocaust did not see the enterprise in which they were engaged as primarily destructive or anti-social. Rather, they viewed their mission in world-historic terms. The Nazi reordering of Europe, with its attendant dislocation, resettlement, and eventual extermination of millions of human beings, was

undertaken to secure a brighter future: an idealistic goal. Baumeister argues that the (to the Nazis) noble end of creating a brighter future for Germans and Germany, and indeed the world as a whole, was understood by them as justifying the horrific means that were eventually employed to achieve that goal (Baumeister, 1999).

Ervin Staub has written extensively about genocide and mass violence, including the Holocaust, but he has also worked tirelessly in both prevention and reconciliation efforts around the world. His theorizing is thus informed by his experience on the ground in Rwanda and Congo, as well as by the historical record of other genocides. Staub’s theory begins with difficult life conditions. Societies confronting economic or political upheaval are more prone to the development of mass violence. Difficult life conditions frustrate basic human needs: needs for security, self-esteem, and control, for example. Attempts by groups or individuals to explain and address these difficult conditions can, given the nature of human and individual psychology, operate to promote intergroup hostility and violence (Staub, 2011).

Many countries with histories of inter-group competition or conflict confront difficult life conditions without descending into genocide. What characteristics of societies make them more prone to such a reaction? Staub identifies cultural devaluation, authority orientation, cultural factors, an aggressive past, and the lingering effects of past victimization as risk factors.

Staub (2011) points out that dispositional explanations implicating the authority orientation or obedient character of individuals fall short in explaining the rise and initial acceptance of the Nazi movement, but that such explanations may be relevant to understanding those most immediately responsible for the physical destruction of Jews.
Staub cites evidence that members of the SS may have had a stronger authority orientation than most people. In addition to personality factors that may have played a role, ideological commitment was clearly important in many of those most responsible for the killing. Finally, the vital role played by particular leaders, such as Hitler, in enabling and organizing violence cannot be ignored.

Staub’s approach usefully combines sensitivity to the historical and political factors at a societal level that may enhance the risk of genocidal behavior in a country with an awareness of the individual and social psychological mechanisms that may operate to elevate or diminish the risk of such behavior in a particular case.

Do we need a psychological theory of genocide?

Psychological theorizing about genocide is complicated by a fundamental problem: the role of free will in a deterministic explanatory framework. The seemingly exculpatory nature of scientific explanations of perpetrator behavior has vexed many psychologists who do not wish to be interpreted as excusing genocidal behavior. Psychology has challenged our common-sense understanding of why we do what we do, and offered alternate causal frameworks rooted in a changing set of explanatory constructs that have in common only that they are not our conscious thoughts and beliefs (Baumeister, Masicampo, & Vohs, 2011). For Freud our behavior was driven by unconscious motivations; for Skinner, by our reinforcement histories; for Milgram, the situation. While many would agree that our internal thoughts and motivations are far from the whole story when it comes to human agency, the issue of free will matters because only a person who is free

to choose an act can be blamed for that act. Surely there are constraints and limitations on our freedom to act in specific contexts, but we must admit some level of freedom if our actions are to be judged morally.

Invalidation of individual volition has two unhappy consequences. First, it renders moral judgments of behavior problematic. Psychologists often assert that their deterministic explanations of genocidal behavior should not be construed as exculpatory (Miller, Buddie, & Kretschmar, 2002), but this is wishful thinking: insofar as factors outside an individual are thought to cause particular behaviors, those behaviors cannot be subjected to the same moral calculus as voluntary actions. Second, it deflects attention away from the very beliefs and ideas that seem to play such an important role in genocide. Psychologists have become so attentive to and focused on instances of behavior in which our conscious ideas, beliefs, and motivations have somehow been disengaged or overridden that it has seemingly become difficult for many to accept that some, maybe much, human behavior actually is a result of our thoughts, ideas, beliefs, and feelings.

William James grappled with the question of free will in an 1884 lecture later published as an essay entitled “The Dilemma of Determinism” (James, 1886/1992). James was troubled by the morally exculpatory consequences of so-called hard determinism: determinism that sees every action as the inevitable and accidental consequence of all the accumulated billiard-ball-like actions that have previously inevitably and accidentally occurred. James argued that if one accepts human actions as following this same rigid model of causality, it is impossible to praise or blame the actor for them.
James was unwilling to accept the personal and social consequences of a world without praise or blame, and adopted the pragmatic solution of simply choosing to believe in choice: in free will. He did so with eyes wide open as to his inability to articulate a coherent philosophical or scientific basis for such a belief. Many modern psychologists, including some social psychologists who address themselves to matters such as genocide, find themselves in the same quandary. Few, however, are as willing as was James to admit and embrace a contradiction that lies (mostly) quietly beneath much of what we do as psychologists committed to a scientific epistemology.

Aside from the thorny issue of the moral assessment of the behavior of perpetrators, theories of genocide confront other obstacles. The data available to develop and test theories of genocide are naturally very limited. There are problems of definition, at the outset, and then serious limitations in obtaining and interpreting empirical evidence of what transpires during chaotic and violent upheavals. Memories may fade or be unknowingly or deliberately distorted.

One way to approach the explanation of genocide from a psychological viewpoint is to acknowledge in advance that psychology is ill-positioned to address the issue of why genocide occurs. We may want or think we ought to have a “theory” of the Holocaust, or of genocide more broadly, but do we really need one? Genocides apparently occur because of a combination of contingent factors, many of which lie in the domains of other disciplines, such as political science, economics, and history. Barbara Harff (2003) has built an empirical model based on political, economic, and historical variables that correctly detects many, though not all, instances of genocide since 1955. This approach is used in the context of prediction and prevention of genocide. Particular historical, political, and economic conditions at the societal level may be correlated with

psychological profiles that can help to further refine our understanding of when genocide is most likely to occur.

Perhaps genocidal behavior is not psychologically extraordinary in any meaningful sense, but is readily explicable in terms already familiar to psychologists because such behavior arises from the very same mechanisms we use to explain other behavior. We don’t have a special theory to explain slavery or Jim Crow: these lamentable examples of human behavior are consistent with what we already know about prejudice, stereotyping, and so on. What psychologists are less able to explain is why slavery was eventually abolished in the United States and Jim Crow has gradually given way to sometimes halting but generally positive progress in achieving racial equality. The answer to why this happened is to be found not in psychology but in history, political science, and economics, among other disciplines. Similarly, we don’t look first to psychologists to explain why particular wars or conflicts occur, though psychologists have contributed a great deal to our understanding of human behavior in wars.

Psychologists are similarly not in the best position to answer the “why?” question about the Holocaust, but we are superbly equipped to answer the “how?” question. That is, we psychologists know a great deal about the determinants of human behavior—biological, social, psychological—and once we accept that our burden is limited to explaining how humans placed in circumstances we might label “genocidogenic” come to commit the behaviors we label genocide, our task is much easier. Why they were in such conditions in the first place is not necessarily squarely within the psychological portfolio.
Moreover, “genocide” is a term fraught with definitional ambiguities and subject to political influences: agreement is far from universal on what events deserve that label, and there are sometimes political pressures to apply or withhold that label in light of the international legal consequences of invoking the term.

While it might be difficult for psychologists to abandon the field to other disciplines, think of it this way: the answer, “obedience,” is demonstrably unsatisfactory and incomplete as a response to the question, “Why did the Holocaust happen?” It is hard to take seriously Milgram’s claim that the Holocaust could easily be reproduced in any medium-sized American city, simply because the (putative) mechanism of destructive obedience exists. But in response to the question “How did the Holocaust happen?,” “obedience” is one very sensible element of a psychological response that would include other psychological mechanisms as well.

Toward a more complete psychology of the Holocaust

It is time to expand the terms of psychological discussion of the Holocaust. Rather than focusing on explanations based on finding abnormal people (characteristic of the early, dispositional era of explanation), or on finding mechanisms that make normal people act abnormally (the situationist era), or quibbling about how much each of these approaches contributes, we should instead seek to understand much of the behavior that enabled and constituted the Holocaust as normal people acting normally, or perhaps as ordinary people acting ordinarily.

This is not to suggest that psychopathology or evil intent or malign social influence play no role in genocide and mass violence: they surely do. But much behavior of

perpetrators, bystanders, victims, and instigators can be understood as the consequence of normal mechanisms of perception, learning, socialization, and development operating just as we might expect. Even though behavior may occur in the context of a very special event, the behavior itself can still be quite ordinary. It is not uncommon to detect in postwar testimony of perpetrators a kind of puzzlement as to what all the excitement is about, suggesting that at the time the events seemed to be unremarkable (Browning, 1992, p. 72). Much of the Holocaust, staggering in its scale and barbarity when seen as a unitary event, was unexceptional to perpetrators in its pieces and parts. The mechanisms that produced these behaviors may also be unexceptional. Why do we need a theory of unexceptional behavior?

The approach suggested here is intended to continue the current trajectory of explanatory expansion and draw broadly on other areas of modern psychology to better understand behavior in the Holocaust. If we could go back to a time before psychology’s position on the Holocaust had evolved as it has, assemble a group of psychologists from a wide swath of the discipline, and ask them what psychology can contribute to understanding the behavior of Holocaust perpetrators, what range of answers might we have on offer? What might experimental psychologists, cognitive psychologists, developmental psychologists, biopsychologists, experts on memory and perception, political psychologists, and the many more psychologists represented by the dozens of Divisions of the American Psychological Association, as well as the Association for Psychological Science, have to say on the topic?
Of course the Holocaust had a social dimension, but it was enacted by individuals, and surely there is something of value to be learned by treating them as individuals with psychologically relevant personal histories, and not solely as subjects of others’ influence or orders, or as creatures shaped by evolution to devalue outgroups.

The extant psychology of the Holocaust is focused at the social level: intergroup conflict, interpersonal influence. Individual psychological factors have received only fragmentary and incomplete treatment. Social categorization is often implicated as a fundamental mechanism underpinning the kind of stereotyping, prejudice, and discrimination that can lead to mass violence. But social categorization and its unfortunate sequelae are in part, at least, an expression of more basic adaptations that economize cognitive effort at the expense of complete accuracy in our understanding of our world. Fiske and Taylor (1991) used the term “cognitive miser” to describe our tendency to use heuristics and other shortcuts to achieve a good-enough solution to most problems we confront. Heuristics and cognitive biases, such as the availability and representativeness heuristics, are natural economies of cognition that can be shown to fail spectacularly in certain circumstances, often (not always) circumstances created especially for the purpose by psychologists in a laboratory. But the fact that these mechanisms exist must mean that they were once, and perhaps still are, adaptive: there must also be occasions on which cognitive shortcuts produce the right answer, or a good-enough answer.

A poignant postwar insight into the origins of one kind of thinking that contributed to the Holocaust can be found in the musical South Pacific. When Joe Cable sings “You’ve Got to Be Carefully Taught” he touches on something profoundly important: prejudice is not an inevitable aspect of human life.
The implication in the song is that racial prejudice is something that is contingent, something that arises when purposely nurtured in a particular cultural environment. Nellie Forbush, otherwise the very archetype of wholesome American Midwestern purity and goodness, is afflicted with and

deeply conflicted by the prejudice she has learned at her parents’ knees. It is worth remembering that concern over the perceived approval of race-mixing implicit in “You’ve Got to Be Carefully Taught” threatened the viability of South Pacific in the same America that had helped defeat Nazi Germany only four years before (South Pacific, n.d.).

Milgram and Zimbardo are correct when they suggest that the potential for evil behavior is not confined to a small, deviant sub-group of humanity. They are wrong when they narrowly locate that potential in mechanisms like obedience and role conformity. These mechanisms are an adequate explanation for some small fraction of Holocaust perpetrators, but what about the economists who earnestly planned the starvation of millions of people, or those who stood by as their neighbors were rounded up and then looted their belongings? What stands between an ordinary American (or contemporary German) from a medium-sized town and a concentration camp guard is a great deal more than a few hours or days in a social psychology laboratory. Perhaps if genocide were mainly produced by mechanisms like those engaged at Yale or Stanford, it would be simpler to prevent. But the sad truth is that what makes genocide possible often seems to be the product of generations of human experience, of elaborate rational, emotional, and logical justifications that cannot be created or overcome in a few hours or days. A more complete psychology of the Holocaust can help clarify that more complex reality.

In the immediate aftermath of the Holocaust many sought to locate the source of genocidal evil in a deviant sub-group (a “them”) that could be neatly separated from the rest of us.
After Hannah Arendt’s (1963) interpretation of Eichmann and the Milgram experiments, though, it seemed clear that absolutely anyone could be a genocidal perpetrator under the right situational pressures. These two species of explanation can be thought of as representing the opposite ends of a continuum: at one end, very few humans have the potential for genocidal violence; at the other end, all of us have that potential. Neither of these extreme positions is tenable any longer. Research has consistently shown that perpetrators were not, and are not, afflicted with mental disorders, nor do they constitute a readily recognizable sub-group identified by particular personality traits, so the “them” end of the continuum cannot explain very much.

On the “us” end, Milgram’s obedience studies had great dramatic and theatrical appeal, and have thus secured a place not only in academic discussions of the psychology of the Holocaust but in the popular imagination as well. The Milgram studies and Hannah Arendt’s term “banality of evil” seemingly democratized and universalized the potential for genocidal evil. After Milgram and Arendt, perpetrators could no longer be comfortably thought of as “them”: instead perpetrators could as easily be “us.” Two developments have eroded the power of that simple idea: our increasingly sophisticated understanding of the history of the Holocaust, and our increasingly sophisticated understanding of the Milgram experiments. The former development has exposed the willing and even enthusiastic participation of many more institutions and individuals in German society and the occupied territories in the Holocaust than previously thought. There is now a much wider range of behavior to be explained, and obedience of the reluctant sort demonstrated at Yale can now be seen to have played at most a small role in the Holocaust. The latter development requires us to modify our assessment of the Milgram studies themselves.
New questions have been raised about the conduct of the studies and the rigor of the experimental procedures. Other papers in this special issue have addressed these concerns

directly. Just as our growing historical knowledge challenges the external validity of the Milgram studies as having explanatory power for the Holocaust, recent scholarship about the studies themselves also raises questions about their internal validity.

James Waller, Roy Baumeister, and especially Ervin Staub have, to varying degrees, expanded psychological thinking about genocidal behavior beyond the relatively narrow situationist interpretation of Milgram. Much more remains to be done, however. As we learn more about the varieties of response to and participation in the perpetration of atrocities during the Holocaust, openings for psychological explanation outside the realm of obedience and social influence begin to appear. Were young people more susceptible to the Nazi message than older people? If so, why? What psychological mechanisms produced this state of affairs? Were there gender differences in support for Nazism? Did self-interest play a role in the attempt to exterminate Jews?

The idea that any of us could be transformed into genocidaires in a few hours in a social psychology laboratory is wrong. But it is the case that growing up a certain way, in a particular culture, steeped in destructive ideologies can produce people who will commit terrible acts of destruction. Understanding that more complicated, long-term process is the task ahead for psychologists interested in explaining the Holocaust, and it will necessarily draw on psychological knowledge quite broadly.

Acknowledgements The views expressed in this article are those of the author and do not necessarily reflect the official policy or position of the United States Air Force Academy, the Air Force, the Department of Defense, or the U.S. Government.

Funding This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

References

Arendt, H. (1963). Eichmann in Jerusalem. London, UK: Faber and Faber.
Bartov, O. (2001). The Eastern front, 1941–1945: German troops and the barbarisation of warfare. New York, NY: Palgrave Macmillan.
Baumeister, R. F. (1999). Evil: Inside human violence and cruelty. New York, NY: Holt.
Baumeister, R., Masicampo, E. J., & Vohs, K. (2011). Do conscious thoughts cause behavior? Annual Review of Psychology, 62, 331–361.
Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram’s “Behavioral Study of Obedience”. American Psychologist, 19, 421–423.
Blass, T. (2002). Perpetrator behavior as destructive obedience: An evaluation of Stanley Milgram’s perspective, the most influential social-psychological account of the Holocaust. In L. Newman & R. Erber (Eds.), Understanding genocide: The social psychology of the Holocaust (pp. 91–109). New York, NY: Oxford University Press.
Browning, C. (1992). Ordinary men: Reserve Police Battalion 101 and the final solution in Poland. New York, NY: HarperCollins.
Burger, J. (2009). Replicating Milgram: Would people still obey today? American Psychologist, 64(1), 1–11.
Cesarani, D. (2004). Becoming Eichmann. Cambridge, MA: Da Capo Press.


Fiske, S., & Taylor, S. (1991). Social cognition (2nd ed.). New York, NY: McGraw-Hill.
Gilbert, G. M. (1947). Nuremberg diary. New York, NY: Farrar, Straus and Co.
Gilbert, G. M. (1950). The psychology of dictatorship. New York, NY: The Ronald Press Company.
Goldhagen, D. (1996). Hitler’s willing executioners: Ordinary Germans and the Holocaust. New York, NY: Vintage Books.
Graveline, C., & Clemens, M. (2010). The secrets of Abu Ghraib revealed: American soldiers on trial. Potomac, MD: Potomac Books.
Haney, C., Banks, C., & Zimbardo, P. G. (1973). Interpersonal dynamics in a simulated prison. International Journal of Criminology and Penology, 1, 69–97.
Harff, B. (2003). No lessons learned from the Holocaust? Assessing risks of genocide and political mass murder since 1955. American Political Science Review, 97(1), 57–73.
Hilberg, R. (2003). The destruction of the European Jews (3rd ed.). New Haven, CT: Yale University Press. (Original work published 1961)
James, W. (1992). The dilemma of determinism. In G. E. Myers (Ed.), William James: Writings 1878–1899 (pp. 566–594). New York, NY: The Library of America. (Original work published 1886)
Lipstadt, D. (2011). The Eichmann trial. New York, NY: Schocken Books.
Mastroianni, G. R. (2013, Summer). Looking back: Understanding Abu Ghraib. Parameters, 43(2), 53–65.
Mestrovic, S. G. (2007). The trials of Abu Ghraib: An expert witness account of shame and honor. Boulder, CO: Paradigm.
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67, 371–378.
Milgram, S. (1964). Issues in the study of obedience: A reply to Baumrind. American Psychologist, 19, 848–852.
Milgram, S. (1974). Obedience to authority: An experimental view. New York, NY: Harper and Row.
Milgram, S. (1979, March 31). Quotes. In Wikiquote. Retrieved from http://en.wikiquote.org/wiki/Stanley_Milgram
Miller, A., Buddie, A., & Kretschmar, J. (2002). Explaining the Holocaust: Does social psychology exonerate the perpetrators? In L. Newman & R. Erber (Eds.), Understanding genocide: The social psychology of the Holocaust (pp. 301–324). New York, NY: Oxford University Press.
Newman, L., & Erber, R. (2002). Understanding genocide: The social psychology of the Holocaust. New York, NY: Oxford University Press.
Perry, G. (2013). Behind the shock machine. New York, NY: The New Press.
South Pacific (musical). (n.d.). In Wikipedia. Retrieved from https://en.wikipedia.org/wiki/South_Pacific_(musical)
Staub, E. (2011). Overcoming evil. New York, NY: Oxford University Press.
Waller, J. E. (2002). Becoming evil: How ordinary people commit genocide and mass killing. New York, NY: Oxford University Press.
Zimbardo, P. G. (2007). The Lucifer effect: Understanding how good people turn evil. New York, NY: Random House.

Author biography George R. Mastroianni is a Professor of Psychology at the US Air Force Academy in Colorado Springs, Colorado. Prior to joining the faculty at the Air Force Academy, Dr. Mastroianni served as a Research Psychologist in the US Army, where he was engaged in a variety of applied research programs relating to human factors engineering and human performance. He co-edited A Warrior’s Guide to Psychology and Performance, and has a strong interest in the psychology of ethical behavior in military culture and settings.


Article

Acting otherwise: Resistance, agency, and subjectivities in Milgram’s studies of obedience

Theory & Psychology
2015, Vol. 25(5) 670–689
© The Author(s) 2015
Reprints and permissions: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/0959354315608705
tap.sagepub.com

Ethan Hoffman Clark University

N. Reed Myerberg University of Cambridge

Jill G. Morawski Wesleyan University

Abstract

In this account of the Obedience to Authority experiments, we offer a richer and more dynamic depiction of the subjects’ acts and reactions. To paraphrase Milgram, our account tries to examine the central elements of the situation as perceived by its research subjects. We describe a model of the experimenter–subject system that moves beyond experimentalism and humanism, positing instead a model that considers experimenter–subject relations and extends both spatially and temporally past the experiment’s traditionally assumed limits: the walls of the laboratory and its canonical methods. Following Butler and Krause, we propose an approach that attends to quotidian, subtle, and unregistered ways of acting otherwise. Taking the Yale archive’s collection of Milgram’s subject files, audio recordings, and notes as historical traces of the experimenter–subject system, our analysis introduces a grounded understanding of how Milgram’s cut between obedience and disobedience renders invisible all but the most explicit manifestations of resistance or ways of acting otherwise. Investigating Milgram’s work through an experimenter–subject systems model illuminates previously undocumented affective and temporal dimensions of laboratory life and serves as a template for assessing other experimental situations.

Keywords

Stanley Milgram, non-sovereign agency, obedience to authority, resistance, social psychology of experimentation

Corresponding author: Ethan Hoffman, Department of Psychology, Clark University, Worcester, MA 01610, USA. Email: [email protected]


Renowned as a brilliant experimenter, Stanley Milgram captured the world’s attention with an apparent miniaturized demonstration of the conditions under which individuals yield to authority and behave obediently. With the exception of his co-investigators and scholars who viewed materials at Milgram’s archive, it is not widely known that Milgram was a meticulous recorder of experimental activities who documented and preserved subjects’ nonverbal and verbal responses, post-experimental correspondence, and even his own interactions with colleagues. His first published report shows this attentiveness; departing from the conventional protocol of experimental reportage, it recounts subjects stuttering, trembling, digging their fingernails into their flesh, and biting their lips. Milgram explained these actions as tension and indicated their intensity by describing laughter that “seemed entirely out of place, even bizarre,” behavior that was “untoward and uncontrollable” (Milgram, 1963a, p. 375). Such observations proved to be parenthetical, however, for Milgram attended principally to responses of obedience and failure to obey. Most reports sideline not only the bodily and paralingual evidence of “exceptional distress” (Milgram, 1963a, p. 371) but also subjects’ post-experimental accounts, which Milgram otherwise appreciated as “insightful” (1964a, p. 850). The public story, then, fixes on two behaviors—obey and disobey—but provides no template for interpreting the myriad activities that transpired in the laboratory. What, then, can be said of Subject 0113’s post-experimental account? The subject stated:

I must confess that I suspected from near the beginning that something was amiss. Being in an electro-mechanical field I suspected that the voltage was not going up as was shown on the control board, but as I sat there at the board I figured out that if anything was being raised it was only the amperage … As I sat there at the board I could remember getting calmer and calmer with the realization growing in my mind that I was not giving the person on the other side of the wall the shocks shown on the board. So that by the time the experiment was over I was comparatively calm, until the other man returned to the room then I felt compassion for him and I wished to get out of there as fast as possible. I hope these additional comments will be of some value to you. (Reaction of subjects, 1962a)

Milgram generally interpreted such reports of doubt or disbelief as self-serving defense reactions, arguing that “cognitive processes may seem to rationalize behavior that the subject felt compelled to carry out” (1974, p. 173). However, Subject 0113’s reporting cannot be dismissed by the psychoanalytic explanations of defense reaction or “denial” or “tension”: his detective work, cognitive logics, self-reflections, and emotional shifts, along with continued respect for the experimenter’s project, indicate more than mere reactivity. Nevertheless, in the experiment’s accounting system, Subject 0113 was counted as an obedient subject, one among the majority who in the end complied with authority. Readers of Milgram’s publications see a scientific bookkeeping that has no columns for registering oppositional gestures, conniving, misinterpretations, feigned ignorance, lamentation, or anything else for that matter. They receive no account of subjects acting otherwise—acting other than dichotomously. This public balance sheet of the experiments guides a parable that has been sustained for over half a century, enduring experimental replications, reassessments, and historical examinations. When considered in context, the obedience studies illustrate postwar

social scientific aspirations to guide civic reform (McCarthy, 2004) and anxieties about the state of American masculinity (Nicholson, 2011a). Yet the experiments’ aspirations are not only a historical matter, for they continue to be showcased in psychology textbooks (Stam, Lubek, & Radtke, 1998) and generalized to contemporary atrocities of military aggression and everyday unethical acts in the financial world (Nicholson, 2011b). Although these generalizations tend toward pessimism, others invoke optimism. The experiments have provided an exemplary case for a philosophy of “situationism,” proposing that while human agents are constrained by the power of the situation in which they act, they can “counteract harmful situational effects” given certain knowledge (Mele & Shepherd, 2013, p. 1). Both the dominant moral lesson drawn from Milgram’s experiments and the generative one offered by situationists preserve the binary that is obedience and disobedience. In Milgram’s and in situationist models of human action, experimenters’ accounts of subjects and actions take precedence: they alone determine whether or not subjects’ reports are accurate or meaningful. While Milgram’s methodological precision is lauded, he is attentively inattentive to even his own behavioral category of disobedience and to instances of subjects’ resistance, turning away, refusal, sabotage, and defiance. Yet, not all subjects obeyed in a manner that Milgram defined as obedient, and even some of the decisively obedient subjects did more than, or other than, obey. Most appraisers continue to overlook the ways subjects acted otherwise; however, some have identified subjects’ active involvement, finding subjects frequently were aware of the confected situation and mindful of their participation in it (Nicholson, 2011b; Perry, 2013; Ross & Nisbett, 2011).
Ross and Nisbett encouraged a “subject’s-eye view” of the experimental events to see how subjects often did “confront the experimenter and refuse to continue, often quite forcefully, just not effectively” (2011, p. 57). Once subjects’ divergent acts are recognized, we find the experiments “may have less to say about ‘destructive obedience’ than about ineffective and indecisive disobedience” (p. 57). Our investigation aims to cast light on these acts and other darkened spaces using Milgram’s experimental observations and surviving data archived in the Stanley Milgram Papers (SMP) at Yale University. We are intrigued not by “obedience to authority” as much as by “obedience to experimental method”: the material and discursive practices which constitute a frame for experiments, enabling as well as constraining the interaction of experimenters, subjects, and confederates. More specifically, we aim to delineate aspects of experimental experiences that have gone largely unreported yet are lively, productive, and often essential to empirical outcomes. This analysis invites a rethinking of Milgram’s belief that subjects’ acting otherwise (and the subterfuges they attempt) are largely self-serving and token efforts to appear as “benign” men (1974, pp. 159–160). Attending to these experiences also corroborates other researchers’ detection of incongruities and illogical conditions within the experiment (Nicholson, 2011b; Patten, 1977). Seeing the ways that the events “did not make sense or add up from the perspective of the subject” (Ross & Nisbett, 2011, p. 57), in turn, illuminates the ways that subjects undertook their own problem-solving logics of doubt, intervention, curiosity, and mimicry. Acknowledging that experiments involve more than what is represented in official scientific reports, however, calls for a conceptual framework and nomenclature that better

represent the “experimenter–subject system” constituting human psychology experiments. Our archival analysis employs a framework that views the dynamic relations between institutional norms of the experiment and quotidian performances of its actors as a mutually dependent feedback loop: performances are regulated but not fully determined by institutional norms or actors’ background assumptions and knowledge of psychology writ large. In practice, these norms and background assumptions are manifested and subject to mutation (Richards, 2002). This feedback loop constitutes the experiment, yet it also harbors the possibility for subjects’ resistance or non-compliance—the mutation of norms and the indeterminacy of performance open a gap where subjects can act otherwise. Our syncretic approach permits the detection of ways of acting otherwise yet does so without discounting the significant ways that power and history shape the performances of experimental subjects and how methodological norms constrain behavioral possibilities. We begin with a brief review of the extant models of the experimenter–subject system, and introduce our alternative model that attends at once to the bottom-up impact of minute, local performances and the top-down influence of institutional norms and broader patterns of power. Our framework is informed by but departs from re-analyses and historical studies of Milgram’s experimental practices (Nicholson, 2011b; Perry, 2013; Rochat, Maggioni, & Modigliani, 2000; Stam et al., 1998). It also differs by foregrounding how power is dispersed and produced through institutional and interactional arrangements. We then proceed to examine the epistemic conditions under which Milgram made his empirical distinctions or “cuts” between obedience and disobedience.
Identifying these epistemic cuts reveals not only what experimental conditions and evidence have been occluded but also how Milgram did not always abide by fixed binaries of subject and experimenter or obedience and disobedience. Additionally, the broader model of the experimenter–subject system is used to appraise Milgram’s notions of agency, including his apparent ambivalence toward subjects’ verbal and nonverbal actions. His concepts of agency and its diminution, the “agentic state,” are replaced by a perspective that has close affinities with situationism yet recognizes the limits to agency and the dispersed dynamics of power. Following a review of the epistemic conditions of the obedience studies, the paper examines archival evidence of subjects’ actions in order to identify the quotidian ways that some of Milgram’s subjects performed, felt, and thought otherwise. The conclusion proposes that an understanding of the obedience studies that is more attuned to subjects’ capacity for resistance will guide both better experimental design and the refinement of situationism.

Regarding the actors in experiments

Extant views: Control and quantification, humanism, and Foucault

Even as presentist histories of the progressive, inevitable march of science have fallen by the wayside, there still remain significant lacunae in the history of the human sciences; notably, the labor of experimental subjects remains largely unexamined. Conventional, experimentalist models in psychology eschew the particularity of subjectivity in favor of a quantitative notion of subjects that understands them as little more than substrata for

stimulus responses. This experimentalist position ontologizes the research subject as subsisting in aggregate data, a reduction that satisfies the parallel epistemic norms of the controlled laboratory environment and the quantitative ideal (Law, 2004; Star, 1983). Rather than attend to such particularity, this scientific rationality demands that the measures of subjects’ behavior and cognitions be stripped of any beliefs, values, affects, and attitudes that were not intentionally elicited by the experimental stimulus. Furthermore, this rationality requires reducing these bare, elicited responses into anonymous and quantitative units. Lost in the space between these two norms of control and quantification are the lively experiences of psychology’s primary data producers: the subjects. The obedience experiments sit in uneasy tension between the experimentalist model, on the one hand, and a humanist vision on the other. This tension is evident in Milgram’s treatment of disobedience. Although Milgram dedicated only 12 pages of his 1974 book to a discussion of outright disobedience, he pronounced it “the measure that we sought” and affirmation of “humanistic values” (p. 164), including the “mobilization of inner resources” and “transforming them into action” (p. 163). Although highlighting these “inner resources,” Milgram’s model of human action cannot be said to be anything other than positivist and experimentalist. Exemplifying this positivist outlook, in a lab note, he wrote that the affection a “young man” feels for a “young lady,” if unarticulated in an observable way, does not constitute “a fully social event” (Milgram, 1961b). Similarly, in the case of the obedience experiments, Milgram claimed, “The subject’s behavior has a binary quality to it. He may press the lever, or he may abstain” (Milgram, 1962b).
Milgram conceptualized disobedient subjects as drawing on some autonomous, particular “inner resource” of their dissent while, nevertheless, insisting that the experiment’s validity depends on the homogeneity of stimulus-response reactions across the entire sample. Humanistic critiques of mainstream research counter this experimentalist model, yet importantly remain within its contours by identifying agency with intentionality and “even with a kind of personal sovereignty understood as self-determination or control” (Krause, 2012, p. 1). The experimental subject thus conceived operates autonomously and independently of the exigencies of the laboratory. Baumrind’s (1964) rejoinder to the obedience experiments exemplifies the humanist model. In her call for ethical safeguards and in others’ humanist critiques, Enlightenment notions of the autonomous individual come to the fore: these accounts demand preservation of the research subjects’ intentional choice and free will. However, both the humanist and positivist models of experimenter–subject relations foreclose on seeing how power constrains subjects’ behavior. Emerging out of Foucauldian thought, some contemporary researchers have adopted a sharply contrasting view that understands the experimental subject to be the effect of a system of disciplinary arrangements. Still, this third, loosely termed Foucauldian perspective neglects relational dynamics and prematurely delineates subjects’ capacities for action (which are presumed to be merely the effect of power). Thus, such Foucauldian models undervalue the idiosyncratic and minute detail of lived experience reported by subjects. To construct a representation of the specific experimental world in which the obedience experiments transpired, a representation that neither reduces the subject to an “effect” of power nor naively imagines an autonomous subject, another perspective is needed.


This model must attend to both the power-suffused character of the laboratory and the manners in which this power is realized through imperfect, often improvised performances of experimental roles. Instances of acting otherwise emerge through feedback loops connecting the laboratory’s institutional norms and the actual performative dynamics that constantly, if subtly, allow for mutation of these norms. These mutations reside in the performative interval (Butler, 1997), the gap between the injunction to perform in some expected way and the actual performance that is induced. Because solicited performances are never perfect and always subject to in situ mutation and reinvention, this gap is never fully closed—there always remains a space between, a space where a subject might fail to perform as instructed, where a subject might act otherwise.

On agency

The three models of the subject—positivist, humanist, and Foucauldian—described above truncate, albeit in different ways, a full appreciation of the experiment and the experimenter–subject system. Although marking various forms of power and capacities of the subject, they foreclose on mutations and performative possibilities, thus overlooking evidence of subjects’ actions. An empirically more robust model requires a reappraisal of agency, a reconsideration that is found in Krause’s (2011, 2012) articulation of non-sovereign agency. By “non-sovereign,” Krause means to acknowledge the materially and socially distributed capacities for action shared by each and every individual (in the case of the obedience experiments, the experimenters, confederates, and subjects). Extending her account to the laboratory world, we reject the assumption that the small, shared performances that accrete into a durable experimental situation are exclusively the autonomous, conscious choices of a given actor, independent from all relational externalities. Instead, we see interactions between experimenter and subject as “a function of the communicative exchanges, background meanings, personal intentions, social interpretations, self-understandings, and even bodily encounters” (Krause, 2012, p. 5) that constitute their life in the laboratory. Thus, while these various externalities are certainly negotiated in an intersubjective context delineated partially by the norms of experimental research, they are not stable and not exhaustive. Inasmuch as the context for an agentic expression can never be determinative or total, it is always open to mutation and improvisation. Yet, if it is the case that, as Krause holds, “agency is non-sovereign in part because our efficacy does not always reflect our will, or conscious control” (2012, p.
4), agency is even less sovereign because there cannot exist a private language through which agentic performances might be executed by the solitary subject. Nevertheless, being registered by the experimenter as efficacious is neither a necessary nor sufficient condition for agentic expression. The various performances that fail to be registered as efficacious are still meaningful loci for analysis, if one is able to identify them ex post facto in subjects’ reports or other archival material.

Norms and cuts: Experimenter–subject systems and experimental design

In the Milgram experiments, it is apparent that the experimenter had full power over drawing the binary of what was agentic and what was not. More generally, at the outset,

the experimenter is faced with a breadth of methodological, ontological, and epistemological questions about the nature of the subject, the subject’s agency, and the boundaries of the experiment. From these presuppositional questions follow normative decisions that constitute the logic of the experiment. There then remains the task of actually enacting these choices—a task that is necessarily executed imperfectly. Indeed, it is precisely the imperfect way that these decisions are put into practice (the mutation between the injunction to perform and the actual performance) that opens up the performative interval: a gap in understanding that the experimenter seeks to close, a gap that may indeed appear to be closed, but a gap that nevertheless may never be perfectly filled in. These normative decisions take the form of what we—following Devereux (1968), Barad (2007), and others—will call cuts. As these researchers observed, there is nothing that necessitates a determinate Cartesian separation between the observer and the object observed. Rather, material and discursive processes are crucial in enacting these separations or cuts that establish the object observed as distinct from the apparatus that is doing the observing (Barad, 2007, p. 114). Enacted partitions between various, seemingly pre-given objects and observers are of profound importance in understanding the logic of Milgram’s laboratory. These separations divide the laboratory from the outside world, subject from experimenter, and obedience from disobedience in a way that is as epistemically exclusionary as it is ontologically necessary. Given the necessity of some system of cuts, the task is to posit a different conception of the experimental situation that opens the door to other interpretations while also appreciating that they are as partial as any other set of partitions.
Although the experimenter–subject system proposed here also involves a series of cuts, they differ from those insisted upon in Milgram’s design. Our system attends to the significance and entangled complexity of relations between experimenter and subjects; a wider range of ecological, temporal, and agentic conditions within the experimental system; and the non-sovereign and often unacknowledged agency of all actors. Although Milgram’s cuts, like any system of enacted distinctions, were assumed in advance, our attention is drawn to how the mutual interactions of subjects and experimenters push back and forth on the different sides of the cut. These in situ interactions reconfigure the institutional norms governing the experiment and establish the conditions of possibility for not-yet-given performances. It is the very unsettledness of cuts that gives room for the performative interval or gap in which actors may resist and act otherwise. Realizing a fuller understanding of the relational dynamics of the laboratory world requires the premise that experimental subjects have not simply been disciplined in a certain manner, but that they also enter into a genuine relation with experimenters who are likewise bound up as co-participants in a complex situation. The normative status of subjects as participants is a potential locus of contestation that is never completely settled. Comprehending subjects’ experiences requires decentering the designed experimental system, even as it evolves in situ, in favor of an expansive and permissive model of agentic entanglements, dynamic and contested cuts, and ever-evolving performances. This does not involve dismissing Milgram’s cuts, but rather taking into consideration how performances by subjects and experimenters alike mutate extant cuts and enact new ones.


Some ways of acting otherwise

Attentive inattention

Our excavation of Milgram’s experimenter–subject system starts from the premise that many subjects exceeded the neat bifurcation suggested by Milgram’s epistemic cuts. Although relegated into two neatly defined camps—those who obeyed and those who did not—subjects in the obedience experiments felt, thought, and acted in ways that indicate a multiplicity of experiences, a complexity that was sidelined in Milgram’s published research (Milgram, 1963a, 1965, 1974). This is not to say that Milgram was inattentive to the multiplicity of subjects’ interactions in—and with—the obedience experiments, or more generally to what Law calls messy processes that “necessarily exceed our capacity to know them” (2004, p. 6) even through scientific scrutiny. And while Milgram did explain away or ultimately ignore subjects’ reports of suffering caused by the experiment (Nicholson, 2011b), he nevertheless sought out their testimony and puzzled over the diversity of subjects’ experiences in the pages of his laboratory logbooks. For instance, he wrote, “There are several explanatory schemes which cover the same findings; none covers all the findings; and then there are certain phenomena for which there seem to be no truly adequate theoretical explanations” (Milgram, 1962c). The search for a common explanatory scheme that accounted for the multiplicity of subjects’ experiences was an ongoing project for Milgram, as evidenced by the variety of theoretical approaches he essayed in his laboratory logbooks. These entries alone show his attempts to understand subjects’ performance through such diverse theoretical lenses as a gestalt model (Milgram, 1961c), learning theory (Milgram, 1961a), a focus on the triad of relationships between laboratory actors (Milgram, 1962a), and an attunement to the subtle spatial effects of the laboratory environment (Milgram, 1964b).
The thirst for a theoretical grounding for the obedience experiments is evident in more than only Milgram’s logbook. His theoretical wanderings also came explicitly to the fore in his conversations with Dr. Paul Errera, a psychiatrist brought in by Yale to interview former subjects about possible harm resulting from the experiments. These interviews occurred in 11 sessions between February and May 1963; all but one were held in group settings. Transcripts of the sessions show Milgram repeatedly probing Errera after the interviews to ascertain whether Errera had gleaned any novel interpretation of the experiment (quoted in Errera, 1963a, 1963b). However, Errera (like Milgram) was impressed by the multiplicity of experiences suggested by subjects’ comments. He explained to Milgram that across and even within conditions, one “can’t necessarily attribute [breaking off] to a common motive that is being activated in a person” (Errera, 1963a, p. 47). Milgram and Errera repeatedly discussed the difficulty of identifying a “common feature” that could explain subjects’ obedience or disobedience (cited in Errera, 1963b, p. 33). Our archival analysis, presented below, centers on these and other self-report data that Milgram collected from subjects, particularly on the questionnaire subjects completed upon receipt of the complete study report, a year after the experiment. The questionnaire asked subjects to complete a brief multiple-choice survey about their experience in the study, along with a prompt for open-ended comments. By and large, subjects complied with this request and returned the questionnaires (86% of subjects in condition 2). Our

archival analysis centers on this free-response data taken from files of those subjects in condition 2, the baseline condition (SMP, Box 118); the fragments of subject responses from all conditions that Milgram’s team coded and typed onto index cards (SMP, Box 44); transcripts of the Errera interviews (SMP, Box 155); audio recordings of condition 20 (the female condition) (SMP, T53, T54, T55); and Milgram’s logbooks and notes (SMP, Boxes 45 & 46). The extensiveness of the data files is itself evidence of Milgram’s claim that he was attentive to subjects’ experiences. In a note from December 1963 he commented, “I have always taken the subjects’ point of view as my starting point, putting myself in the subject’s place, and trying to figure out what the critical features of the situation can be as the subject sees them” (Milgram, 1963b). However, in the published record of the obedience experiments, Milgram’s qualitative data generally stood as something to be explained away rather than constituting a “starting point.” For example, subjects’ exhibition of affectual excess (laughing during the course of the experiment) was disregarded as the mere physical conversion of anxiety, a discounting that reinforced claims about subjects’ eventual obedience and displaced any strain or latent will to disobey (Milgram, 1974, pp. 152, 161). Similarly, doubts that some subjects expressed about the veracity of Milgram’s deception were dismissed as simply a “defense function” and a “post facto explanation” (1974, pp. 172, 174). He wrote that obedient subjects’ doubt should not merit the experimenter’s trust but instead should be viewed as a face-saving denial of experimental obedience, like laughter, a process that “eases the strain of obeying the experimenter, eliminating the conflict between hurting someone and obeying” (1974, p. 158).
Yet subjects understood their laughter to be caused by reasons other than the resolution of strain. Subject 1601, for instance, found himself laughing because “it seemed ridiculous that punishing the learner helped in any way to improve his memory” (Reaction of subjects, 1962a). So too, archival analysis indicates that the many reasons that subjects had for doubting Milgram’s deception illustrate the multiplicity and complexity of the laboratory world.

Thinking and believing otherwise

The archival evidence seems to suggest that Milgram’s investigation into subjects’ self-reported experience was more than just a matter of collecting data—individual index cards were made containing a significant excerpt of each subject’s open-response and these cards were in turn sorted into content themes for later analysis. However, a search in the archives yielded only two analyses of these data: the first a brief analysis by Taketo Murata, a research assistant, who investigated subjects’ reported belief about whether or not the learner was actually receiving a painful shock (Murata, 1963); and the second by another assistant, James Miller (Miller, n.d.), who examined a small minority of the free-responses. Despite the incompleteness of Milgram’s qualitative project, the index cards and debriefing questionnaires reveal a vast assortment of subject experiences that exceeded experimental expectations in unique, creative, and at times subtle ways, evading any singular theoretical explanation. Even if one does not grant credibility to subjects’ post-experimental testimony—a testimony that explicitly attests to both believing and acting otherwise—the excess of research participants’ subjectivity, as Derksen


(2001) has noted, is both evidenced (and, moreover, negatively defined) by the experimenter’s ceaselessly precise attempts to constrain and transform the subject’s experience into something that is both standardized and quantifiable. As suggested above, one form of excess subjectivity that caught the attention of Milgram and his team from a very early date was the possibility that subjects might be suspicious of the experiment’s deceptive “memory and learning” cover story. Murata’s investigation of condition 2 shows that 64% of the subjects who returned the questionnaires (n = 36) did not fully believe that the learner was being shocked (Murata, 1963). We analyzed these same debriefing questionnaire responses, coding any response other than “I fully believed that the learner was getting painful shocks” as general suspicion, and coding any specific reason for that suspicion mentioned in that subject’s free-response as concrete doubt. Similar to Murata, we found that of the subjects in condition 2 who returned follow-up questionnaires (n = 35), 60% expressed general suspicion about the experiment, while 26% gave a specific reason for doubting the experiment’s cover story. Particularly noteworthy was the observation that the percentage of generally doubtful subjects did not differ significantly across obedient subjects (59%) and disobedient subjects (62%) in the baseline condition. While this finding is consistent with Milgram’s claim that whether subjects believed the protocol did not affect the overall rates of compliance in any “substantial manner” (Milgram, 1974, p. 173), a re-analysis of the data which excludes those subjects who reported being uncertain indicates that “believers” were more likely to be defiant (Brannigan, 2013). It is strikingly evident that from the beginning of his experimentation, Milgram was concerned that subjects might not buy into the deception (Russell, 2010). Subjects themselves wondered about this matter.
Subject 1605 wrote, “The fact that you have 60% that completed the shocks to the end only shows that most of these people were certain that no shocks were given” (Reaction of subjects, 1962a). Subject suspicion presented a problem not only because it jeopardized face validity, but also because such suspicion implicitly conveyed the possibility that subjects also were being deceptive, in this case, towards the experimenter. Although Milgram wrote that “tension tells us that the situation is real for the subject” (Milgram, 1962d), it is clear that some of the subjects who did not believe were nevertheless obedient. Consequently, how are we to tell an obedient subject who believes in the cover story from one who does not? What if there is tension, in spite of disbelief?

With regard to distinguishing the obedient believers from the obedient non-believers, one subject, Subject 0113 (quoted at the outset of this article), presents an instructive case. He, like many subjects, exhibited a liveliness that ran in excess of what the experimental system was able to capture as intelligible data. Yet what we see in his data file—that is, how he was observed by Milgram and his colleagues—is that he seemed to be a subject like any other: an exemplar, really, of obedience to authority. The data show that he carried through the shocks to the 450-volt maximum, revealing how long he took to administer each shock, and the duration of each shock. They document those aspects of his background that meet the experiments’ inclusion criteria (that he is a male between 20 and 50 years old from the New Haven area). Even brief examination of Subject 0113’s post-experimental responses reveals considerably more than what is registered in the cut between obedience and disobedience. Informed by his background in “an electro-mechanical field,” Subject 0113 could not believe the veracity of the learner’s pain; yet, he wrote in his free-response, “By the time the experiment was over I was comparatively calm, until the other man returned to the room then I felt compassion for him and I wished to get out of there as fast as possible” (Reaction of subjects, 1962a). Beyond Subject 0113’s doubt, what is missed in the data as rendered in the published obedience experiments is the reality (his attested knowledge of electrical engineering) that subjects bring their own experiences into the lab. As he grew “calmer and calmer” while delivering shocks, his file demonstrates that subjects are anything but fungible substrates for universal responses to stimuli or situations.

Even the seeming inconsistency between belief, performance, and affect is not the totality of the laboratory’s messiness; one might consider as well Subject 0113’s reaction (quoted previously) after having “shocked” the learner all the way to the end. He reported that he was still very upset when he saw the learner afterward and wanted to leave immediately—in spite of his knowledge that the machine could not be doing what the experimenter claimed it did. Subject 0113’s discomfort is illustrative of how laboratory life is relational in nature, influenced by the subtleties of experimental interactions, material conditions, and by the larger power structures in which the laboratory is nested (Latour & Woolgar, 1985). Although subjects bring their own experiences to the experimental situation, these prior experiences are not deterministic of the totality of their engagements with the laboratory. Subject 0113’s affective response to seeing the learner’s face, even after correctly concluding that the experiment was not what it seemed, shows the multiplicity of subjects’ experiences.
Subject 0113 demonstrates the indeterminate, sometimes strange series of partial, incomplete causations that co-function in subjects’ performances.

Much like Subject 0113’s background understanding of electrical machinery, subjects’ grasp of institutional norms (specifically notions of what ethical limits Yale University would place on experimentation) also factored into subjects’ self-reported reasons for responding to the experimental situation as they did. Subject 0202, like many others (Subjects 0223, 0231, 0408, 0507, 0508, 0701, 0825, 0841, 0902, 1012, 1411, 1801), wrote assuredly that Yale or Milgram “would not allow anyone to suffer as much as [the] actor pretended” (Reaction of subjects, 1962a). Just as subjects enter the laboratory with certain assumptions about what it means to be a subject and what kinds of obligations that position entails, they also carry with them an image of psychological experimentation writ large, including ideas about what research can and cannot be carried out ethically. And although Milgram attempted to control for this effect of “institutional context” with his Bridgeport conditions, as Milgram himself noted, the overall structure of laboratory relations remained effectively unchanged across the 24 experimental conditions (Milgram, 1974, p. 70).

While doubt and suspicion pervade subjects’ post-experimental accounts, they are by no means the only kinds of subjectivity that exceeded the cut between obedience and disobedience. Some subjects found themselves questioning the reality of the situation as a result not only of background knowledge (e.g., about Yale, or about electrical engineering, in the case of Subjects 0113, 0432, 1003, and 1331) but also due to in situ observations. For example, Subject 0929 watched the experimenter handing the learner a dog-eared check, leading him to reason that the check was being reused and that the learner was not really so naïve (Reaction of subjects, 1962a). Similarly, Subject 1615 observed a dog-eared word-list for the learning and memory task, and Subject 0237 became suspicious when he, but not the learner, was asked to sign a release waiver. Both Subjects 0517 and 1809 noted the one-way mirror and supposed it indicated that they were being observed. Likewise, Subject 1810 surmised that the learner’s screams were not coming from behind the door; instead, he was “quite sure” that the “‘grunts & screams’ were electrically reproduced from a speaker mounted in students room.” Still another, Subject 0209, was left feeling confused and doubtful because the “learner seemed to make an extremely indifferent effort to recall correct associations.” Other times, aberrations in the experimental protocol and the inevitable mutability of performances led subjects out of the naïveté necessitated by Milgram’s design—the failure to give the learner a check, or a third party tinkering with the shock generator, being but a few examples of what triggered Subject 0208’s suspicions and problem-solving behavior (Reaction of subjects, 1962b).

Across the board, subjects’ inevitable performances as either “obedient” or “disobedient” were achieved through a complex mess of discourse, action, and gesture. Subjects are categorized dichotomously, despite their disparate understandings of the experiment proper and psychology writ large, and their embodiment of varied, idiosyncratic affect. For some subjects, the central question of the experiment was Should I keep shocking?; for others it was What is going on here?, Is the learner really being shocked?, Would Yale do such a thing?, Why is the learner performing badly?, or even How can I get the learner in the other room to better perform?
Subject 1703 stated the salience of such questions explicitly, writing that the conflict between himself and the learner was overshadowed by “another conflict … that of ‘My role in this experiment and How I Should React’” (Reaction of subjects, 1962a). Statements like this indicate how experimental performance takes on the character of detective work and problem-solving, while other accounts, like Subject 1914’s, illustrate just how important solving that problem can be for a subject:

Though I accepted the experiment at face value initially, I had doubts after the learner began to complain about the shocks, and I began to think critically about the entire experimental situation. I felt fairly sure that I was the only subject, and my own reactions were being studied rather than the “student’s.” Because of this I did continue with the program, almost feeling a gleeful pleasure at having guessed, in some degree, what was actually happening. At the end of the experiment I did indicate my surprise to the experimenter since I felt that my reaction to the experimental situation might well have been different. (Reaction of subjects, 1962a)

As shown by his “gleeful pleasure,” the most salient feature of the experiment for Subject 1914 was its mystery, not its emotional and moral dilemma. He nevertheless remained keenly aware that his laboratory purpose was for the benefit of the experimenter, acknowledging that his problem-solving mindset had produced a different “reaction to the experimental situation” than the experimenter had been looking for. Subject 1914’s attention to both the experimenter’s and his own problem-solving shows how subjects can sometimes push back on the cut between the subject’s presumed vision of obedience as a moral quandary and the experimenter’s understanding of obedience as an intellectual puzzle.


Performing and feeling otherwise

As they entered the lab, many subjects’ curiosity compelled them to take active measures when passive observation alone would not shed full light on the true nature of the experiment. While some subjects probed and questioned the experimenter to ascertain what was happening, others directed their “testing” against the learner. Both Subjects 1419 and 1434 found themselves experimenting on the experiment. In his free-response Subject 1419 revealed, “I cheated once during the experiment. I announced that I was giving a high voltage stimulus, & gave a low voltage to see if the subject would say ‘ouch’ nevertheless” (Reaction of subjects, 1962a). Similarly, in his desire to understand what had just transpired, Subject 1810 wrote that he began to take notes about the experiment once he left the laboratory, scrupulously documenting his experience much like the experimenter had been doing just minutes earlier. Subject 0601, another note-taker, not only mimicked the experimenter’s scrupulous recording of the minute goings-on of the laboratory environment, but went so far as to mail these notes to Milgram as an attachment to the free-response questionnaire. Just as the pervasive and subversive doubt evident in the archive marks an excess subjectivity, a scientific curiosity (of the performative sort exhibited by Subjects 1419, 1434, 1810, and 0601) also constitutes a way of acting otherwise that runs beyond simplification or reduction into the narrow typology afforded by the cut between obedience and disobedience. To be sure, many subjects were “taken into” Milgram’s experimental narrative without resistance; however, the variegated nature of subjects’ background experiences meant that no two subjects would be engaged the same way.
For example, Subject 2004 stated several minutes after the experiment that her experience in a mental hospital several years earlier led her to assume that a person getting shocked would “get used to it after a while” (Subject 2004, n.d.). She vividly remembered being strapped to the ECT table, and recounts with a slowing, quieting voice the sight of other patients on her unit and “how pale they became.” The experiment triggered powerful memories for Subject 0310 as well, who revealed that his eventual disobedience was influenced by a family trauma. He wrote, “I believe my concern was intensified because the actor greatly resembled and seemed about the same age as my step-father, who had succumbed to a heart attack” (Reaction of subjects, 1962c). Both these cases exemplify the idiosyncratic, messy, and oftentimes affectively charged subjectivity that fills in the gap of the performative interval.

Like the doubtful subjects, those subjects who did believe Milgram’s narrative expressed their agency through subversive performances that were never registered in the published accounts of the obedience experiments. Much like the diverse degrees of naïveté and the array of curiosity described above, a colorful patchwork of affective exchanges laid the groundwork for ways of acting otherwise. For example, Subject 1801 tried to assuage the learner’s pain by covertly delivering a shock of shorter duration to the learner (Reaction of subjects, 1962a), a tactic Milgram noticed among several subjects but one he deemed to be ineffective and in the end to serve merely “as a balm to the subject’s conscience” (Milgram, 1974, p. 159). Another subject, frustrated with the learner for getting so many answers wrong, attempted to loudly emphasize the correct word pairs. However, the learner “still didn’t pick that up either and he was getting the shocks and I started hesitating around 140 volts as … I mulled it over quickly in my mind I thought, well, nobody is going to do anything that’s going to hurt this man, I still thought he was getting the shock though” (cited in Errera, 1963a, p. 5). Quite opposite to the subject’s reported attempt to help the learner, another subject revealed to Dr. Errera that he had become annoyed at the learner, complaining that “it seemed like the fellow wasn’t even trying to help himself” (Errera, 1963b). Some subjects were so frustrated with the learner’s seeming incompetence that they offered to exchange roles and let the learner administer the shocks (Subjects 0240, 0502); the learner refused such offers, displaying an inflexibility that hinted that all was not as it appeared. Subjects’ confusion, frustration, fear, sympathy, annoyance, doubt, and curiosity all (like Subjects 0310 and 2004’s traumatic pasts) constitute an excess subjectivity that—while operative and foundationally salient in the heat of the experimental moment—eventually melts away from what is inscribed in Milgram’s published data. The archival materials document more than bare behavior: they reveal laboratory liveliness that is never fully beholden to experimenters’ cuts.

Resistance by any other name?

Milgram sought post-experimental data to corroborate subjects’ beliefs in the experimental conditions and to ascertain whether individual characteristics (age, occupation, military service, and the like) influenced behaviors. Yet he also turned to these data to understand what constitutes obedience, and his conversations with Errera indicate he was at least temporarily stymied by the multiplicity of subjects’ reported beliefs, feelings, and actions. Milgram’s concentrated attention to these diverse responses, both the substantial and subtle, yielded new explanatory accounts. Yet his auxiliary, post-experimental interpretations ultimately dimmed rather than clarified defiance. His interpretations made no allowance for the fact that scientists cannot always negotiate the world as they please because “nature” sometimes “resists” scientists’ aspirations (Galison, 1987), and they foreclosed, too, on the possibility of intentionality (Patten, 1977). Milgram thus neglected to consider how subjects’ agency might be an “emergent property of intersubjective exchanges” (Krause, 2012, p. 8), emerging, in this case, from the impersonal, unequal dynamics of the experimenter–subject system. Put otherwise, one does not need a “Foucauldian conspiracy” account (Ash, 1992) to appreciate the ways experimental designs regulate performances (Spears & Smith, 2001), or notions of sovereign agency (intentionality) to understand individual actions.

To some degree, Milgram was able to eschew this vocabulary because he had made the very relationship between experimenter and subject the center of his obedience studies. He then used the post-experimental interviews and reports to check the reliability of his methods.
Despite an experimental design that homed in on the experimenter–subject relationship, and despite pristine behavioral definitions of obedience and disobedience, Milgram apparently still found it necessary to examine the ways that subjects acted otherwise. He went to lengths to reconcile his avowed commitment to the subjects’ perspectives with dedication to the experimental hypothesis. This reconciliation ultimately involved elaborate reinterpretation of the subjects’ perspectives. Admittedly facing a problem of linguistic representation, he recognized that “there is probably no word in everyday language that covers the experimental situation exactly, without omissions or irrelevant connotations.” He acknowledged that using the words “obey and disobey” to describe the subjects’ diverse actions “was partly for convenience” (1977, p. 122). The dual aim to attend to subjects’ perspectives and remedy lacunae of language was accomplished by applying several strategies for interpreting the ways of acting otherwise.

First, a variety of restive, even dissident actions were captured with the concepts of “tensions” and “strains.” Physical, verbal, gestural, and “nearly defying” actions were interpreted as subjects’ means of inhibiting disobedience. In these visible, observed, and recorded tensions and strains, subjects were claimed to “display a curious dissociation between word and action” (Milgram, 1974, p. 77). He reasoned that despite their efforts, subjects struggle to disengage from the experimental situation. The post-hoc introduction of the additional variables of tension and strain involved a second strategy: incorporating psychoanalytic and unconscious cognitive notions such as “dissociation.” Subjects’ reports of resistant acts and thoughts thus were deemed to indicate non-conscious processes of “self-delusion,” “denial,” and reaction formation (Milgram, 1974, pp. 158–160). A third interpretive strategy involved distinguishing between apparent versus genuine disobedience, a distinction that required introducing a behavioral category of “dissent” to refer “to a subject’s expression of disagreement with the course of action prescribed by the experiment” (1974, pp. 161–162).
Although acknowledging that “dissent” may be a “first step” toward defiance of the experimenter, Milgram all but dismissed dissent by claiming it to be ineffective and, perhaps more importantly, to have the “self-serving end” of giving subjects “psychological consolation” and a publicly desirable image. Through these interpretive strategies, Milgram not only accounted for resistant actions, but also set a high criterion for disobedience. Disobedience, he wrote, “is not an act that comes easily.” He continued, it “implies not merely the refusal to carry out a particular act of the experimenter but a reformulation of the relationship between subject and authority” (1974, p. 162). Disobedience requires “mobilization of inner resources,” carries a considerable “psychic cost,” and produces “a gnawing sense that one has been faithless” (pp. 163–164). For an act to count as disobedience it must not only meet the operational definition but also entail suffering. This high standard for disobedience, one set by moral and personal as well as behavioral criteria, was sustained in subsequent writings: in a 1970 article Milgram distinguished the disobedience of war resisters who were “morally inspired but politically ineffective” (1977, p. 149) from those who undertook truly effective resistance through added personal sacrifices and practical counter-actions. In light of these crucial interpretive moves to represent ways that subjects acted otherwise, it is not surprising to find that his 1974 book discusses disobedience on only 12 of the 205 pages and his 1977 volume of collected essays mentions disobedience on only 32 of the 350 pages, including the reprinting of the 1970 article on war resisters. What is known about resistance (disobedience and related dissenting acts) from the experiments thus depends on what Milgram counted as such.
With extensive interpretive accounting and requisite “simplification” of experimental data (Star, 1983), along with relatively minor analytic attention to the approximately 40% who eventually disobeyed according to the operational definitions, the experiments stand primarily as lessons on obedience. Remembered are the approximately 60% of subjects who obeyed authority. In this regard Milgram’s studies are not exceptional, for the difficulties of seeing how individuals act otherwise, how they resist, refuse, or disregard objectionable yet normative conditions, are evident across the human sciences. Despite the ever-present cultural hopes that individuals will resist injustice and refuse oppression, our human science understanding of resistance is underdeveloped. Definitions are largely rigid and narrow; as Martin observed (and Milgram’s work illustrates), the criteria for “what counts as resistance are held at an unreasonably stringent level” (1987, p. 183). Researchers have tended to study manifest, openly declared forms of resistance, neglecting “low profile, undeclared resistances that constitute the domain of infra-politics” (Scott, 1990, p. 198). The study of resistance confronts not only the paucity of analytic tools but also the many forms resistance can take. Defiant actions and speech transgress normative conduct and, therefore, often are socially complex, ambivalent, or ambiguous (not always transparent in intention or outcome; Ortner, 1995). Challenges to authority are commonly labeled uncivilized, mad, or morally bad (Potter, 2011). Resistance can take forms beyond the politically obvious and can emerge as “barely recognizable, less-than-conscious mobilization of bodily potentials” (Hynes, 2013, p. 573). A conceptual reformulation of resistance also needs to recognize the diverse techniques individuals can use to defy oppressive conditions (Fivush, 2010; Hanna, 2013). Analysis of materials in the Milgram archive, mostly unpublished data, yields a preliminary taxonomy of techniques of resistance. This compilation includes a range of acts of defiance, from modest bodily gestures to grave suspicious and subversive acts.
The ways that Milgram’s subjects acted otherwise are made visible only with an empirically robust model of the experimenter–subject system that replaces the perfect relationship assumed (or desired) in conventional representations and sustained only by “ignoring complexity” and “subjects’ reactions” (Star, 1983, p. 207). The model adopted here undoes those simplification processes and thus challenges the conventional epistemic cut between observer and object of observation, experimenter and subject. A robust experimenter–subject model likewise interrogates the cut between obedience and disobedience, recognizing instead the ambiguities, ambivalences, messiness, and even ineffectiveness of defying normative conditions. The model acknowledges agency not as autonomous and always efficacious but as non-sovereign, socially distributed, and dependent on social uptake (Krause, 2012). Thus our analysis joins with and affirms the larger claims of situationism, yet suggests how situationist aims to enhance individuals’ capacities to oppose injustice and unfreedom need to devote attention to those actions dismissed as either “minute,” “external,” or “irrelevant,” but which actually constitute the potentialities of resistance.

Conclusion

Examining laboratory life through an extended experimenter–subject system combines both a top-down Foucauldian awareness of power’s distributed influence and a bottom-up humanist model attuned to the corporeal, cognitive, and affective dimensions of subjects’ experiences. What emerges from this synthesis is not simply the docile, static personhood painted by experimentalists and Foucauldians alike, but a conception of the subject marked simultaneously by distributed, non-sovereign agency and a capacity for ways of acting otherwise. In fact, it is this very non-sovereign agency that makes resistance possible—resistance which is not visible in Milgram’s articles on the experiments. While those published accounts define disobedience exclusively by an outright refusal to shock the learner, we found that subjects resisted in layered, multiple ways. Our model provides an understanding of Milgram’s experiments far different from that commonly echoed through popular discourse—such as the claims of Georgetown University business professor Edward Soule in a New York Times article. Speculating about how mid-level bank managers broke the law to please their superiors, Soule explained, “As human beings, we are predisposed to be obedient to authority, no matter how malevolent it may be” (Stewart, 2013, para. 19). At the very least, there is a danger in such generalizations about human nature. If there is anything our investigation of the Milgram subject files shows, it is the exact opposite: that despite a seemingly totalizing experimental apparatus, subjects can find ways to question, defy, and subvert authority. It is our hope that an understanding of the obedience experiments that is more attuned to the complexity of the laboratory world—from its institutional norms to its in situ performances—will both guide a broader vision of experimentation in psychology and inform individuals of their own potential to act otherwise.

Declaration of Conflicting Interests

The author(s) declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The author(s) disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This research was supported in part through fellowships funded by the Howard Hughes Medical Institute’s Undergraduate Science Education Program.

References

Ash, M. G. (1992). Historicizing mind science: Discourse, practice, subjectivity. Science in Context, 5(2), 193–207.
Barad, K. (2007). Meeting the universe halfway: Quantum physics and the entanglement of matter and meaning. Durham, NC: Duke University Press.
Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram’s “Behavioral Study of Obedience.” American Psychologist, 19(6), 421–423.
Brannigan, A. (2013). Genocide and the obedience paradigm. In Beyond the banality of evil: Criminology and genocide (pp. 1–21). Oxford, UK: Oxford University Press.
Butler, J. (1997). Excitable speech: A politics of the performative. London, UK: Routledge.
Derksen, M. (2001). Discipline, subjectivity and personality: An analysis of the manuals of four psychological tests. History of the Human Sciences, 14(1), 25–47.
Devereux, G. (1968). From anxiety to method in the behavioral sciences. The Hague, the Netherlands: Mouton.
Errera, P. (1963a, April 4). Obedient subjects. [Meeting conducted by Dr. Paul Errera]. Stanley Milgram Papers (Sanitized Data, Box 155A). Yale University Archives, New Haven, CT.
Errera, P. (1963b, April 18). Obedient subjects. [Meeting conducted by Dr. Paul Errera]. Stanley Milgram Papers (Sanitized Data, Box 155A). Yale University Archives, New Haven, CT.
Fivush, R. (2010). Speaking silence: The social construction of silence in autobiographical and cultural narratives. Memory, 18(2), 88–98.
Galison, P. (1987). How experiments end. Chicago, IL: University of Chicago Press.
Hanna, P. (2013). Reconceptualizing subjectivity in critical social psychology: Turning to Foucault. Theory & Psychology, 23, 657–674. doi:10.1177/0959354313493152
Hynes, M. (2013). Reconceptualizing resistance: Sociology and the affective dimension of resistance. The British Journal of Sociology, 64(4), 559–577.
Krause, S. R. (2011). Bodies in action: Corporeal agency and democratic politics. Political Theory, 39(3), 299–324.
Krause, S. (2012, March). Freedom beyond sovereignty. Paper presented at The Stanford Political Theory Workshop, Stanford, CA.
Latour, B., & Woolgar, S. (1985). Laboratory life: The construction of scientific facts (2nd ed.). Princeton, NJ: Princeton University Press.
Law, J. (2004). After method: Mess in social science research. New York, NY: Psychology Press.
Martin, E. (1987). The woman in the body: A cultural analysis of reproduction. Boston, MA: Beacon Press.
McCarthy, A. (2004). Stanley Milgram, Allen Funt, and me: Postwar science and the “First Wave” of reality TV. In S. Murray & L. Ouellette (Eds.), Reality TV: Remaking television culture (pp. 19–39). New York, NY: New York University Press.
Mele, A. R., & Shepherd, J. (2013). Situationism and agency. Journal of Practical Ethics, 1(1), 62–83.
Milgram, S. (1961a, May). Learning theory and constitutional factors. Stanley Milgram Papers (Series II, Box 46, Folder 164). Yale University Archives, New Haven, CT.
Milgram, S. (1961b, June). Social elements articulated through a system of action. Stanley Milgram Papers (Series II, Box 46, Folder 164). Yale University Archives, New Haven, CT.
Milgram, S. (1961c, October). Why obedience? A gestalt formulation. Stanley Milgram Papers (Series II, Box 46, Folder 164). Yale University Archives, New Haven, CT.
Milgram, S. (1962a, March). Three relational bonds. Stanley Milgram Papers (Series II, Box 46, Folder 163). Yale University Archives, New Haven, CT.
Milgram, S. (1962b, July). An underlying model of the experiments: Field of forces. Stanley Milgram Papers (Series II, Box 46, Folder 164). Yale University Archives, New Haven, CT.
Milgram, S. (1962c, July). One theory or many. Stanley Milgram Papers (Series II, Box 46, Folder 164). Yale University Archives, New Haven, CT.
Milgram, S. (1962d, August). Tension in the experiment: Interpretation. Stanley Milgram Papers (Series II, Box 46, Folder 164). Yale University Archives, New Haven, CT.
Milgram, S. (1963a). Behavioral study of obedience. Journal of Abnormal & Social Psychology, 67, 371–378.
Milgram, S. (1963b, December). Is there a unity to the experiments? Stanley Milgram Papers (Series II, Box 46, Folder 164). Yale University Archives, New Haven, CT.
Milgram, S. (1964a). Issues in the study of obedience: A reply to Baumrind. American Psychologist, 19, 848–852.
Milgram, S. (1964b, July). Three levels of relationship: Spatial, personal, social. Stanley Milgram Papers (Series II, Box 46, Folder 164). Yale University Archives, New Haven, CT.
Milgram, S. (1965). Some conditions of obedience and disobedience to authority. Human Relations, 18, 57–76.
Milgram, S. (1974). Obedience to authority: An experimental view. New York, NY: Harper and Row.
Milgram, S. (1977). The individual in a social world: Essays and experiments. Reading, MA: Addison-Wesley.
Miller, J. (n.d.). Content analysis of free responses in questionnaire sent out to subjects. Stanley Milgram Papers (Series II, Box 45, Folder 158). Yale University Archives, New Haven, CT.
Murata, T. (1963, August). Reported belief in shocks & level of obedience. Stanley Milgram Papers (Series II, Box 45, Folder 158). Yale University Archives, New Haven, CT.
Nicholson, I. (2011a). Shocking masculinity: Stanley Milgram, obedience to authority and the crisis of manhood in postwar America. Isis, 102, 238–286.
Nicholson, I. (2011b). “Torture at Yale”: Experimental subjects, laboratory torment and the “rehabilitation” of Milgram’s “Obedience to Authority.” Theory & Psychology, 21, 737–761. doi:10.1177/0959354311420199
Ortner, S. (1995). Resistance and the problem of ethnographic refusal. Comparative Studies in Society and History, 37(1), 173–193.
Patten, S. (1977). Milgram’s shocking experiments. Philosophy, 52, 425–440.
Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. New York, NY: The New Press.
Potter, N. N. (2011). Mad, bad, or virtuous? The moral, cultural, and pathologizing features of defiance. Theory & Psychology, 22, 23–45. doi:10.1177/0959354310385746
Reaction of subjects. (1962a). [Coded subject response cards]. Stanley Milgram Papers (Series II, Box 44). Yale University Archives, New Haven, CT.
Reaction of subjects. (1962b). [Condition 2 subject files]. Stanley Milgram Papers (Sanitized Data, Box 118). Yale University Archives, New Haven, CT.
Reaction of subjects. (1962c). [Condition 3 subject files]. Stanley Milgram Papers (Sanitized Data, Box 154). Yale University Archives, New Haven, CT.
Richards, G. (2002). The psychology of psychology: A historically grounded sketch. Theory & Psychology, 12, 7–36. doi:10.1177/0959354302121002
Rochat, F., Maggioni, O., & Modigliani, A. (2000). The dynamics of obeying and opposing authority: A mathematical model. In T. Blass (Ed.), Obedience to authority: Current perspectives on the Milgram paradigm (pp. 161–192). Mahwah, NJ: Lawrence Erlbaum.
Ross, L., & Nisbett, R. E. (2011). The person and the situation: Perspectives of social psychology. London, UK: Pinter and Martin.
Russell, N. (2010). Milgram’s obedience to authority experiments: Origins and early evolution. British Journal of Social Psychology, 49, 1–23.
Scott, J. C. (1990). Domination and the arts of resistance: Hidden transcripts. New Haven, CT: Yale University Press.
Spears, R., & Smith, H. J. (2001). Experiments as politics. Political Psychology, 22, 309–330.
Stam, H. J., Lubek, I., & Radtke, H. L. (1998). Repopulating social psychology texts: Disembodied “subjects” and embodied subjectivities. In B. Bayer & J. Shotter (Eds.), Reconstructing the psychological subject (pp. 153–186). London, UK: Sage.
Star, S. L. (1983). Simplification in scientific work: An example from neuroscience research. Social Studies of Science, 13(2), 205–228.
Stewart, J. B. (2013, July 5). Boss’s remark, employee’s deed and moral quandary. New York Times (Business Day). Retrieved from http://www.nytimes.com/2013/07/06/business/moral-quandaries-at-mf-global.html
Subject 2004. (n.d.). [Audio recording of experimental session]. Stanley Milgram Papers (Sanitized Data, T53Ui2). Yale University Archives, New Haven, CT.

Author biographies

Ethan Hoffman is a clinical psychology doctoral student in the Francis L. Hiatt School of Psychology at Clark University. His research examines discourse about mental health and gender,

particularly the ways that individuals negotiate dilemmas of agency and determinism in contemporary rhetoric regarding depression. Email: [email protected]

N. Reed Myerberg is a graduate student in the University of Cambridge’s program in the History, Philosophy, and Sociology of Science, Technology, and Medicine. He is currently working on a translation of Georges Canguilhem’s Vie et Mort de Jean Cavaillès as well as projects involving the intersection of science, art, and philosophy in post-1968 French thought. Email: nrm33@cam.ac.uk

Jill G. Morawski is Professor of Psychology and member of the Science in Society Program at Wesleyan University. Her research examines the historical dynamics of modern psychology, and her current project considers the history of research subjects and their relation to experimenters. Email: [email protected]


Essay Review

When subjects become objects: The lies behind the Milgram legend

Theory & Psychology, 2015, Vol. 25(5), 690–696. © The Author(s) 2015. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav. DOI: 10.1177/0959354315592062. tap.sagepub.com

Diana Baumrind University of California, Berkeley

Abstract
In her exposé of Milgram’s deceptions in obtaining “informed consent” and “dehoaxing” his subjects, Perry reveals Milgram’s deceit and misrepresentation in his dealings with colleagues as well as participants. Perry relies on evidence from Milgram’s unpublished papers and transcripts of his experimental proceedings to support her pejorative judgment on Milgram’s professional ethics. Although deception research, such as Milgram’s, clearly violates the informed consent clause of the APA Code of Ethics, it remains a modus operandi in child development and social psychology research. I argue that deception research proscribes informed consent and infringes the fiduciary obligation of psychologists to be trustworthy, and should be prohibited rather than justified by a cost-benefit analysis, as is presently the case.

Keywords deception research, fiduciary obligations, informed consent, Milgram’s lies

Gina Perry, Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments. New York, NY: The New Press, 2013. 352 pp. ISBN 9781595589217 (hbk).

Most people concur with most ethicists that lying is wrong, and requires justification (see Bok, 1978). In my view lying is not always wrong, but deceptive research is. In several essays (Baumrind, 1964, 1985, 2013) I criticized Milgram for violating the fiduciary obligation of a psychologist to be trustworthy, by lying to prospective participants in his obedience studies. Milgram (1964) justified lying to participants about the purpose and methods of his study when soliciting their participation, on the basis that the valuable knowledge he acquired was a higher-order good that could not have been procured as reliably by any alternative (non-deceptive or less stressful) methodology. Milgram’s

Corresponding author: Diana Baumrind, University of California, Berkeley, USA. Email: [email protected]

supporters believe that even if Milgram’s deceptions are considered a cost it is a cost justified by the benefit of the knowledge his research produced. Gina Perry disagrees, as do I.

The research psychologist as fiduciary

The use of deception in research violates the fiduciary obligation of psychologists to be trustworthy and respectful in their dealings with persons who have accepted the role of subject. In contrast to “caveat emptor,” which may apply in some sales relations, the preeminent duty of a fiduciary is trustworthiness. Fundamental moral principles of reciprocity, justice, and respect for persons are violated when the research psychologist deceives participants whose cooperation and compliance are granted with the expectation that the psychologist, as a fiduciary, is trustworthy and solicitous of their well-being. Milgram violated his fiduciary obligation to research participants to be trustworthy by his deceptive research, and to be respectful by his contempt for those who complied with the coercive prods of the “experimenter” to continue to shock the “learner.” Milgram’s disrespect is conveyed by the pejorative term “shockingly immoral,” which he used to condemn what he referred to as their “destructive obedience” to his confederates (Milgram, 1964, p. 849; 1974, p. 194).

When is consent informed?

After reading Milgram’s (1963) report of his obedience experiment, I was sufficiently appalled at his deception and psychologically abusive treatment of participants to say so in print (Baumrind, 1964). I followed this critique with a series of articles disagreeing on ethical grounds with the position of the American Psychological Association (APA) on the use of deception when obtaining “informed” consent. I asserted (Baumrind, 1971, 1972, 1978, 1979, 1985) that although informed consent does not require full disclosure of the researcher’s questions and hypotheses, an ethical code is meaningless if it fails to proscribe lying to participants about the purpose of a study or the part they will play in it, unless they have previously given written consent to that possibility. Deception research, such as Milgram’s, violates the right of participants to knowingly agree or refuse to participate in a particular experiment. The right to informed consent is inviolable and unqualified. Misinformed consent is a prima facie violation of the inalienable right of participants as autonomous beings to informed consent in the research they volunteer for. Misinformed consent is a morally unacceptable breach of trust in a fiduciary that should never be infringed by an investigator, or justified by a cost-benefit analysis, even for the sake of knowledge, in the name of science. However, withholding information about possible experimental manipulations or research hypotheses in order to minimize demand characteristics does not, in my view, qualify as deception research (see Baumrind, 1985, p. 165), provided that prospective participants have agreed in advance to temporary or permanent selective disclosure.

The APA code of ethics and Milgram’s deception research

In contrast to my condemnation of deception research, the utilitarian cost-benefit algorithm of the APA code of ethics (APA, 2002) permits deception when soliciting consent

to participate and throughout the experimental procedure, requiring only that the deception, in the researcher’s view, have prospective value, inflict no harm, and have no effective alternative. An “effective alternative” may be interpreted by the researcher as one that is as convenient and thrifty as the deceptive procedure, and “harm” is not defined to include lying to the participant.

Milgram’s lies to his participants began with a false description of what the research was about. In a newspaper advertisement, Milgram recruited volunteers ostensibly “to participate in a study of memory and learning at Yale.” Once recruited, volunteers were falsely told that they would be teachers in a learning experiment in which they would punish the learners with increasingly powerful electric shocks whenever the learner made a mistake. Milgram further lied to his participants in what Perry (2013b, p. 82) aptly calls a “deceptive debrief”: Rather than telling participants the truth—that the machine was a prop—participants were told only that the shocks were not as painful as they seemed. Thus, even when Milgram debriefed his participants, he did not “dehoax” (his term) them, in the event, he said, that he wished to continue his obedience experiments.

Milgram (1974, p. 198) justified his pervasive use of deception as serving a revelatory function akin to the illusion jointly created by the playwright and the theater-goer for the sake of the viewer’s enjoyment. However, Milgram’s analogy between theatrical fiction and experimental deception in the benefits they can incur ingenuously overlooks a crucial distinction: whereas theatergoers willingly suspend disbelief in an illusion created for their own entertainment, participants who are lied to are deceived without their informed consent, to serve the experimenter’s, not the participant’s, interests.
Since Milgram’s deception research had prospective value, inflicted no physical harm, and an effective alternative to deception could be more costly and less convenient, his hoaxes might well not violate the APA code of ethics today (although the stress he induced might). This renders hollow Provisions 8.07 and 8.08, which state that “psychologists do not conduct a study involving deception” and that they debrief promptly.

Perry’s exposé of Milgram’s deceptions

In Behind the Shock Machine: The Untold Story of the Notorious Milgram Psychology Experiments, Gina Perry (2013a) uniquely exposes Milgram’s use of deceit and illusion in his dealings with colleagues as well as participants. The term “notorious” (which connotes infamous, disreputable) in the title of Perry’s book signals that it is a critical exposé, not a morally neutral recitation or laudation of Milgram’s work. Perry’s journalistic sleuthing and careful fact-checking revealed deception not only in Milgram’s informed consent procedure and debriefing of his participants, but in his published accounts of his experimental procedures and findings. As a result Perry does not share the admiration Milgram and his study continue to inspire in his protagonists (e.g., see Blass, 2004; Elms, 1972; Haslam, Miller, & Reicher, 2014; Miller, 1986).

Perry made her pejorative judgment on Milgram’s ethics explicit in the title of her article in Theoretical and Applied Ethics. In that article – “Deception and Illusion in Milgram’s Accounts of the Obedience Experiments” – Perry (2013b) concluded “that

evidence from Milgram’s unpublished papers and original recordings and transcripts cast doubt on Milgram’s reliability as a narrator of the obedience research” (p. 90). Chapter titles in her book such as “Subjects as Objects” and “The Secret Experiments” convey that Milgram concealed details that did not support his narrative of what actually took place in the obedience experiments. Based on Milgram’s archival records, Perry documents her charge that in his published reports of the procedural details of his obedience research, Milgram presented them as more standardized than they actually were.

Perry’s methodological critique of Milgram’s study

Information Perry gleaned from interviews with former participants and “accomplices” (Milgram’s term for his assistants), as well as Milgram’s unpublished archival materials and published accounts, led her to question the construct validity and the generalizability of Milgram’s obedience study.

The construct validity of Milgram’s study of “destructive obedience” requires that participants actually believe they are delivering painful shocks to the learner. Based on information from Milgram’s assistants and the participants she could locate, Perry concluded that many, if not most, of the participants Milgram classified as obedient were not naïve, but chose to put their skepticism aside because they were affiliated with the investigator, and assumed he would allow no harm to come to the learner. Perry reports that of the participants who fully believed the learner was receiving painful shocks, 62.5% disobeyed the experimenter (see Perry, 2013a, p. 139, table).

Contrary to Milgram’s account, Perry found that the experimenter improvised as needed to pressure participants to continue. When participants expressed doubt about continuing, they were urged to proceed with four presumably “standardized” prods (which would violate all ethical codes today, which mandate that participants be told they are free to withdraw their consent at any point):

Prod 1: Please continue, or Please go on. Prod 2: The experiment requires that you continue. Prod 3: It is absolutely essential that you continue. Prod 4: You have no other choice, you must go on.

It is ambiguous at which step the participant was categorized as “obedient” for the published accounts. The fact that the delivery of these prods was not standardized casts doubt on the construct validity of Milgram’s categorization of most participants as “obedient” (that is, the 65% of participants who went to the maximum voltage).

Perry also questions the ecological validity and therefore the generalizability of Milgram’s findings. Orne and Holland (1968) had earlier questioned the extent to which “obedient” participants were actually deceived, and therefore the ecological validity of the study: they pointed to incongruities which must have raised suspicion; for example, the incongruity of the experimenter in a university-sponsored project calmly prodding participants to continue as the victim screams and demands to be released. The generalizability of Milgram’s findings to Holocaust atrocities, and to the obedience of American soldiers to their military superiors in the Vietnamese conflict (see Milgram,


1974, pp. 180–189) is compromised by the particularities of the experimental condition he created. Because participants were paid for their participation and recruited on the basis that they would contribute to an important scientific endeavor, their compliance in Milgram’s setting might well have reflected a sense of fair play and employee loyalty rather than the obedience Milgram characterized as “shockingly immoral” (1974, p. 194). Having agreed to participate, Milgram’s participants were presented with an ethical dilemma: either to refuse to fulfill a commitment for which they had been paid to serve the cause of science, in a study designed by a reputable scholar at a reputable university to study memory and learning, or (so they thought) to cause an innocent person physical pain, an act Milgram refers to pejoratively as “destructive obedience.”

Deception research is not ethically acceptable

Deceptive procedures are so much a part of psychology’s modus operandi today that they often are not even noted or justified. For example, in the recent staged research by Lyon et al. (2014) to study how age and condition affect revealing a misdeed, young children removed from their parents’ custody were approached by a stranger who urged them to conceal a manipulated toy breakage. During the debriefing the children were not dehoaxed, but were told (hypocritically) how important it was (for them) to always tell the truth. Neither the children’s court-appointed attorney nor any institutional review board objected to the manipulation of the children, and the researchers did not note or justify the deception in their article.

The objective of the Lyon et al. study, to learn how to elicit honest disclosure of a misdeed from children, did not, any more than Milgram’s objective to learn more about how to prevent destructive obedience, require lying to participants. When a fiduciary violates a person’s trust by deception research, the deceived participant—in the Lyon et al. study, a young child—is taught that he or she cannot trust those who by social contract are designated trustworthy, and that lying as a means to one’s end is not reprehensible. Yet it is doubtful that today’s APA Code of Ethics would prohibit deceiving participants in either the Milgram or Lyon et al. study, if the investigator simply claimed that scientific advantage justified the use of deception.

Closing thoughts

Perry’s book, even though written to appeal to a lay audience and despite its extensive and informative notes, needs an index and reference list to fulfill its scholarly mission.

In Chapter 3, Perry reports her extended conversations with Herb Winer, who had been a participant in Milgram’s experiment as well as a colleague at Yale. In 2000 Winer had initiated a brief correspondence with me, identifying himself as a “disobedient subject in the Milgram experiments, of which you were (and still, to my knowledge, are) the most serious critic.” I wrote to Winer to ask whether, from his perspective, Perry was correct that “behind his banter and professed admiration for Milgram Herb’s anger simmered” (p. 71), even though he was still “very glad” to have participated in the study. Winer (now age 92) called me, initiating a long and informative conversation. Although still ambivalent about Milgram’s procedures, Winer remembered his participation in Milgram’s

research more with pride and warm feelings than with the ambivalence and anger Gina Perry (2013a, p. 71) described (H. Winer, personal communication, June 14, 2000).

Presuming the accuracy of Perry’s fact-checking, Milgram’s (1963) version of these experiments can never again be accepted as the whole truth. Perry wrote that the ethical outcry beginning with Baumrind’s critique “eventually ended experiments like Milgram’s” (2013a, p. 123). Unfortunately I don’t think it did. Deception research is alive and well in social psychological research. As the recent Lyon et al. (2014) study indicates, deception research, including deceptive debriefing, need only be justified by its presumed social significance and enhanced experimental control to meet the permissive standards of the APA Code of Ethics. The decision to deceive is left largely to the investigator, who clearly is an interested, rather than objective, party. Most deception research is motivated more by convenience than necessity. Perry’s exposé and the incisive critiques of deception research in this special issue of Theory & Psychology will, I hope, block the present routine use of deception in social psychological research. The few experiments that, like Milgram’s, appear to require misinformed consent violate the inviolable right of research participants to informed consent, and should be proscribed. Surely, were their code of ethics to prohibit deception research, psychologists could use their ingenuity to devise non-deceptive alternatives to studying destructive obedience, and other socially significant phenomena.

Funding This research received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors.

References

American Psychological Association (APA). (2002). Ethical principles of psychologists and code of conduct. Retrieved from http://www.apa.org/ethics/code/principles.pdf

Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram’s “Behavioral study of obedience”. American Psychologist, 19(6), 421–423.

Baumrind, D. (1971). Principles of ethical conduct in the treatment of subjects: Reaction to the draft report of the Committee on Ethical Standards in Psychological Research. American Psychologist, 26(10), 887–896.

Baumrind, D. (1972). Reactions to the May 1972 draft report of the Ad Hoc Committee on Ethical Standards in Psychological Research. American Psychologist, 27(11), 1083–1086.

Baumrind, D. (1978). Nature and definition of informed consent in research involving deception. In The Belmont report: Ethical principles and guidelines for the protection of human subjects (Appendix, Vol. II, pp. 23–171; DHEW Publication No. OS 78–0014). Washington, DC: The National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research.

Baumrind, D. (1979). IRBs and social science research: The costs of deception. IRB: A Review of Human Subjects Research, 1(6), 1–4.

Baumrind, D. (1985). Research using intentional deception: Ethical issues revisited. American Psychologist, 40(2), 165–174.

Baumrind, D. (2013). Is Milgram’s deceptive research ethically acceptable? Theoretical and Applied Ethics, 2(2), 1–18.


Blass, T. (2004). The man who shocked the world: The life and legacy of Stanley Milgram. New York, NY: Basic Books.

Bok, S. (1978). Lying: Moral choice in public and private life. New York, NY: Pantheon Books.

Elms, A. C. (1972). Social psychology and social relevance. Boston, MA: Little, Brown.

Haslam, S. A., Miller, A. G., & Reicher, S. D. (Eds.). (2014). Milgram at 50: Exploring the enduring relevance of psychology’s most famous studies [Special issue]. Journal of Social Issues, 70(3).

Lyon, T. D., Wandrey, L., Ahern, E., Licht, R., Sim, M. P. Y., & Quas, J. A. (2014). Eliciting maltreated and nonmaltreated children’s transgression disclosures: Narrative practice rapport building and a putative confession. Child Development, 85, 1756–1769.

Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology, 67(4), 371–378.

Milgram, S. (1964). Issues in the study of obedience: A reply to Baumrind. American Psychologist, 19(11), 848–852.

Milgram, S. (1974). Obedience to authority. New York, NY: Harper & Row.

Miller, A. G. (1986). The obedience experiments: A case study of controversy in social science. New York, NY: Praeger.

Orne, M. T., & Holland, C. H. (1968). On the ecological validity of laboratory deceptions. International Journal of Psychiatry, 6(4), 282–293.

Perry, G. (2013a). Behind the shock machine: The untold story of the notorious Milgram psychology experiments. New York, NY: The New Press.

Perry, G. (2013b). Deception and illusion in Milgram’s accounts of the obedience experiments. Theoretical and Applied Ethics, 2(2), 79–92.

Author biography

Diana Baumrind is a research psychologist at the Institute of Human Development, University of California at Berkeley. Her ongoing interests and recent publications are on uses and abuses of power assertion in parenting; development of competence in children; and research ethics, in particular ethics of deception research.


Review

Understanding the unthinkable

Theory & Psychology, 2015, Vol. 25(5), 697–700. © The Author(s) 2015. Reprints and permissions: sagepub.co.uk/journalsPermissions.nav. DOI: 10.1177/0959354315590436. tap.sagepub.com

Augustine Brannigan, Beyond the Banality of Evil: Criminology and Genocide. Oxford, UK: Oxford University Press, 2013. 261 pp. ISBN: 9780199674626 (hbk).

Reviewed by: Matthew P. Unger, University of Alberta

Canada’s recent attempts at understanding and addressing issues related to mass ethnic violence and genocide reflect international social trends seeking appropriate methods of societal reconciliation and criminological responses. The 2013 Canadian Truth and Reconciliation Commission, investigating the abuses and experiences of Aboriginal children in the Indian Residential Schools system, brought to the fore of Canadian consciousness both the violence of Canada’s colonial past and the necessity for expression, resolution, and reconciliation (Stanton, 2011). In 2009, the indictment of Rwandan foreign nationals involved in the Rwandan genocide was an indication of the current trend towards domestic prosecution of foreign national war criminals and people involved in genocidal activities. Finally, the trend towards the increasing governance of previously relatively ungoverned areas as a way of redressing previously traumatized places and preventing future atrocities is reflected in the circuit courts of the eastern Canadian Arctic (p. 217). These events have brought to the Canadian consciousness the complexities associated with addressing past mass traumas, public and governmental recognition, and the necessary healing and recovery after episodes of mass ethnic violence. As Brannigan states in his exploration of the criminological responses to genocide and ethnic violence, “there are gaps between memory (truth telling through the recollection), justice (establishing culpability of specific individuals according to stringent legal standards based on such revelations), and reconciliation (the subsequent ‘healing’ of previously divided communities)” (p. 195).
Augustine Brannigan’s study of the social psychological and criminological responses to genocidal activities is a fascinating and detailed contribution to reframing the foundations of predominant interpretations of the genocidal mentality, social discourses, and the various transformations of the domestic and international legal response in contexts that have experienced mass ethnic violence. The title intentionally evokes Hannah Arendt’s (1963/2006) interpretation and coverage of the Adolf Eichmann trial in Jerusalem in 1961. The influence of Arendt’s coverage of the Israeli trial came from the stark conclusion that the most horrific and brutal violence in human history was carried out by seemingly normal people within a complex bureaucracy, carrying out their duties as any other worker might in a normal desk job. The understanding that authority established a kind

of natural “agentic” state that decreased people’s sense of culpability, responsibility, and guilt spurred both controversy and a wave of social psychological studies, most notably the in/famous Milgram experiments. Through his own fieldwork in Rwanda, recent studies on genocide and the Holocaust, and the various international and domestic responses, Brannigan offers what he terms a “criminological odyssey” that seeks to redress perceived shortcomings of the literature grounded in Arendt’s original “banality of evil” thesis.

Brannigan’s book is divided thematically into three major components, which together reflect a surprising breadth of scholarship. The first section offers an evaluation of Arendt’s banality of evil thesis and its exposition within social psychological thinking. He evaluates a considerable body of research into Adolf Eichmann and his ascribed culpability and role within the atrocities of the Holocaust. Out of this, the authority thesis of the original Milgram experiment cannot stand up to the criticisms and revisions of later experiments. Rather, adherence to duty might actually interpret Eichmann’s case in more astute terms. The second section seeks to ameliorate misconceptions implicit within these earlier social psychological studies by introducing Norbert Elias’ imaginative work. His engagement with Elias allows Brannigan an historical interpretation of the human subject that establishes a relationship between modernizing processes and the development of genocidal mentalities. The third section offers an exploration of the development of the cosmopolitan/international legal and criminological responses to genocidal activities—its paradoxes and complications.
He evaluates the extent to which the truth and reconciliation commissions that have been developed and deployed throughout the world are effective in contexts that have experienced great trauma. Brannigan traces the intentional European progression of society from sovereign-based forms of governance that predispose a genocidal mentality towards broad forms of democratic political participation that restrain sovereignty politics. One of Brannigan’s major normative standpoints is revealed in the last section of the conclusion, where he advocates for a responsible government that evokes widespread collective political participation and representation. What we see in Brannigan’s study is both the complicated progression towards effective cosmopolitan law and the increasing localization of ways in which societies apprehend, address, and redress recent and historical atrocities.

Brannigan offers several telling insights into the field, especially his implicit critical standpoint towards essentializing historical/ancient conflict as the root of genocidal activities, in favor of the complicated and contingent history of policies developed during colonialism. One implication that I take from this standpoint is that we need to examine how historical symbols and events serve as virtual signifiers that can be put into the service of contemporaneous concerns and anxieties. I wonder if Brannigan’s research, while enormously comprehensive and thoughtful, might benefit from broader interpretations of the modern genesis of genocide in the development of nation-state politics, which may also provide a counterpoint to the social psychology interpretations of reduced culpability in the presence of a strong sovereign. For instance, Lieberman (2006) suggests through close historical research that it was not just the government that perpetrated ethnic violence, but that whole swathes of the populace were involved in practically each conflict.
To my mind, Brannigan’s work allows scholars to think about the extent to which broad social discourses contribute to the development of ethnic conflict and genocidal

events. Readers of this journal might want to consider, with Brannigan’s insights, the normative and meaningful frameworks that motivated people to participate. Understanding that the response to genocide is complicated by the broad social legitimation of the eradication of certain racial, ethnic, and other demographics within state borders is also implicitly acknowledging the manner in which social discourses frame the judgment and ethical relations of a people. For instance, another interpretation of the social psychological experiments influenced by Arendt’s thesis and Milgram’s original studies might actually suggest that deference to authority and duty is, rather than an essential psychological characteristic, a social, historical, and contingent experience of normative belonging. These social psychological studies do not necessarily capture the fields of normativity, the significance of the collective consciousness, and the rationalities embedded within certain historical atrocities. Could we not expand this understanding of the subject by assuming that people act in a world of broad societal affective discourses that frame what people find important? If we begin with this understanding, then we can move towards making the link between the social psychological and the international community’s response to help prevent or curb genocidal activities more tenable—this is precisely the kind of thinking that Brannigan’s text works towards.

I am appreciative of Brannigan’s evaluation and revision of the studies on authority post-Milgram, though it appears at times that he may be over-reliant on David Cesarani’s (2006) recent and authoritative biography of Eichmann for his interpretation of Arendt. If this is so, then this could limit his interpretation of Arendt’s deceptively complicated thesis.
In one mention of Arendt in particular, he seems to have evoked earlier misinterpretations of the “banality of evil” thesis. After quoting Cesarani’s derisive interpretation, he writes, “When we think about genocide, we have been conditioned to think of the ‘banality’ of evil. Were we to take the perspectives of its advocates, there is nothing banal about it” (p. 21). Yet this seems to be a conflation of the social psychological experiments with the ideas of Arendt, as well as a reaffirmation of earlier critiques of the thesis that were based on fundamental misinterpretations. My intuition is that the legacy of the social psychological research has led to a flattening of Arendt’s ideas and that, if we go back to the source, her ideas might in fact help achieve some of the goals and intentions of this book.

For instance, two of the main goals of Brannigan’s text are to reformulate, or nuance, understandings of genocidal mentalities and to expose the complexities of the different criminological responses to large-scale ethnic violence. Judith Butler (2011), summarizing Arendt, writes that it is not the evil of such atrocities that has become banal, but rather that genocide reflects a general tendency in modern society to destroy ethical thoughtfulness. If we take this perspective, we might also get close to how Brannigan wants to revise the authority thesis and reconstruct a social historical ontology to apprehend the possibility of such events. In fact, by acknowledging Arendt’s assumption that genocide is unprecedented and somehow an implicit aspect of the modern project (Bauman, 2001), we might be able to reformulate the atomistic conceptions of the subject that both mainstream criminology and social psychology assume.
I see this book as an excellent reference for understanding both the academic and political legacy of the banality of evil thesis, with a strong evaluation of the critical contributions and complications that subsequent social psychological experiments bring to bear on that initial windfall of an insight. Because of this breadth, I consider it essential reading for both expanding and historicizing some of the most significant studies of the previous century. Furthermore, it provides a unique contribution to genocide studies by connecting the different social responses to genocidal activities to the fields of social psychology and criminology. Brannigan’s rigor and comprehensiveness make the book suitable for upper undergraduate and graduate level work, as well as an original and important piece of academic research for scholars working in genocide, trauma, and conflict studies.

References

Arendt, H. (2006). Eichmann in Jerusalem: A report on the banality of evil. New York, NY: Penguin Books. (Original work published 1963)

Bauman, Z. (2001). Modernity and the Holocaust. Ithaca, NY: Cornell University Press.

Butler, J. (2011, August 29). Hannah Arendt’s challenge to Adolf Eichmann. The Guardian. Retrieved from http://www.theguardian.com/commentisfree/2011/aug/29/hannah-arendt-adolf-eichmann-banality-of-evil/print

Cesarani, D. (2006). Becoming Eichmann: Rethinking the life, crimes, and trial of a “desk murderer”. Boston, MA: Da Capo Press.

Lieberman, B. (2006). A terrible fate: Ethnic cleansing in the making of modern Europe. Chicago, IL: Ivan R. Dee.

Stanton, K. (2011). Canada’s truth and reconciliation commission: Resettling the past. The International Indigenous Policy Journal, 2(3), 1–18.
