The General Brock University Undergraduate Journal of History

Volume 3

April 2018

ISSN 2371-8048

Brock University, St. Catharines, Ontario


Editors-in-Chief
Matt Jagas and Grace Viana

Editorial Board
Amanda Balyk, Kaitlyn Carter, Lucas Coia, Rebecca Nickerson, Vince Savoia

Cover Design
Chance Mutuku

Special Thanks
The Brock University History Department
The Brock University Historical Society
Brock University Printing and Digital Services
Dr. Jessica Clark, Assistant Professor
Dr. Colin Rose, Assistant Professor
Dr. Daniel Samson, Associate Professor and Chair
Tim Ribaric, Digital Services Librarian
Niagara Military Museum

Contents

Foreword, By: Dr. Colin Rose, 5
Introduction, By: Matt Jagas and Grace Viana, 7
Early Modern Midwives: Carving a Path in the Male-Dominated Field of Medicine, By: Rachel Bedic, 9
The United States and Sputnik: a Reassessment of Dwight D. Eisenhower's Presidential Legacy, By: Matthew Bologna, 29
The Spectacle of Death, By: Carina Cino, 55
“The Last Battayle is Atte Hande”: Conceptions of Death in Renaissance Italy, By: Lucas Coia, 77
The Trials of Jamestown: an Investigation of the External Factors Influencing England’s First American Colony, By: Ally Dries, 101
Science versus ‘Science’: Exploring the Life of Benjamin Banneker in the Context of Thomas Jefferson’s Views in Notes on the State of Virginia, By: Emma Evans, 111
Art in Early Modern Italy: Artemisia Gentileschi and Caravaggio, By: Joslin Holwerda, 122
The Trials and Significance of Nazi War Criminals and Collaborators in, By: Gabrielle Marshall, 150
Vietnamese Farmers That Changed the World: the Impact of the Vietnam War on the Cold War, By: Michael Martignago, 163
The Formidable Widow: a Comparison of Representations and Life Accounts of Widows in Early Seventeenth-Century England, By: Rebecca Nickerson, 191
Enemies Within or Without Enemies: “Enemy Aliens” and Internment in Canada during the Second World War, By: Tricia Nowicki, with an introduction by Dr. Daniel Samson, 206

Foreword

When I was a graduate student, attending the job talks of prospective professors at the University of Toronto, I had in my back pocket a question that I levelled to each candidate: as a professor, do you see yourself as teaching history, or training historians? I thought I was very clever, trying to stump these poor souls with what, by the measure I had in mind, was a false dichotomy. A working historian requires a wide body of knowledge, informed by reading vast bibliographies of secondary and primary sources. But we also need the skills and habits of mind to interpret those materials according to their own schema, to understand their place in broader conversations among scholars, to fit the documents of the past into their historical contexts. So the question of teaching history versus training historians was nonsensical, as we strive to do both at the same time.

As it turns out, the skills necessary to study history and the knowledge developed along the way are precisely the sorts of skills and knowledge that allow us to participate broadly in our contemporary society. A keen eye for the subtle differences between political or social thought—gleaned from the study of the dreaded historiography—lets us unpack the meaning behind the words of politicians and activists, to evaluate the relative positions of their platforms. A deep knowledge of the past lets us understand that these ideas have their own genealogies, have appeared in societies before, and have had consequences that may not seem obvious at first. The thick reading of historical documents gives us the ability to understand the broader societal contexts in which political and social movements develop and to understand how the meaning of ideas can shift over time according to those contexts.

If 2017 was an uncertain year in Canadian and global politics, 2018 perhaps provides more certainty, but not the comforting kind. Across the developed world, the sharp rise of political populism threatens the stability of social democracies and the general liberal order within which our institutions have developed. The very premise of expertise—that rigorous training, skilled deployment of that training, and the marshalling of factual evidence should guide decision-making—has come under attack not just in Trump’s America, but in Europe, and here in Ontario as well. A large proportion of the western population feels disenfranchised by the meritocracy of expertise, imperfect as it is, and has reacted against the very idea of democratic rule. Young people in the United States, across the political spectrum, now register 29% agreement with the statement that “democracy is not always preferable.”1

As historians, part of our job is to place present society in its historical context. We know what happens when the legitimacy and strength of social and political institutions crumble, when the ties that bind people together dissolve under the assault of a politics of blame and division.

1 Yascha Mounk, “The Strongman Gap,” Slate, 13 March 2018, https://slate.com/news-and-politics/2018/03/support-for-democracy-is-a-partisan-issue-now.html, accessed 15 March 2018.

If this seems a dreary way to foreword Volume III of The General, I think that the research herein provides hope yet. With knowledge of the past comes the responsibility to deploy that knowledge, to participate in the social and political movements of the day, and to bring the skills of historical study to bear on the problems we currently face. The essays in this volume deal with thorny issues of gender relationships and the status of women in society, how we treat the bodies of criminals and social outcasts, the impact of environmental and political instability on societies, and the consequences of globalized political conflict. Within our own department we have students directly engaging with the historical contexts of the major problems we face today. With keen eyes towards the subtleties of documentary evidence, these student-researchers are taking their rightful place in the important public conversations that we must have.

And that cheers me. Not just at Brock but in universities across Canada and the West, history retains its vital role as a guide to understanding the present. My own students have proven deeply sensitive to the historical threads connecting the study of the past to the world around them. In my courses on violence, warfare, and social inequality in the early modern world, my students have a clear vision of the relationship of past problems to future concerns. As they move into the next stages of their lives, they are equipping themselves with the knowledge of history and the skills of historians to tackle the challenges that lie ahead. So I smile and remain hopeful.

Colin Rose
Assistant Professor of European and Digital History

Introduction

In the fall of 1772, after graduating from Princeton University, the renowned American statesman William Bradford wrote to his college friend and future president of the United States, James Madison, about what he planned to do now that he had graduated:

What business I shall follow for life I have not yet determined. It is a matter which requires deliberation & as I am not pressed by Age I intend to be in no hurry about it. I propose making History & Morality my studies the ensuing winter, as I look upon them to be very necessary in whatever employment I may hereafter engage.1

The uncertainty that Bradford had about his future after graduation may seem familiar to many undergraduate students at Brock. Yet we might find encouragement in comparing it to his future accomplishments as a Commander in the American Revolutionary Army and as the second United States Attorney General under George Washington. For us in the History Department, his comments on the study of history and morality should be especially encouraging. So often we are asked where a history degree might lead us after graduation, with the assumption that there is little besides teaching. Bradford, however, seems to suggest that there is much more: the study of history leads one to acquire tools and skills that are not commonly found elsewhere but are essential to any future endeavour. The skills to read and analyse critically, to write efficiently and concisely, and above all to think historically are acquired by the student of history and, as Bradford knew, would be essential for whatever business one "shall follow for life."

Bradford's grouping of morality with history addresses some of the many skills that students learn during their studies. As historians we have a moral duty to the inhabitants of the past and the ways in which we interact with and represent their history. At Brock, young historians develop empathy toward others in and outside of the classroom: whether it be empathy for those long past in the dusty documents of the archives or for our fellow classmates in long and involved seminar discussions. By learning to understand and value the contributions of others we expand and connect ideas from ancient Mesopotamia to twenty-first century Canada and gain perspective on important topics. As this ability to celebrate the subtleties of the human experience develops, we learn to apply these tools to the present world. The History Department at Brock is a community of individuals connecting with and expanding on the ideas of others towards a collaborative understanding of history that would not be possible without the enthusiastic input of everyone involved.

As editors of this third edition of The General Brock University Undergraduate Journal of History, our aim is to showcase the work of our student historians, continuing the work of the past two volumes of The General in displaying their engagements with the past and their contributions to the collaborative understanding

1 “William Bradford to James Madison, October 13, 1772.” Founders Online, National Archives, last modified 1 February 2018, www.founders.archives.gov.

of history. As well, in the following essays we can see the many skills and tools of the historian in action, skills and tools that remain as essential for success today as they were in Bradford's time.

Alongside The General this year we have been able to see the beginning of a new project that will extend our history community at Brock to the Niagara community at large. The History Lab will be working closely with the Niagara Military Museum and Seedling for Change in Society and Environment to provide students with the chance to begin thinking about military history differently, alongside members of the Niagara community. Guest lectures, commemorative events, and community outreach activities will provide students and faculty with the opportunity to apply history outside the classroom. It will be a partnership that combines tradition and innovation to explore new ways of thinking about important topics; together they are working towards the history of the future.

We would like to extend our thanks firstly to all the authors who submitted essays to The General and especially to those whose work we have included in this volume. There were many fantastic submissions and choosing those to include was not an easy task, but we feel those selected represent the range of fantastic scholarship that Brock students undertake. We would like to thank all of the professors of the Brock History Department. In particular, we thank Dr. Clark for her vital guidance and oversight of this project and Dr. Samson for the opportunity to work on this project. We also thank Dr. Rose for his foreword as well as our brilliant team of editors and the Brock University Historical Society for all of their contributions in making this a quality journal. Thanks go to Tim Ribaric for assisting with the technical portion of the journal and Chance Mutuku for providing us with this volume's wonderful cover art. We would like to also extend our thanks to the Niagara Falls Public Library for hosting the celebration of our launch and the Niagara Military Museum for stocking copies of the journal for the public.

To our readers: we have chosen to include essays on a variety of topics from different undergraduate courses offered at Brock, which we believe best display the talent and range of work that our student historians are engaged in here. We hope that you find in these a sample of what our history students at Brock have achieved, the skills they have acquired during their time here, and what they are capable of in the future. In the words of Bradford, these studies will be "very necessary in whatever employment [they] may hereafter engage."

Matt Jagas and Grace Viana
Co-Editors-in-Chief
April 2018

Early Modern Midwives: Carving a Path in the Male-Dominated Field of Medicine

By: Rachel Bedic

The study of medicine during the Renaissance has primarily focused on famous male physicians who revolutionized the way medicine was practiced. Famous male practitioners such as William Harvey (1578-1657), Andreas Vesalius (1514-1564), and Ambroise Paré (1510-1590) have become a consistent focus for historians when discussing early modern medicine.1 The presence of female medical practitioners in peer-reviewed journal articles and books has unfortunately lacked prominence. Esteemed feminist historians such as Patricia Crawford and Sara Mendelson have provided some explanation as to why texts written about female medical practitioners are limited. According to Crawford and Mendelson, “[f]ormal records relating to female surgeons and physicians are hard to find, but there are hints that more women were involved than the licencing papers indicate.”2 Other historians like Thomas Benedek have also examined the lack of sources for female medical practitioners. He claims “we hardly have even biased descriptions by men of the activity of midwives [because women] were simply ignored in medical writing until the end of the sixteenth century.”3 In this essay, I will be examining the limited but particularly revealing sources on women in the medical field to demonstrate the realms of knowledge of these women as well as how their authority was undermined due to their gender.

Women in the early modern era were faced with numerous challenges and risks when pursuing a career in the medical field, due to their gender and status in society.

1 Benjamin Lee Gordon, Medieval and Renaissance Medicine (NY: Philosophical Library, 1959), 625.
2 Sara Mendelson and Patricia Crawford, Women in Early Modern England, 1550-1720 (Oxford: Clarendon Press, 1998), 318.
3 Thomas G. Benedek, "The Changing Relationship Between Midwives and Physicians During the Renaissance," Bulletin of the History of Medicine 51, no. 4 (1977): 551.

Women who pursued a career in midwifery faced constant persecution from the church and their male counterparts for not having formal education, despite having better knowledge of the female body. Yet even with this undermining of their authority, the natural knowledge of women in the early modern period still allowed them to thrive in the medical field as midwives: although they lacked the formal education their male contemporaries received, their informal knowledge of the female body greatly compensated for this lack of formal training.

Education was a major issue for women pursuing a career in medicine. Women were not allowed to go through the same channels as their male counterparts to receive formal health education and medical training.4 To become licensed in the early modern era, aspiring practitioners needed to receive formal medical education and training at a university. In most cases, these practitioners also needed to complete a formal apprenticeship in addition to their university degree in order to obtain a full medical licence.5 During the Middle Ages, universities became esteemed learning centres for scholars. They emerged from a multitude of religious centres such as cathedral schools but began to take off as leading educational institutes for subjects beyond religion and theology. The three dominant subjects at universities were theology, law, and medicine.6 Historian Kevin Madigan estimates that by the start of the early modern period there were around eighty universities in Europe and the British Isles.7 Universities certified their students in medicine and afforded them the knowledge to create a formalized and certified medical practice, which

4 Douglas Biow, Doctors, Ambassadors, Secretaries: Humanism and Professions in Renaissance Italy (Chicago: The University of Chicago Press, 2002), 48.
5 Mendelson and Crawford, 318.
6 Kevin Madigan, Medieval Christianity: A New History (New Haven: Yale University Press, 2015), 266.
7 Ibid.

it was hoped would attract patients based on the prestige of a university-educated doctor.

Since universities were male-only learning institutions, women were disadvantaged due to their lack of formal education. Without being allowed to attend university and thereby gain and expand their medical knowledge, women began to be portrayed as incompetent medical practitioners. As a result, they had to work much harder than their male counterparts to show they were capable and qualified to practice medicine. Without years of formal medical training, many women were unable to treat diseases and some common illnesses. Many female midwives did not have the knowledge to use the new medicine of the time, which created further barriers. Consequently, many of these female medical practitioners looked towards natural healing in order to treat their patients. According to Merry Wiesner-Hanks, natural healing remedies involved the use of herbs which were prepared by people with less formal training.8 Many women determined to practice medicine would turn to herbal remedies to cure their patients; herbal medicine compensated for their lack of formal education and medical training.

A 1553 testimony from Denmark highlights the extent of one woman’s medical knowledge. As stated in Wiesner-Hanks’s introduction to the text, medical treatment in Denmark and in the majority of Europe was handled by a hierarchy of individuals ranging from university-trained physicians (who were all male) to medical practitioners who were less formally trained, such as apothecaries and midwives.9 The testimony took place on Friday 19 February 1553 in front of the mayors and city council in “Malmø and

8 Merry Wiesner-Hanks, “Vesalius’s On the Fabric of the Human Body (1543) and Elizabeth Blackwell’s A Curious Herbal (1747-9),” in Early Modern Europe 1450-1789 (Cambridge: Cambridge University Press, 2012).
9 Merry Wiesner-Hanks, “A couple, a medical practice, and an accusation, Denmark 1553,” in Early Modern Europe 1450-1789 (Cambridge: Cambridge University Press, 2012).

Valentin Køler, Reeve ibidem.” A couple named Johan Krumpis and Magdalena Krumpis were called to stand in front of city council while their patients testified that the couple had an honest medical practice. From this account, we can gain a better knowledge of how women practiced medicine.

One woman named Marine Bartolemæus testified that the doctoress Magdalena “had helped [her and her husband] in one way and the other way with herbs, water and other things, so that [they] thanked her a lot, because she had helped [them].”10 The testimony reveals that the doctoress was healing her patients with natural remedies such as herbs. Although the testimonies do not specifically describe in great detail how the couple healed each of their patients, it can be inferred that they each had their own realm of medical knowledge. As stated in Marine Bartolemæus’ testimony, the doctoress used herbs to cure her patients. It can be deduced that the doctoress did not have any formal apprenticeship but still did have an acceptable amount of herbal knowledge, especially because she was able to successfully cure her patient.

In most of the testimonies concerning her husband, there is very little description of his methods; the exception is Hans Nieleson Skormagere, who testified “that the doctor gave him a drink, which neither harmed him or did anything good.”11 Although it is not specified what type of drink the patient was given, it could have been a pharmaceutical concoction the husband learned from another male practitioner or from some kind of formal apprenticeship. Ironically, based on the testimonies, the wife’s herbal remedies seemed to work more efficiently than the husband’s likely pharmaceutical remedies. The testimony shows that although early modern women did not have formal medical knowledge, they

10 Ibid.
11 Ibid.

were still able to efficiently cure their patients. It also shows that the different realms of knowledge each medical practitioner gained during their training were based on their gender; the women were more likely to use natural remedies whereas men were more likely to use formalized medicine.

This 1553 Denmark testimony also reveals the division of labour between the couple. Out of the ten testimonies given at the court, two specifically claimed that the wife Magdalena had helped them or their daughter give birth. A third testified that Magdalena had helped his wife, though he did not explain with what. In each of these two (possibly three) births, it was Magdalena who attended the patient. This suggests her husband acknowledged his wife had more medical knowledge than he did when dealing with female anatomy. For this reason, Magdalena was allowed to deal with each expectant mother who came to the couple asking for assistance in giving birth.

Women faced further barriers in gaining medical knowledge as they were unable to join medical guilds and organizations. Not only were these guilds beneficial for learning how to practice medicine, but they were also the early forms of unions, which protected and aided their members. According to Madigan, the majority of guilds were actually created in universities, which is why the modern term for university is derived from the Latin term for guild: universitas.12 Since women were not allowed to attend university, they would have also been excluded from joining a guild. Guilds controlled who was allowed admission and what each member needed to demonstrate in order to show they were competent. Guilds also decided when each member would move up in

12 Madigan, 266.

rankings from novice to master.13 Without guild membership, it was very hard for a woman to show her level of medical skill without being formally ranked by the hierarchy of the medical community.

Guilds also proved to be another obstacle for female medical practitioners because they created unfair competition by excluding women from the medical community. Madigan states guilds were groups of “men organized to protect common economic interests.”14 He also discusses their political authority in dealing with their organization. Madigan’s statement implies these guilds would protect their members by removing outside competition. Wiesner-Hanks is more explicit in describing how guilds dealt with outside competition. She states “medical practitioners who had received formal training protested to civic authorities when individuals were practicing medicine without being a member of their guild or organization.”15 Since there would be no female members in a medical guild, women would be the main source of competition and the biggest concern for the guild.

In a sixteenth-century petition to the city council of Munich, Germany, a female medical practitioner stood before the council to defend herself against charges laid against her by a medical guild. The guild asked for Katherina Plumanerin Carberinerin (the medical practitioner) to be forbidden from treating or examining any further patients.

In this petition, when answering the charges against her for practicing medicine outside the guild, Carberinerin claimed the guild was jealous of her practice. She stated, “not one person who has come under my care has a complaint or grievance against me. If the

13 Ibid., 267.
14 Ibid., 266.
15 Merry Wiesner-Hanks, “Woman’s petition to be allowed to practice medicine, Germany sixteenth century,” in Early Modern Europe 1450-1789 (Cambridge: Cambridge University Press, 2012).

doctors, apothecaries, or barber-surgeons have claimed this, it is solely out of spite and jealousy.”16 Carberinerin did not allow the guild to assert false claims that she had failed to provide adequate medical attention. She instead exposed the guild’s true intention to impede her by stating they only made this complaint because she was not part of their organization. It can also be inferred that they made the complaint due to her gender, fearing that a woman was intruding into a male occupation.

Carberinerin further defended her case by stating that the medical guild did not have any real proof to condemn her or forbid her from practicing medicine. After stating that she had been informed of this petition, she claimed the decision to forbid her from practicing medicine was “arrived at because of malice and not through fault of [her] own.”17 She went on to state “[the decision] appears to me not only strange, but also totally deplorable.”18 Without substantial evidence to condemn her, Carberinerin tried to expose the guild for their persecution of her practice by challenging them to provide proof of her inadequacies.

Carberinerin then tried another tactic. She focused on her duty as a woman to “use [her] feminine skills, given by the grace of God, only when someone entreats [her] earnestly.”19 She also stated, “I never advertised myself, but only when someone has been left for lost, and they ask me many times.”20 By claiming that she was not a professional medical practitioner, Carberinerin was not only undermining her own authority, she was also undermining her professionalism. Carberinerin’s petition to be allowed to practice

16 Ibid.
17 Ibid.
18 Ibid.
19 Ibid.
20 Ibid.

medicine provides a glimpse into the attitudes of early modern society and specifically the legal system. Because she framed her practice as charity, the city council would most likely have allowed her to continue, since it would not look good to deny a woman the right to continue her Christian charity. Carberinerin blatantly stated “I do whatever I can possibly do out of Christian love and charity…”21 By stating she was just doing her Christian duty as a woman to help others, Carberinerin could easily prove that she was not a threat to the business of the male practitioners in the guild. Continuing her practice unfortunately meant undermining her own authority and professionalism. However, the case reveals the determination of some women to continue practicing medicine.

Women continued to practice medicine despite constant persecution and threats to their authority because they had gender-specific and unique knowledge. Being in touch with their own bodies meant women were naturally educated on much of the female anatomy and therefore had medical knowledge that male practitioners did not. In many cases this is what kept women like Carberinerin in the medical field, because her “feminine skill” made her popular with her female patients. There was also the issue of trust that caused many women to favour female medical practitioners over educated male practitioners. As stated by Carberinerin, “at all times, as is natural, women have more trust in other women to discover their secrets, problems, and illnesses than they have trust in men.”22 This belief was especially true when women were in labour, as having a female midwife was more reassuring: the midwife was familiar with her patient’s body and would be less intimidating than a male midwife.

21 Ibid.
22 Ibid.

In many cases, it was men who sought the assistance of a female midwife to help their wives give birth.23 Carberinerin commended husbands who did this, declaring: “honourable husbands who love and cherish their wives will seek any help and assistance they can, even that from women, if the wives have been given up (by the doctors) or otherwise come into great danger.”24 Female midwives’ knowledge of the female body made them naturally talented medical practitioners, which is why they gravitated toward midwifery, the branch of medicine dealing specifically with the female body. Being knowledgeable of female anatomy also made female midwives more sought after, even though their male counterparts held more formal medical knowledge. By pursuing midwifery, women held authority and were able to represent themselves as sought-after professionals in the medical field.

Although the peasantry sought midwives in the early modern era, the nobility favoured the services of male midwives to aid their wives in giving birth. In 1670, a male midwife named Julian Clement helped the chief mistress of King Louis XIV of France give birth to their son Louis-Auguste de Bourbon, who became known as the Duc du Maine. Twelve years later, in 1682, Clement also delivered Louis of Burgundy, who became the Dauphin of France.25 For aiding the nobility, Julian Clement received the title of accoucheur, which became the proper term for a male midwife. Clement’s growing popularity among the nobles created a new trend. As stated by Haggard, “Male midwives became the fashion among ladies of the court. The princesses of the period hastened to

23 Mendelson and Crawford, 140.
24 Wiesner-Hanks, “Woman’s petition to be allowed to practice medicine, Germany sixteenth century.”
25 Howard Haggard, Devils, Drugs, and Doctors: The Story of the Science of Healing from Medicine-Man to Doctor (Chicago: Norman Press, 2012), 43.

place themselves under the care of accoucheurs.”26 In Clement’s case, he aided the wife of Philip V of Spain and delivered three of their children.27

While male midwives may have been more popular with the nobility, they were not as effective or as knowledgeable as female midwives. Thomas Benedek makes a similar argument in his own work. He argues that female midwives “possessed a certain amount of practical knowledge in the restricted area of the signs and symptoms of pregnancy, labour, and its complications which physicians and surgeons still generally lacked in the 16th century.”28 The male midwives who were sought after by the nobility were most likely popular due to their services being requested by royalty. Male midwives were not as practical as female midwives because women knew their patients’ bodies as well as their own. In addition to their own natural knowledge, many midwives were widows and had already gone through the process of giving birth.29 They had personal knowledge of what should happen based on their own labour, and they were empathetic toward patients going through the same painful process. This empathy was impossible for any male midwife, who would never feel the pain of childbirth or have the fear of common deadly complications.

Midwives who had gone through the process of childbirth themselves would be more willing to ease its pains and to make their patients as comfortable as possible during the process. Although female midwives may have seemed the better choice, male midwives continued to gain popularity due to their formalized medical training and official licensing.

26 Ibid. 27 Ibid. 28 Benedek, 551. 29 Wiesner-Hanks, “Ordinance Regulating Midwives, Germany 1522,” in Early Modern Europe 1450-1789 (Cambridge: Cambridge University Press, 2012).

During the late seventeenth century, male midwives were put at a great disadvantage due to what Howard Haggard calls the “height of prudery.”30 During this period, many female patients requested that their male midwives perform their tasks blind. The woman would be covered in a blanket from the waist down, and the male midwife would have to drape the blanket over his shoulders or fasten it behind his neck. He would then have his arms under the blanket, performing his manipulations blindly, while his head remained above the blanket.31 This greatly impeded a male midwife’s work, because he was unable to see what was happening during the birth or what he himself was doing.

Benedek adds another element to Haggard’s “height of prudery,” stating that it was taboo for men to examine women’s genitals and, in some cases, prohibited.32 This was a cause for concern because a midwife needed to examine her patient’s genitals to ensure the baby was correctly positioned in the birth canal. Conducting the procedure blindly greatly reduced the care and effectiveness of the male midwife. A female midwife would have been the smarter choice, even if she lacked the formal medical knowledge her male counterpart possessed, because she could see how the labour was progressing.

While a woman may have been the more appropriate choice of midwife in the early modern era, the question of her education and training was a cause for concern for some couples. There were, however, ways in which women were taught the art of midwifery without secular control dictating the length of their training. Patricia Crawford and Sara Mendelson have explained that a midwife’s training was informal, fulfilled through unofficial apprenticeships with friends and family; in some cases, this informal training could last several years under the supervision of a licensed or senior midwife.33 Women learned from older midwives, who were in many cases their mothers, revealing that this was an occupation passed down through families. Apprentices would learn how to verify the degree of cervical dilation and the correct fetal position, as well as how to use oils to facilitate birth.34 They were also trained to care for the baby after it was born by freeing its respiratory tract of mucus and, finally, washing and swaddling it.35 All of this knowledge came from hands-on training, which in many cases would have been more significant and useful than a lecture on the process.

30 Haggard, 47. 31 Ibid. 32 Benedek, 551.

A more unusual method of learning midwifery was through handbooks written by other midwives. Most peasant women were still illiterate, so only literate women, largely of the nobility, would have been able to read these handbooks. One noblewoman, Catharina van Schrader, was a professional midwife who, in her eighties, wrote a book comprising the most difficult cases of her career.36

Each case described a difficulty during birth, most commonly a child who was stuck or presented in a difficult position, and in each case Schrader explained how she saved the mother and how she delivered or removed the baby from the birth canal.37 Although most women went through informal apprenticeships, in some rare cases women could learn about the profession through the written works of other literate midwives.

33 Mendelson and Crawford, 319. 34 Donatella Lippi, "Witchcraft, Medicine and Society in Early Modern Europe," Archives of the History & Philosophy of Medicine / Archiwum Historii i Filozofii Medycyny 75, no. 1 (2012): 72. 35 Ibid. 36 Wiesner-Hanks, “Memoirs of the midwife Catharina van Schrader, Netherlands 1734,” in Early Modern Europe 1450-1789 (Cambridge: Cambridge University Press, 2012). 37 Ibid.

The informal training midwives received from friends and family was a cause for concern for secular authorities. City councils across Europe enacted regulations to control the training and education women received to become midwives. As Benedek states, “Much of our limited knowledge of midwifery in the 15th and 16th centuries is based on the regulations that were enacted to govern its practice.”38 The type of education and training these women received can be found almost exclusively in official city documents, which regulated how midwifery was conducted in each city. One such document is a 1522 ordinance regulating the practice of midwifery in the southern German city of Nuremberg. The ordinance required the female midwives of Nuremberg to swear an oath, binding them to follow its regulations in order to continue practicing medicine and keep the title of midwife.

Under the oath were six rules that the female midwives were required to follow, though only a few pertained to their training. The first rule states that “no midwife should send a maid to a woman having her first child unless she had completed one year of her training program.”39 This section of the oath reveals not only that midwives received informal apprenticeships from senior midwives, but also that those apprenticeships had a prescribed duration. The city council set the apprenticeship at one full year; the midwives, however, requested that it be changed to “only the first quarter year.”40 The council, consisting solely of men, undermined the female midwives’ authority by not taking their expert opinion into consideration. It chose not to entertain the idea of a quarter-year’s training because it felt this was too short a period before letting the apprentices (the maids) practice midwifery on their own. Since the senior midwives conducting the training had asked for the length to be set at a quarter year, their authority should have been respected: they possessed far more knowledge of the practice of midwifery than the city council did.

38 Benedek, 553. 39 Wiesner-Hanks, “Ordinance Regulating Midwives, Germany 1522.” 40 Ibid.

The fifth rule in the ordinance regulated who was allowed to be a midwife and what channels a senior midwife had to go through to take on an apprentice. By regulating entry into midwifery, the city council assumed authority over the profession and controlled the way it was governed. The rule states:

No midwife is to take on a maid-apprentice without the knowledge of the overseer of midwifery. No maid-apprentice is to be accepted who is married or has her own household, but those that are single or widowed, so that these persons are not called away from their instructors to their private business or housework, and will always be available.41

The council forbade married women from practicing midwifery because it believed a wife’s primary role was to take care of her husband and raise her children. Single or widowed women were permitted to become midwives because they had only themselves to care for and therefore had no more pressing commitments to tend to. During this time, the church pushed for midwifery to be a charity among women, for women, which contradicted the council’s motives, since the council did not allow all women to participate.42 As much as the council wanted to wield control and deny the women agency, it ironically revealed that it considered midwifery a valid profession: by requiring that midwives be single and “always be available,”43 the council undermined the church’s call for charity and instead sought to regulate the profession to ensure proficiency.

41 Ibid. 42 Madigan, 314-315.

One main element of this document is its mention of an overseer of midwives. Whether this position existed before the ordinance or was newly created drastically changes the interpretation that can be taken from it. If the position was already in place, the most senior midwife likely held it, and the profession would have had a female hierarchy governing its own community. However, if the position was created after the instatement of the document, a man would most likely have held it, further undermining the midwives’ authority by preventing them from governing their own community. Unfortunately, the document does not specify. Mendelson and Crawford, however, have discussed the challenges made to female authority, arguing that “controlling birth was too important to be left in female hands.”44 Taking this into consideration, the position was most likely created after the release of the ordinance and held by a male midwife with formal medical knowledge who could oversee the operation. The council also stated that if a midwife wished to switch her trainer, she had to provide a “justifiable and legitimate cause for leaving that may be proven to the council or to those appointed by it. In such cases, the apprentice will not be forbidden to complete her training years with another sworn midwife.”45 Such control meant the women were unable to regulate their own profession, having to plead with the council to make any changes.

The city council further asserted its authority over the profession of midwifery in Nuremberg by enacting a law which stated, “If any midwives show themselves to be disobedient or disagreeable, the city council will not only remove them from their office, but will also punish them severely, so that all will know to shape up and watch their behaviour.”46 The council was explicit that it would be informed whenever a midwife did not conform to the regulations. As Benedek explains, ordinances like these were created out of a growing concern for public health, specifically the high mortality rate among women and their babies.47 By creating regulations to ensure the midwives were effective, the council showed that it did not consider the community of midwives competent at their jobs. By enforcing these regulations without fully consulting the midwives, however, the council prevented the female medical community from governing its own members without being subordinate to the all-male council.

43 Wiesner-Hanks, “Ordinance Regulating Midwives, Germany 1522.” 44 Mendelson and Crawford, 316. 45 Wiesner-Hanks, “Ordinance Regulating Midwives, Germany 1522.”

Secular control was not the only problem midwives faced when practicing medicine. Ecclesiastic authorities asserted their power over the profession of midwifery by devaluing it as a skilled medical profession and instead portraying it as charity, in which women were called to aid each other in the name of God.48 In an era increasingly dependent on the approval and licensing of professionals, female midwives felt the pressure of obtaining a licence to continue their occupation. Bishops were almost the sole providers of midwifery licences to women.49 This was done not out of a belief that women were capable of running their own medical practice, but out of the belief that it was a woman’s duty to perform these charitable acts in order to conform to the church.50 Many women realized they needed the church’s approval in order to continue working as midwives.

46 Ibid. 47 Benedek, 550. 48 Samuel Kline Cohn, Cultures of Plague: Medical Thinking at the End of the Renaissance (New York: Oxford University Press, 2010), 228. 49 Mendelson and Crawford, 284.

Women like Carberinerin explicitly stated that their work was charity in order to obtain a licence and continue their practice without ridicule. Yet by admitting that their work was done “out of Christian love and charity,”51 these women weakened their own professional identity. Consequently, female midwifery was seen more as a charity than as an actual profession.

As much as the church helped women continue their occupation by granting them midwifery licences, it also disgraced female midwives for performing abortions, which went against the beliefs of the church.52 One of the Ten Commandments states “thou shalt not kill,” which made it easy for the church to paint the female midwife as an ‘unholy murderer.’53 On 29 October 1588 in Rome, Pope Sixtus V enacted a special piece of legislation for the Roman Catholic Church deeming all acts of abortion homicide, to be punished as such.54

Eventually fear spread across Europe, and many midwives faced accusations of witchcraft. As Donatella Lippi explains, midwives, like healers, earned the people’s respect while also sparking fear in society, since many believed “that knowing how to cure also meant knowing how to kill.”55 Midwives possessed the power to bring a baby into the world or to take it out of the world. Even when no abortion had been performed, midwives were still blamed for the death of a child because of the guilt mothers felt at being unable to birth a healthy baby. Lippi states that the parents accused the midwife “of witchcraft or of having killed the baby in order to offer it to the devil.”56

50 Ibid, 314. 51 Wiesner-Hanks, “Woman’s petition to be allowed to practice medicine, Germany sixteenth century.” 52 John Christopoulos, "Abortion and the Confessional in Counter-Reformation Italy," Renaissance Quarterly 65, no. 2 (2012): 477. 53 Ibid, 465. 54 Ibid, 465. 55 Lippi, 69.

Facing accusations of witchcraft, midwives had to keep proving themselves as professionals despite being disgraced by the church. As Lippi argues, “[it was] unthinkable to give birth without the help of a ‘wise woman,’ who was an expert in gynecology even though she was suspected of messing around with the devil.”57 Even though this deeply religious society feared the supposed immorality of midwives, having one present at a birth was crucial in guiding the mother through her labour. Accusations of witchcraft and questions of morality thus joined the countless issues female midwives had to contend with while carving their own professional path in a male-dominated medical field. Women nevertheless continued to answer the call for midwives, passing their knowledge down to their daughters and young apprentices.

Women in the early modern era faced numerous challenges when pursuing a career in the medical field. Their gender and status in society hindered their attempts to have the occupation of female midwife recognized as a profession. As Mendelson and Crawford state, “[m]idwives attempted to sustain women’s control over their own physiology, but their efforts were undermined by the rise of the man-midwife, part of the general movement towards male medical professionalism.”58 Despite having natural knowledge of the female anatomy, female midwives remained subordinate to their male counterparts due to the rise of all-male universities and formal apprenticeships. Men in the medical field feared the idea of women forming a professional group and resented the idea of an organized female unit.59 Many guilds called for regulations to subdue the rising female presence in the professional community. Alongside the rise of secular control over midwives, ecclesiastical powers also set out to undermine the authority of female midwives by converting the profession into an act of charity. With secular and ecclesiastic control shaping the perception and portrayal of midwifery, midwives needed to persevere in presenting their occupation as a profession and to demonstrate their knowledge of the female anatomy in order to thrive in the medical field.

56 Ibid, 71. 57 Ibid, 70. 58 Mendelson and Crawford, 435.

59 Ibid, 316.

Bibliography

Benedek, Thomas G. "The Changing Relationship Between Midwives and Physicians During the Renaissance." Bulletin of the History of Medicine 51, no. 4 (1977): 550-564.

Biow, Douglas. Doctors, Ambassadors, Secretaries: Humanism and Professions in Renaissance Italy. Chicago: The University of Chicago Press, 2002.

Christopoulos, John. "Abortion and the Confessional in Counter-Reformation Italy." Renaissance Quarterly 65, no. 2 (2012): 443-484.

Cohn, Samuel Kline. Cultures of Plague: Medical Thinking at the End of the Renaissance. New York: Oxford University Press, 2010.

Gordon, Benjamin Lee. Medieval and Renaissance Medicine. New York: Philosophical Library, 1959.

Haggard, Howard Wilcox. Devils, Drugs, and Doctors: The Story of the Science of Healing from Medicine-Man to Doctor. Chicago: Norman Press, 2012.

Lippi, Donatella. "Witchcraft, Medicine and Society in Early Modern Europe." Archives of the History & Philosophy of Medicine / Archiwum Historii i Filozofii Medycyny 75, no. 1 (2012): 68-73.

Madigan, Kevin. Medieval Christianity: A New History. New Haven: Yale University Press, 2015.

Mendelson, Sara, and Patricia Crawford. Women in Early Modern England, 1550-1720. New York: Clarendon Press, 1998.

Wiesner-Hanks, Merry. Early Modern Europe 1450-1789. Cambridge: Cambridge University Press, 2012.

The United States and Sputnik: a Reassessment of Dwight D. Eisenhower's Presidential Legacy

By: Matthew Bologna

Dwight D. Eisenhower was an extraordinary serviceman adored by his country for his accomplishments in military and civilian life. As a four-star general, Eisenhower led the United States through the European Theatre of the Second World War, having coordinated the liberation of France in 1944 and spearheaded the Western Allies’ advance into Germany in 1945. After the war, Eisenhower served as the first Supreme Allied Commander of the North Atlantic Treaty Organization (NATO), the military governor of the American occupation zone in Germany, and later as president of Columbia University. Indeed, Eisenhower’s life was nothing short of exceptional.

Yet Eisenhower’s tenure as President of the United States from 1953 to 1961 failed to arouse the same fervour of admiration generated by Eisenhower ‘the General.’ In fact, for much of his post-presidency life Eisenhower endured a barrage of criticism from academics and journalists alike. The stereotypical characterization of Eisenhower was that of a do-nothing president whose ignorance and complacency tarnished the prestige of the executive office. In his 1958 publication Eisenhower: Captive Hero, journalist Marquis Childs chastised the former president’s political inexperience: Eisenhower had “no understanding of the uses of patronage and power,” and surrounded himself with equally simplistic and unimaginative Cabinet members.1

Likewise, Emmet J. Hughes—a former speechwriter for Eisenhower—lambasted Eisenhower for his passive style of leadership. In his 1963 publication entitled Ordeal of Power, Hughes argued that the 1950s was a “lost decade” because of Eisenhower’s “disdainful aloofness from aggressive politics, an aversion to rough political combat, and to the President’s basic assumption that many heads are better than one – especially one’s humble own.”2 Adding insult to injury, a poll conducted in 1962 by Arthur Schlesinger Jr. among American historians assessing presidential performance ranked Eisenhower twentieth out of thirty-five presidents, tied with Chester A. Arthur and behind Benjamin Harrison.3

1 Anthony James Joes, “Eisenhower Revisionism and American Politics,” in Dwight D. Eisenhower: Soldier, President, Statesmen, ed. Joann P. Krieg (New York: Greenwood Press, 1987), 283.

By the 1980s, however, both popular and scholarly opinion of President Eisenhower experienced a remarkable reversal in assessment. A combination of disastrous foreign policy pursuits (e.g. the Vietnam War) and declining economic growth produced a nostalgic yearning for the peace and prosperity that defined Eisenhower’s two terms as president. Furthermore, the declassification of National Security Council archives and the publication of Eisenhower’s memoirs precipitated a shift in scholarly opinion of President Eisenhower. In his 1982 biography of Eisenhower, political scientist Fred Greenstein praises the former president as “politically astute and informed, actively engaged in putting his personal stamp on public policy, and [who] applied a carefully thought-out conception of leadership to the conduct of his presidency.”4 Eisenhower’s “hidden-hand” style of leadership, namely his use of cabinet ministers as “lightning rods” for Cold War policy, enabled the president to preserve his popularity and credibility with the American public.5 In Eisenhower and the Cold War, historian Robert Divine applauds Eisenhower for using his knowledge and expertise in foreign policy to keep the United States out of war for eight years.6 The essence of Eisenhower’s strength, Divine claims, “lies in his admirable self-restraint.”7 An additional poll conducted by Schlesinger Jr. in 1982 placed Eisenhower eighth out of forty presidents.8

2 Ibid, 285. 3 Ibid, 283. 4 Mary S. McAuliffe, “Commentary/Eisenhower, the President,” Journal of American History 68, no. 3 (1981): 627. 5 Stephen G. Rabe, “Eisenhower Revisionism: A Decade of Scholarship,” Journal of Diplomatic History 17, no. 1 (Winter 1993): 114.

President Eisenhower’s response to the launch of the Soviet satellite Sputnik on 4 October 1957 is a testament to his capabilities as president. Unfortunately, the Sputnik Crisis is an overlooked episode of the Eisenhower administration and of American history. The available literature on Eisenhower either minimizes the Sputnik Crisis in order to focus on more memorable domestic crises of the period, namely the Little Rock Crisis of 1957, or ignores Sputnik altogether. Jim Newton, for example, devotes only three of 350 pages to Sputnik in his 2011 biography of Eisenhower, while John Logsdon’s 1997 publication Spaceflight and the Myth of Presidential Leadership focuses solely on the Kennedy administration.9

Eisenhower showed himself to be a proactive and attentive president who responded rationally and intelligently to Sputnik. Upon receiving word of the successful launch of the Soviet satellite in October 1957, Eisenhower surrounded himself with scientists, engineers, and academics in the President’s Science Advisory Committee (PSAC) in order to assess Sputnik’s implications for American national security and to develop appropriate policy responses that would reassert American resolve in the Cold War. As such, Eisenhower and the PSAC accelerated the American intercontinental ballistic missile (ICBM) and satellite programs to end the Soviet monopoly in space, and established the National Aeronautics and Space Administration (NASA) as a civilian agency to conduct further civilian pursuits in space. To eliminate inter-departmental competition over resources, Eisenhower augmented the Defense Secretary’s authority over financial allocations to the Army, Navy, and Air Force, and created a director of research and development to eliminate duplication amongst the armed forces. On the domestic front, the administration provided moderate infusions of federal funding into post-secondary education via the National Defense Education Act (NDEA) to stimulate student enthusiasm for science and engineering. Despite facing a Democrat-controlled Congress, the Republican Eisenhower succeeded in securing most of his policy goals. Indeed, an examination of Eisenhower’s responses to Sputnik contributes to the scholarly re-evaluation of his legacy and cements his place in historiography as an impactful and effective president.

6 Joes, 290. 7 Ibid, 294. 8 James T. Patterson, Grand Expectations: The United States, 1945-1974 (New York: Oxford University Press, 1997), 243. 9 Mark Shanahan, Eisenhower at the Dawn of the Space Age: Sputnik, Rockets, and Helping Hands (Lanham MD: Lexington Books, 2017), 20-25.

At approximately 7:30 p.m. on 4 October 1957, the Soviet Union fired an R-7 ICBM carrying the world’s first manmade satellite, Sputnik, into orbit.10 Neither Eisenhower nor his administration expressed significant alarm over the launch of the Soviet satellite. Was Sputnik truly unexpected? The National Security Council (NSC) had known since 1955 that the Soviet Union intended to launch a satellite in accordance with the International Geophysical Year (IGY), a period from 1 July 1957 to 31 December 1958 in which the International Council of Scientific Unions (ICSU) recommended that governments develop and launch earth satellites in the interests of global science.11 In November 1956, the NSC even predicted that the Soviet Union would launch a satellite before the United States, but concluded that the American satellite, dubbed Project Vanguard, would contribute more to science because of its superior scientific network.12 The NSC also asserted that satellites themselves did not constitute an active military threat, Sputnik notwithstanding.13 Indeed, Sputnik did not indicate to the United States that the Soviet Union had achieved superiority. By the autumn of 1957, the American intercontinental (ICBM) and intermediate-range (IRBM) ballistic missiles neared completion, while U-2 reconnaissance flights over the Soviet Union confirmed that the United States possessed an enormous lead in missile technology.14 Eisenhower—who was not even in Washington at the time of Sputnik’s launch—thus had little reason to believe that Sputnik threatened national security: “Sputnik does not rouse my apprehensions, not one iota. […] They [the Soviets] have put one small ball into the air.”15

10 Yanek Mieczkowski, Eisenhower’s Sputnik Moment: The Race for Space and World Prestige (Ithaca: Cornell University Press, 2013), 12.

Sputnik may not have posed a military threat, but the “small ball” produced mass hysteria throughout the media and Congress. The editors of the New Republic compared Sputnik to the discovery of the New World and feared that it signalled the Soviet Union’s ascendancy as a leading technological and scientific power.16 Likewise, the Chicago Daily Tribune feared the military implications of Sputnik, warning, “[if] the Soviets could deliver a 184-pound ‘moon’ into a predetermined pattern 560 miles out into space, the day is not far distant when they could deliver a death-dealing warhead into a predetermined target almost anywhere on the earth’s surface.”17 Meanwhile, the Senate Majority Leader, Democratic Senator Lyndon Johnson, ordered a congressional inquiry into the satellite and missile programs of the Eisenhower administration, hoping to “blast the Republicans out of the water.”18 For several days politicians laboured over the alleged causes of the United States’ humiliating defeat in the satellite race, which ranged from inter-service rivalry, administrative complacency, and an inferior education system to an ineffective White House presided over by a semi-retired golfer [Eisenhower].19 Astonishingly, popular attitudes did not reflect similar apprehensions about Sputnik. A Gallup poll conducted between 11 and 14 October found that half of the sample considered the Soviet satellite a “serious blow to U.S. prestige,” but a surprising 61% of respondents believed that satellites would improve rather than endanger humanity.20

11 “National Security Council Report 5520: Note by the Executive Secretary to the National Security Council on U.S. Scientific Satellite Program,” Foreign Relations of the United States, 1955-1957, United Nations and General International Matters, Volume XI, 20 May 1955. Accessed on 17 October 2017. 12 Shanahan, 47. 13 “National Security Council Report 5520, 20 May 1955.” 14 Shanahan, 60. 15 Walter A. McDougall, …The Heavens and the Earth: A Political History of the Space Age (Baltimore: The Johns Hopkins University Press, 1997), 146. 16 Ibid.

The importance of psychology for national security was certainly not foreign to Eisenhower. He had personally supervised the psychological components of the African campaign in 1942 and the Normandy campaign in 1944.21 He even once described the Cold War as a “struggle of ideas” in which the United States and the Soviet Union fought over the hearts and minds of domestic and foreign audiences.22 The fear of international communism and internal subversion could exert pressure on Washington to substantially increase the military budget and intervene in peripheral wars for security. As a fiscal conservative, Eisenhower wanted to avoid crash spending of any kind. Indeed, the most pressing issue facing his administration in the aftermath of Sputnik was primarily psychological: “the first [problem created by Sputnik] was to find ways of affording perspective to our people and so relieve the current wave of near-hysteria.”23 Eisenhower’s actions in the ensuing months reflected his desire to relieve the nation of the feelings of inferiority that Sputnik had generated.

17 Shanahan, 70. 18 McDougall, 149. 19 Ibid. 20 Ibid, 144. 21 Kenneth A. Osgood, “Form Before Substance: Eisenhower’s Commitment to Psychological Warfare and Negotiations with the Enemy,” Diplomatic History 24, no. 3 (Summer 2000): 410. 22 Ibid, 412.

Throughout October Eisenhower consulted with members of the Science Advisory Committee (SAC) to assess the severity of Sputnik and the appropriate administrative responses thereto. Detlev Bronk, the head of the National Academy of Sciences, cautioned against pursuing crash programmes, noting, “we can’t always go changing our program in reaction to everything the Russians do.”24 Likewise, Nobel laureate Isidor Isaac Rabi told Eisenhower that Sputnik provided the Soviet Union with tremendous momentum and that unless the administration adopted vigorous action the United States would likely fall behind within two to three decades.25 Rabi also pointed out that Eisenhower lacked a scientific advisor, someone to provide the president with a scientific perspective and suggest policy changes accordingly.26 Edwin H. Land, scientist and co-founder of the Polaroid Corporation, recommended that the administration find ways to invigorate an enthusiasm for science in schools.27

23 Dwight D. Eisenhower, The White House Years: Waging Peace, 1956-1961 (Garden City, NY: Doubleday, 1965), 211. 24 Andrew Jackson Goodpaster, “Memorandum of Conference with the President, on October 8, 1957, 5:00 p.m.,” Dwight D. Eisenhower Presidential Library, “Sputnik and the Space Race.” 25 Andrew Jackson Goodpaster, “Memorandum of Conference with the President, on American Science Education and Sputnik, October 15, 1957, 11 AM,” Dwight D. Eisenhower Presidential Library, “Sputnik and the Space Race.” 26 Ibid. 27 Ibid.

With these suggestions in mind, Eisenhower unveiled his administration’s offensive in two nationwide radio and television addresses. On 7 November 1957, Eisenhower announced the creation of the office of Special Assistant to the President for Science and Technology to advise the president on all matters related to science, and nominated James R. Killian Jr., president of the Massachusetts Institute of Technology (MIT), for the post.28 Eisenhower also expressed his commitment to reforming the Department of Defense to eliminate inter-service rivalry and duplication and to provide the Secretary of Defense with absolute authority in directing guided missile development.29 During his second address on 13 November, Eisenhower advocated a system of nationwide science testing in high schools, incentives for students to pursue scientific or professional studies, programs to stimulate quality teaching of mathematics and science, provisions for greater laboratory facilities, and measures to increase the output of qualified teachers. Yet he also stressed the importance of patience, noting that it took time for an idea to become an accomplishment and for a student to become a scientist.30 Indeed, as Eisenhower pointed out to an audience in Oklahoma City, the United States needed “not only Einsteins and Steinmetzes, but Washingtons and Emersons,” people of good character and sound mind who were impervious to communist propaganda.31 By the end of 1957, Eisenhower had reconstituted the SAC into his own presidential committee (PSAC) composed of eighteen notable scientists, academics, and engineers, including Killian, Bronk, Land, Rabi, Director of the National Advisory Committee for Aeronautics (NACA) Hugh L. Dryden, Director of the National Science Foundation (NSF) Alan T. Waterman, and general and aeronautical engineer James H. Doolittle.32

28 Dwight D. Eisenhower, “Text of address by the President delivered from the Oval Office in the White House on ‘Science in National Security,’” November 7, 1957, Dwight D. Eisenhower Presidential Library, “Sputnik and the Space Race.” 29 “Text of address on ‘Our Future Security’ delivered by the President in Oklahoma City,” November 13, 1957, Dwight D. Eisenhower Presidential Library, “Sputnik and the Space Race.” 30 Ibid. 31 Ibid.

During his State of the Union Address in 1958, Eisenhower requested from Congress an additional $1.3 billion in the 1958 budget, with more than half going to increased missile development and production.33 The president justified his request with the hope that "this [increase] expresses the way the American people will want to respond to the promises and dangers of the dawning age of space conquest." But before the administration could proceed, the United States needed to successfully launch a satellite into space. The official American satellite project, the Navy's Project Vanguard, exploded only a few seconds after takeoff on 6 December 1957, embarrassing the nation.34 On 3 January 1958, Killian forwarded a memorandum to Eisenhower, suggesting that the Army's Jupiter-C rocket and Explorer satellite offered substantially greater chances of success than the Navy's Project Vanguard, and thus recommended that the administration stop developing Project Vanguard for the IGY and instead switch over to the Explorer.35 During an NSC meeting on 24 January 1958, Eisenhower instructed his Secretary of Defense Neil McElroy to give Jupiter-C the same priority as Project Vanguard and to accelerate the production of the Thor, Jupiter, Polaris, and Atlas IRBMs.36 A scheduled launch on 29 January saw Explorer blast off without incident, reaching orbit just before midnight on 1 February.37 The United States now had a satellite in space. In comparison to Sputnik, Explorer carried much more sophisticated scientific equipment, including the capability of measuring cosmic rays beyond the atmosphere and sending the data back to earth via two radio transmitters.38 Equipped with this powerful technology, Explorer discovered the Van Allen radiation belts shortly after its launch in early 1958.39

32 James Rhyne Killian Jr., Sputnik, Scientists, and Eisenhower: A Memoir of the First Special Assistant to the President for Science and Technology (Cambridge, Massachusetts: The Massachusetts Institute of Technology Press, 1977), 107 and 278. 33 Divine, 82. 34 Ibid., 71. 35 Killian Jr., 122-123. 36 "National Security Council Action No. 1846, approved January 24, 1958," in "Intercontinental Ballistic Missile (ICBM) and Intermediate Range Ballistic Missile (IRBM) Programs," Central Intelligence Agency, Electronic Reading Room, accessed 24 October 2017.

On 4 February 1958, Eisenhower appointed a PSAC panel led by Nobel laureate Edward Purcell to recommend the outlines of a space program and organization for the United States.40 Until then, the National Advisory Committee for Aeronautics (NACA) was the closest thing the United States had to a space agency. Created in 1915 under President Woodrow Wilson, NACA had worked on aircraft and missiles for four decades.41 But NACA's influence declined by the late 1950s as the armed forces took rocket research away from it. Each service pushed for independent space and satellite projects. In December 1957, the Army submitted a fifteen-year space program that forecasted lunar reconnaissance and two-man satellites by 1962, manned lunar circumnavigations by 1963, and a fifty-man moon base by 1971.42 In contrast, the Air Force worked towards manned spaceflight with the X-15 program, while the Navy lobbied for the use of satellites for navigation, weather, and fleet communication.43 During Senate hearings in January of 1958, each service tried to convince the administration of its own capability in space by calling for recognition of its respective skills and contributions to the military.44 It seemed as though space policy would be the next victim of inter-service rivalry.

37 Divine, 86. 38 Ibid., 96. 39 Ibid. 40 Killian Jr., 122. 41 Mieczkowski, 171. 42 McDougall, 166. 43 Ibid.

In March of 1958, the PSAC released its essay, "Introduction to Outer Space," which outlined four reasons why space technology and exploration were important: man's compelling urge to explore and to discover, military security, national prestige, and science. Eisenhower received the paper with great enthusiasm, proclaiming, "I found this to be so informative and interesting that I wish to share it with all the people of America, and indeed with all the people of the earth," and urged the media to distribute the essay widely throughout the country.45 The PSAC discounted most calls from the armed services for crash space programs, but conceded the military importance of reconnaissance, meteorology, and communications. These military functions, however, raised questions of international law, such as where space began, the legality of over-flight, and the regulation of space vehicles.46 These concerns suggested the wisdom of a civilian agency free from bureaucracy, free to draw talent from inside and outside the government, and possessing contractual powers in the private sector.47

But who would have control over space exploration for the United States? A new space agency would take time to organize and would require legislative approval from Congress; NACA remained bound by the civil service; and the new Advanced Research Projects Agency's ties to the Army would give spaceflight a military rather than civilian character.48 The PSAC report thus favoured the establishment of a new space exploration agency by legislation and concluded that the major goals of spaceflight were scientific and political, not military: "the psychological impact of the Russian satellites suggests that the U.S. cannot afford to have a dangerous rival outdo it in a field which has so firmly caught, and is likely to continue to hold, the imagination of mankind."49 The new American organization would leave military satellites to the Pentagon, but otherwise would be lodged in a transparent civilian agency as a contrast to Soviet secrecy.50 NACA was the preferred choice because of its experience as a research organization, but it was too small and lacked access to the rocket and space engineers available in the Army, Navy, and Air Force. As such, the PSAC report recommended that NACA's basic laws be amended to tap into military resources, provide for a single director appointed by the President, free NACA from the civil service, coordinate with the Department of Defense, and permit contracts with private industry.51

44 Ibid. 45 Killian Jr., 124; Mieczkowski, 152. 46 McDougall, 172. 47 Ibid. 48 Ibid., 171.

Eisenhower instructed Killian to draft a bill by 27 March 1958, hoping to avoid delay and recommendations for drastic changes by the Department of Defense.52 Entitled the National Aeronautics and Space Act of 1958, the legislation stressed that the purposes of space activities included the expansion of human knowledge, the improvement of aircraft and space vehicles, the preservation of the United States' leadership in science, the promotion of cooperation with other nations, and the utilization of American scientific and engineering resources for exploration.53 The Act's most important provisions, however, were the establishment of the National Aeronautics and Space Agency (NASA) as an independent office of the government to replace the ailing NACA governing board and the chartering of two parallel space programs with responsibilities split between NASA and the Department of Defense: the former devoted to research and the civilian application of science, the latter to military applications.54

49 Ibid. 50 Ibid. 51 Ibid., 172. 52 Killian Jr., 135. 53 McDougall, 172.

The National Aeronautics and Space Act encountered stiff resistance in the Democrat-controlled Congress. Many felt that "agency" sounded too weak for an organization with enormous responsibilities in science. Eilene Galloway, a prominent researcher, suggested changing "agency" to "administration" and appointing an administrator to head the new organization.55 The bill found support in the House of Representatives, but Senate Majority Leader Johnson wanted to ensure that the bill did not impose limitations on the Department of Defense's fulfilment of military needs in space.56 Johnson submitted a revised version of the bill that created a National Aeronautics and Space Council (NASC), a powerful policy-making and coordinating group to advise the administrator of NASA.57 Both Eisenhower and Killian opposed the suggestion because the committee would be too powerful, would be composed of cabinet officers, and could try to dictate policy to the president.58 Eisenhower broke the Congressional deadlock by inviting Johnson to the White House on 7 July 1958 to settle their differences. Eisenhower worried that the NASC placed too many demands on the president, but Johnson allayed Eisenhower's concerns with the suggestion that he [Eisenhower] chair the proposed Space Council.59 Eisenhower agreed with Johnson's proposal. In the ensuing compromise the Space Council would be modelled after the NSC, with the president as chairman, while the Council itself would include nine members, three of them from outside the government.60 The final Senate conference bill agreed to give NASA control over aeronautics and space activities sponsored by the American government except for defense and military operations, which would be preserved in the Department of Defense.61 Eisenhower signed the bill on 29 July 1958. On 1 October 1958, NACA disappeared and re-emerged as the National Aeronautics and Space Administration.62 By the end of 1960, NASA had launched thirty-six earth satellites and four deep space probes, whereas the Soviet Union had only one satellite orbiting the earth and one orbiting the sun.63

54 Ibid. 55 Mieczkowski, 172. 56 Killian Jr., 136. 57 Ibid. 58 Ibid. 59 Divine, 147.

For many observers, the lengthy time it took for the United States to develop a satellite indicated the extent to which inter-service rivalry within the Department of Defense hampered national security initiatives. As a 1957 issue of U.S. News and World Report expressed it, "many blame the U.S. missile lag on arguments over which service should develop which missile."64 Since the announcement of the American satellite program in 1955, the Army, Air Force, and Navy had conducted their own independent rocketry research, and each service wanted pre-eminence in developing rocketry.65 An ideal solution for the satellite program was to marry the Army's Redstone rocket to the Navy's scientific instruments. Doing so would have allowed the United States to launch a satellite sooner, but Army-Navy rivalry nullified this proposition.66 Despite the administration's designation of the Navy's Project Vanguard as the United States' first satellite project, the development of separate ICBMs, IRBMs, and satellite projects continued in the Army and the Air Force, leading to duplication and waste.

60 Ibid. 61 Ibid. 62 McDougall, 176. 63 Eisenhower, 260. 64 Ibid., 87. 65 Mieczkowski, 45. 66 Ibid.

The reorganization of the Pentagon and the Department of Defense was always on Eisenhower's presidential agenda. Eisenhower often complained to Defense Secretary McElroy and Deputy Secretary Donald Quarles that the burdensome and chaotic organization of the Pentagon hampered the Defense Department's ability to respond to military threats:

The Joint Chiefs of Staff [uniformed military advisors to the president, the Defense Department, and the NSC] as it now exists is too complicated to work in warfare when minutes will be as precious as months have been in the past. Readiness for anticipated emergency demands that the peacetime organization be made so simple and clear that decision and control are free of delays and obstructions.67

Eisenhower frequently derided the Joint Chiefs for their obsession with missiles and for acting as spokesmen for their own branches rather than providing the Secretary of Defense with concrete advice and recommendations for action.68 As per his defense policy, the "New Look," Eisenhower sought the reorganization of the Department of Defense to strengthen civilian control in the Pentagon, eliminate cumbersome boards and committees, provide mechanisms for better strategic planning, and strengthen the position of the Secretary of Defense vis-à-vis the Joint Chiefs of Staff.69 Eisenhower did not push the issue further because he did not want to arouse the opposition of the Pentagon before the 1956 presidential election.

The fiasco over Sputnik provided Eisenhower with the ideal opportunity to pursue his long-awaited reorganization plans for the Pentagon. During a meeting with Secretary of Defense McElroy, Eisenhower outlined three primary considerations that he expected to see in the reorganization plan. The first called for unified commands that provided the Secretary of Defense with full control over the Army, Navy, and Air Force. The second called for "fiscal flexibility" that would give the Secretary of Defense the power to allocate funds to the three services directly rather than through Congress. The third proposed the creation of a new position of Director of Defense Research and Engineering, a nationally recognized leader in science to supervise all weapons, engineering, and scientific developments within the Department of Defense in order to eliminate duplication and waste.70 On 1 April 1958, McElroy instructed Charles A. Coolidge, head of an advisory group on organizational affairs in the Pentagon, to submit the Department of Defense recommendations to Congress.71 The House of Representatives, however, opposed the reorganization plan because the second clause on "fiscal flexibility" violated the power of the purse afforded to Congress by the American Constitution.72 Eisenhower responded with a letter to Congress dated 3 April 1958, in which he justified his reorganization on the grounds that atomic warfare required efficiency and expedient decision-making. Eisenhower altered his proposal to ask only that the Secretary of Defense be given limited authority in the allocation of funds between the services in special cases, but the House stood firm in its opposition.73 Democratic Representative Carl Vinson of Georgia accused Eisenhower of attempting to establish a "Prussian-style" general staff, and reminded the president that "space ships, satellites, and guided missiles cannot abrogate the Constitution."74

67 Eisenhower, Waging Peace, 245. 68 Divine, 87. 69 Dwight D. Eisenhower, Mandate for Change, 1953-1956 (Garden City, NY: Doubleday & Company Inc., 1963), 447. 70 Divine, 87. 71 Ibid., 129. 72 Ibid., 131. 73 Ibid.

Undaunted by Congressional opposition, Eisenhower took the issue of defense reorganization public. After 20 April, the president devoted nearly every public address to the issue of defense reorganization in order to build public support, making speeches to the United States Chamber of Commerce, the Advertising Council, and at a Republican National Committee dinner honouring Republican members of Congress.75 In a far more ambitious undertaking, Eisenhower wrote to more than one hundred business executives of companies benefitting from defense contracts to bring pressure on Congress.76 The executives responded overwhelmingly in favour of Eisenhower's proposals and bombarded Congress and the armed service committees with letters expressing their endorsement of defense reorganization.77 Regardless, the Democrat-controlled House of Representatives continued to oppose the bill. On 22 May, the House Armed Services Committee submitted a draft bill that gave Congress the power to veto any attempt by the Secretary of Defense to shift combat functions within the Pentagon, allowed the Secretary of Defense to exercise authority only through the secretaries of each department, and permitted the Joint Chiefs to express their views directly to Congress, bypassing the defense secretary entirely.78 Eisenhower dismissed the third alteration as "legalized insubordination" of the defense secretary to the Joint Chiefs and Congress.79

74 Ibid.; Eisenhower, Waging Peace, 250. 75 Divine, 134. 76 Eisenhower, Waging Peace, 252. 77 Divine, 134-5. 78 Ibid., 137; Eisenhower, Waging Peace, 252. 79 Eisenhower, Waging Peace, 252.

The political atmosphere of the Senate proved much more receptive to Eisenhower's policy preferences. The key figure in the Senate, Democratic Senator Stuart Symington of Missouri, supported Eisenhower's bill, and even rejected the advances of Democratic senators to lead the opposition.80 Eisenhower also secured the support of Senate Minority Leader Republican William Knowland, who convinced Eisenhower to concede on Vinson's third amendment in order to persuade the Senate to remove the first two amendments from the bill.81 In particular, the Senate Committee recommended a change to the wording of the bill, from "separately administered" services to "separately organized," which removed the three services from the chain of command and allowed the defense secretary to issue orders without consulting each service, in return for the secretary's acceptance of a congressional veto over fund allocations.82 With the support of the Joint Chiefs and Secretary McElroy, the modified bill passed both Houses unanimously, and Eisenhower signed it into law on 6 August 1958.83 The final draft maintained the "legalized insubordination" provision and the Congressional veto over transfers of combat functions, but did provide the Secretary of Defense with the power to transfer, reassign, abolish, and/or consolidate functions to increase efficiency.84 In return for Eisenhower's concession on an enlarged Joint Staff, the bill provided the President with the authority to establish and control unified and/or specified commands for military missions via the Secretary of Defense and also established a Director of Defense Research and Engineering to eliminate waste and duplication amongst the various armed services.85

80 Divine, 139. 81 Ibid., 140. 82 Ibid., 141. 83 Ibid., 141-142. 84 Eisenhower, Waging Peace, 252-253.

The final, and perhaps most controversial, of Eisenhower's responses to Sputnik was the National Defense Education Act. For politicians, scientists, and the media, Sputnik exposed the scientific ineptitude of the American school system vis-à-vis the Soviet Union. American physicists in particular lamented a perceived "science gap" between the United States and the Soviet Union that benefited the latter, although the physicists exaggerated much of their concerns. Three studies conducted between 1955 and 1961 by the NSF, MIT, and the National Research Council indicated that the Soviet Union trained two to three times as many scientists and engineers as the United States, with 75% of graduating students in the Soviet Union majoring in science in comparison to 25% of American students.86 The Soviet Union also had higher numbers of graduates than the United States: 95,000 to 57,000.87 William Benton, publisher of Encyclopaedia Britannica, visited the Soviet Union and concluded that Soviet schools, libraries, and laboratories posed a greater threat to the United States than nuclear weapons.88 Director of the American Institute of Physics Elmer Hutchisson told Newsweek that "the entire American way of life could well be doomed to rapid extinction unless the nation's scientific reserves were expanded quickly."89 Worse yet, by the time of Sputnik a demographic time bomb threatened the American education system. The baby boom generation that entered the elementary school system in the late 1940s began to enter high school in the late 1950s and was projected to enter higher education by the early to mid-1960s.90 The perceived deficiencies of the American education system required immediate federal attention.

86 The two researchers involved, engineers and Soviet expatriates Nicholas DeWitt and Alexander Korol, inflated some of the statistics to highlight greater discrepancies between the American and Soviet educational systems in order to lobby the American government for greater science funding; David Kaiser, "The Physics of Spin: Sputnik Politics and American Physicists in the 1950s," Social Research: An International Quarterly 73, no. 4 (Winter 2006): 1231. 87 Ibid. 88 Mieczkowski, 173. 89 Kaiser, 1235.

As a fiscal conservative, Eisenhower generally opposed federal intrusion into the education system. But Eisenhower was not a rigid ideologue and often modified his stances on education when necessary. For example, during his first term Eisenhower authorized a four-year, $325 million program for federal aid to construct new schools, but the bill died in Congress.91 In 1956, Eisenhower appointed professor Lawrence G. Derthick as Commissioner of Education and head of the Office of Education. A subsequent Office of Education task force recommended the expansion and improvement of graduate programs to increase the number of college teachers, an improved guidance and counselling program to identify promising talent, and college-level programs to train technicians.92 Although these proposals were far too "radical" to receive official endorsement, they did indicate a growing awareness within the administration of the need for increased federal attention to education.93

After Sputnik, Eisenhower consulted with his colleagues on the PSAC to determine an appropriate response to the "science gap" between the United States and the Soviet Union. Dr. Land recommended that the administration find ways to inspire an equal vigour and enthusiasm for science amongst the nation's youth and the scientific community, who at present felt isolated.94 Dr. Rabi concurred, speculating that the Soviet Union "could pass us swiftly just as in a period of twenty to thirty years we caught up with Europe and left Western Europe behind."95 Likewise, Killian urged the modernization and invigoration of science education in public schools.96 Eisenhower agreed wholeheartedly. Like his advisors, the president hoped that the crisis of confidence in the American education system precipitated by Sputnik would encourage people to take an active interest in modern science.97

90 Wayne J. Urban, More Than Science and Sputnik: The National Defense Education Act of 1958 (Tuscaloosa, Alabama: University of Alabama Press, 2010), 75-76. 91 Mieczkowski, 159. 92 Ibid. 93 Ibid.

On several occasions Eisenhower invited Killian to meetings with his brother, Milton Eisenhower, president of Johns Hopkins, at the White House to discuss the various educational proposals. Killian recalls that in the discussions that followed, both he and Eisenhower expressed concern that the euphoria over Sputnik would overstress science and engineering to the neglect of other subjects.98 Killian felt that public schools often emphasized athletics and the humanities in the curriculum to the neglect of science and engineering.99 Eisenhower understood this concern. He cautioned against a purely scientific and technological response to the Soviet feat, noting, "specialized programs must not be allowed to upset the important balance needed in a well-rounded educational program which must insure programs in the teaching of all areas of learning."100 Educational reform had to elevate science to a position on par with other subjects in the curriculum, not grant science an exclusive position of superiority. Eisenhower's overall plan involved moderate and short-term infusions of federal funds into the education system in order to assure the citizenry that concrete steps were being taken to correct the "science gap."101

94 Goodpaster, "Memorandum of the Conference with the President, 15 October 1957." 95 Ibid. 96 Killian Jr., 194. 97 Goodpaster, "Memorandum of the Conference with the President, 15 October 1957." 98 Killian Jr., 194. 99 Ibid. 100 Divine, 92.

Eisenhower assigned Elliot Richardson, the Assistant Secretary of Health, Education, and Welfare (HEW), to develop an administrative proposal for educational reforms. On 27 January 1958, Eisenhower submitted the draft Educational Development Act of 1958 to the House of Representatives, which "recommended certain emergency Federal actions to encourage and assist greater effort in specific areas of national concern."102 The bill earmarked $140 million for the National Science Foundation (NSF) to support basic research and $1 billion for HEW to create 40,000 scholarships for high-aptitude high school graduates as an incentive to attend college.103 It also expanded the NSF's efforts to improve the quality of science teaching in schools with new textbooks, laboratory equipment, and supplies, as well as matching funds to assist underperforming schools.104 Eisenhower emphasized the temporary nature of the bill, hoping that it would "produce a growing supply of highly trained manpower – scientists, teachers, and engineers" to match the Soviet challenge.105

Two days later, Democratic Senator Lister Hill and Congressman Carl Elliott submitted their own educational bill, which replicated the administration's bill except for a provision that provided for substantial student loans through a long-term program of educational improvement.106 Eisenhower opposed the increase in loans and scholarships and the elimination of the 'needs' test, insisting on making education available to an able student while encouraging a degree of self-reliance.107 Compromise between the administration and Congress resulted in a final draft bill that stipulated 90% federal funding and 10% institutional funding for loans, and provided that institutions could borrow their 10% contributions from the federal government if they could not raise sufficient funds.108 The final act appropriated $1 billion to education, with $295 million earmarked for a federal loan program based on financial need and an additional $59.4 million for "national defense scholarships" to encourage students to pursue teaching and universities to develop graduate programs.109 Although the final bill did not reflect the totality of Eisenhower's proposals, he nonetheless regarded it as an acceptable compromise. On 2 September 1958, Eisenhower signed the National Defense Education Act into law.110 By the mid-1960s, approximately 5,500 students had received graduate fellowships to become college teachers, 350,000 undergraduates and graduates had borrowed $225 million in federal aid, and 1,450 educational institutions participated in the NDEA's student loan program.111 President Kennedy renewed the NDEA in 1961, which shattered Eisenhower's hopes for a temporary federal solution to the United States' education problems.112

101 Urban, 81. 102 Ibid. 103 Divine, 93. 104 Ibid. 105 Ibid. 106 Ibid., 89.

Dwight D. Eisenhower’s reputation as President of the United States has undergone a remarkable shift in scholarly reputation. Previously derided as an ineffectual president, contemporary academics now herald Eisenhower for his resourcefulness, moderation, and wisdom. The Sputnik Crisis of 1957 and Washington’s responses thereto are further

107 Eisenhower, Waging Peace, 243. 108 Urban, 97. 109 Mieczkowski, 160. 110 Ibid. 111 Ibid, 162-163. 112 Ibid.

51 testaments to Eisenhower’s effectiveness as president. Recognizing the psychological ramifications of Sputnik on American morale, Eisenhower consulted with scientists, engineers, and academics in the President’s Scientific Advisory Committee (PSAC) in order to rationally assess Sputnik’s implications on national security and to implement the appropriate policy changes necessary to reassert American resolve during the Cold War.

As such, Eisenhower and the PSAC accelerated American ICBM and satellite programs to end the Soviet monopoly in space and established the National Aeronautics and Space

Administration (NASA) to coordinate future civilian applications of science in space. To reduce inter-service rivalry over funding, Eisenhower strengthened the position of the

Defense Secretary vis-à-vis the Joint Chiefs of Staff over resource allocation and also established a director of research and development to eliminate waste and duplication between the services. Finally, the administration infused federal funding into post- secondary education via the National Defense Education Act to stimulate enthusiasm for science and engineering in American schools. An examination into Eisenhower’s responses to Sputnik contributes to the scholarly re-evaluation of Eisenhower’s legacy as president and cements Eisenhower’s place in historiography as a proactive and competent executive during times of national distress.

Bibliography

Eisenhower, Dwight D. The White House Years: Mandate for Change, 1953-1956. Garden City, NY: Doubleday & Company Inc., 1963.

Eisenhower, Dwight D. The White House Years: Waging Peace, 1956-1961. Garden City, NY: Doubleday & Company Inc., 1965.

Goodpaster, Andrew J. "Memorandum of Conference with the President, on American Science Education and Sputnik, October 15, 1957, 11 AM." Dwight D. Eisenhower Presidential Library, "Sputnik and the Space Race." https://www.eisenhower.archives.gov/research/online_documents/sputnik/10_16_57.pdf.

Goodpaster, Andrew J. "Memorandum of Conference with the President, on October 8, 1957, 5:00 p.m." Dwight D. Eisenhower Presidential Library, "Sputnik and the Space Race." https://www.eisenhower.archives.gov/research/online_documents/sputnik/10_9_57_Memo.pdf.

Joes, Anthony James. "Eisenhower Revisionism and American Politics." In Dwight D. Eisenhower: Soldier, President, Statesman, edited by Joann P. Krieg, 283-297. New York: Greenwood Press, 1987.

Kaiser, David. "The Physics of Spin: Sputnik Politics and American Physicists in the 1950s." Social Research: An International Quarterly 73, no. 4 (Winter 2006): 1225-1252.

Killian Jr., James Rhyne. Sputnik, Scientists, and Eisenhower: A Memoir of the First Special Assistant to the President for Science and Technology. Cambridge, Massachusetts: The Massachusetts Institute of Technology Press, 1977.

McAuliffe, Mary S. "Commentary/Eisenhower, the President." Journal of American History 68, no. 3 (1981): 625-632.

McDougall, Walter A. …The Heavens and the Earth: A Political History of the Space Age. Baltimore: Johns Hopkins University Press, 1997.

Mieczkowski, Yanek. Eisenhower's Sputnik Moment: The Race for Space and World Prestige. Ithaca: Cornell University Press, 2013.

"National Security Council Action No. 1846, approved January 24, 1958." In "Intercontinental Ballistic Missile (ICBM) and Intermediate Range Ballistic Missile (IRBM) Programs." Central Intelligence Agency, Electronic Reading Room. Accessed 24 October 2017. https://www.cia.gov/library/readingroom/doc/1960-08-19a.pdf.

"National Security Council Report 5520: Note by the Executive Secretary to the National Security Council on U.S. Scientific Satellite Program." Foreign Relations of the United States, 1955-1957, United Nations and General International Matters, Volume XI, 20 May 1955. Accessed 17 October 2017. https://history.state.gov/historicaldocuments/frus1955-57v11/d340.

"National Security Council Report 5801/1: Note by the Executive Secretary to the National Security Council on Long-Range U.S. Policy Toward the Near East." Foreign Relations of the United States, 1958-1960, Near East Region; Iraq; Iran; Arabian Peninsula, Volume XII, 24 January 1958. Accessed 19 October 2017. https://history.state.gov/historicaldocuments/frus1958-60v12/d5.

Patterson, James T. Grand Expectations: The United States, 1945-1974. New York: Oxford University Press, 1997.

Rabe, Stephen G. "Eisenhower Revisionism: A Decade of Scholarship." Journal of Diplomatic History 17, no. 1 (Winter 1993): 97-115.

Shanahan, Mark. Eisenhower at the Dawn of the Space Age: Sputnik, Rockets, and Helping Hands. Lanham, Maryland: Lexington Books, 2017.

"Text of address by the President delivered from the Oval Office in the White House on 'Science in National Security,' November 7, 1957." Dwight D. Eisenhower Presidential Library, "Sputnik and the Space Race." http://www.presidency.ucsb.edu/ws/index.php?pid=10950&st=&st1=

"Text of address on 'Our Future Security' delivered by the President in Oklahoma City, November 13, 1957." Dwight D. Eisenhower Presidential Library, "Sputnik and the Space Race." http://www.presidency.ucsb.edu/ws/index.php?pid=10950&st=&st1=

Urban, Wayne J. More Than Science and Sputnik: The National Defense Education Act of 1958. Tuscaloosa, Alabama: University of Alabama Press, 2010.

The Spectacle of Death

By: Carina Cino

Introduction

Nuremberg, mid-sixteenth century. The arsonist Lienhard Deürlein drinks hard from a bottle of wine as he is paraded through the streets, muttering curses at the people he passes throughout his procession. Having reached the gallows, he hands the bottle to the chaplain while he urinates in the open. As his sentence is read aloud, Deürlein concedes he is prepared for death, but has one final request: he asks the judge to allow him to fence and fight with four of the guards. His request is denied. He takes the bottle from the chaplain and begins again to drink. His patience tried, the executioner, Meister Frantz Schmidt, does not wait for Deürlein to say the words, “Lord, into thy hands I commend my spirit.” Instead, he interrupts Deürlein’s drink with a swing of his sword, striking off Deürlein’s head.1

In the early modern era, execution was the fitting punishment for many crimes. Although their frequency varied from year to year and country to country, executions became spectacles of retribution. Executions like that of Lienhard Deürlein were dramatic performances filled with suspense, horror, and excitement. Their public nature drew large crowds to witness the hand of justice at work. The spectacle of punishment that emerged from this publicization of execution was used by state powers to demonstrate authority and provide justice to the people they represented. The pageantry of execution made amends to the community for the wrongdoings of the criminal while allowing the state to control the public’s response to crime. Control, in an era of sweeping political and religious change, was essential if state powers were to run a country successfully.

1 Joel F. Harrington, “Lord Judge, Have I Executed Well?” Slate Magazine, 30 May 2013, accessed 24 November 2017.

Rectifying the Wronged

Criminals were individuals caught committing crimes against the state, the village, or a member of the community. Their actions violated laws set forth by state, local, or religious powers. Whether the crime was committed against the state, an individual, or the church, execution could be a warranted punishment. If an execution was conducted in secret or with minimal witnesses, the offended community was not afforded the assurance that justice had been served. By publicly displaying a criminal’s execution, the state gave the people a sense of justice, restored honour to the city, and saved the community in the eyes of Christ. The pageantry of execution gave meaning to the public.

In an age of vendettas, it is not surprising that killing was a common means of seeking vengeance for a wronged individual. Feuds offered a stage on which unequal struggles, treacherous attacks, and maiming occurred frequently. Provided these feuds served to uphold family honour through revenge, “assailants were unashamed and third parties showed no indignation.”2 However, the sixteenth century saw a major shift in moral thought and action, as well as a suppression of vendettas and of the legality of feuding. This was in part because of increasing state organization and the influence of religion in Protestant and Catholic countries.3 It was no longer moral or legal for an individual to take revenge that led to murder, but if the state carried out the action, there would be no contention.

2 Pieter Spierenburg, Violence and Punishment: Civilizing the Body Through Time (Cambridge: Polity Press, 2013), 2.
3 Ibid.

State-sanctioned executions thus became an alternative to the actions carried out through duelling and feuds. Instead of murdering the individual who had done them wrong, a person could entrust the state to carry out their revenge. However, simply putting the criminal to death would be too easy. In doing so, the state would essentially be committing murder, and the sense of revenge for the wronged community or individual would be absent. The outright killing of criminals would strip the punishment of its relevance and make it more about the criminal than the community. With corporal punishment added as a preamble to execution for the most serious crimes, the public accepted the state-ordered death of heinous criminals as an appropriate alternative to vendettas. People were appeased when revenge could be carried out upon one who had wronged them.

Revenge often involved an individual committing similar actions against the person who had wronged them or their family. In an “eye for an eye” fashion, the pain inflicted upon the wrongdoer was meant to make him “pay” for his actions. Personal revenge, separate from that conducted by mafia or mob groups, was often not about money; it was about making the wrongdoer understand how it felt to have the same actions imposed upon him. This was a key element of revenge: it provided individuals involved in feuds with the sense that the wrongdoer had been dealt with in a way equal to the pain inflicted upon him or his family. As mentioned previously, the simple killing of a criminal would not provide this same sense of retribution. The punishment of serious criminals before their death thus became a major part of public executions.


The punishment that occurred before hanging or beheading often recreated the crime committed. Often, “mutilation and exhibition of the body followed codes that called particular attention to the parts of the criminal’s body that had carried out the crime.”4 The objective was to have the criminal understand what it meant to be a victim of their crime, similar to the goals of pain infliction during feuds and duels. Thieves had their hands cut off, arsonists were burned, and beatings and broken bones were typical for those who committed abuse. Other punishments given to criminals included being dragged behind a horse, lashes across the back, and being cut by hot pincers.5 In public opinion, the punishment should fit the crime.

Meister Frantz Schmidt, the executioner mentioned earlier, served in Nuremberg from 1578 to 1617. His diary outlines every execution he performed, frequently detailing individuals’ names, their crimes, and the punishments conducted before death. Entry 43, dated 26 January 1580, outlines the punishment of three child murderesses:

Margaret Dorfflerin (50 years old) from Ebermannsstatt, Elizabeth Ernstin (22 years old) from Anspach, Agnes Lengin (22 years old) of Amberg, three child murderesses. The woman Dorfflerin, when she brought forth her child in the garden behind the Fort, left it lying alive in the snow so that it froze to death. Ernstin, when she brought forth her child alive in Master Behcimb’s house, herself crushed its little skull and locked the body in a trunk. But the woman Lengin, when she brought forth her child alive in the house of a smith, throttled it and buried it in a heap of refuse. All three beheaded with the sword as murderesses and their heads nailed above the great scaffold, no woman having been beheaded before this at Nuremberg. I and the two priests, namely Master Eucharius and Master Lienhardt Krieg, brought this about, as the bridges were already prepared, because they should all three have been drowned.6

4 Nicholas Terpstra, “Body Politics: The Criminal Body between Public and Private,” Journal of Medieval and Early Modern Studies 45, no. 1 (2015): 7.
5 Archivio di Stato di Bologna, Gonari delle Giustizie seguite in Bologna del 1050–1730.

When Schmidt states “they should all three have been drowned,” he was referring to the just punishment the three women should have endured. Drowning left the victim helpless, struggling for air that would not come. This resembled the helplessness of the infants left in the snow to freeze to death, or of those whose skulls were crushed in. In this case, the state provided only partial justice to the people through the executions of the murderesses. The women were justly killed at the hand of a professional, but the manner of their death did not befit the crime. As an attempt to remedy this, the executioner and the priests nailed the heads of the women “above the great scaffold,” a final punishment for the dead. The infliction of punishment or torture before the execution, or sometimes after, as in the case of Dorfflerin, Ernstin, and Lengin, assured the public that justice lay both in the treatment of criminals and in the way in which they died.

The public nature of the execution served another purpose: to restore honour to the community. Honour was a vital element of early modern society across Europe. An individual’s honour, or dishonour, reflected directly upon his or her family and the surrounding community. The porous nature of a community’s communication networks made dishonour a major social stigma. Everyone in the city knew of the dishonourable action or occupation one engaged in, making life extremely difficult. “Judicial penalties branded the guilty person, either in reputation or with a physical mark”7 that often represented the crime committed. Tongues of blasphemers would be pierced and hands of thieves cut off. Blinding, the cutting off of ears, whippings, and brandings were also not uncommon mutilations. These marks served to warn others of the penalty for criminal action.

6 Franz Schmidt, A Hangman’s Diary: Being the Journal of Master Franz Schmidt, Public Executioner of Nuremberg, 1573-1617 (Montclair, NJ: Patterson Smith, 1973).
7 Edward Muir, Ritual in Early Modern Europe (Cambridge: Cambridge University Press, 2005), 116.

For many crimes, however, it was not enough to allow an individual to continue to walk the streets of the city branded. The dishonour of capital crimes warranted capital punishment, which in turn would cleanse the city of the dishonour brought about by the criminal. Meister Frantz Schmidt outlined the punishment of a woman, Barbara Wissnerin, for the crimes of theft and prostitution in Nuremberg during the sixteenth century:

…due to great unchastity, multiple thefts and breaking and entering [she] has already been in the Loch [the city jail] eight times. She was banished many times, and has perjured herself [i.e., and come back into Nuremberg] and was beaten in the Loch. She was publicly burned through the cheeks, and her first two fingers were chopped off… She was let out of the city after swearing an oath, and warned under penalty of death not to come in to the city again. Whereupon she came in again and was caught at theft. Now on her own confession her day of execution is set for next Thursday, the first of March. For the said punishment she will be taken from life to death in water.8

Although not initially charged with greatly dishonourable crimes, Wissnerin was forced out of the city to cleanse Nuremberg of the dishonour she carried. Her return brought dishonour to the city. The only way to be certain her dishonour would cease to taint the city was to have her executed. For Wissnerin and capital criminals, dying meant the end of social dishonour. The stigma would no longer haunt the community or the individual.

Public executions allowed the community to experience the release of dishonour and the cleansing of the community. They were witnesses of the event, verifying its truth and validating the city’s return to honour.

8 “Punishment of a Woman for Theft and Prostitution, Germany 16th century,” in Early Modern Europe, 1450-1789, ed. Merry Wiesner-Hanks (Cambridge: Cambridge University Press, 2006).


The religiosity of the era was not absent from the ritual of execution. The public nature of punishment also served to provide salvation to the community. Initially, the execution took place at the scene of the crime. The local populace saw the execution as a warning about the consequences of committing such a crime, as well as a cleansing through the shedding of blood.9 As governments became more centralized and stable, executions were moved to permanent locations in cities. Even with this move, the ritual of execution continued to evoke the Catholic ideology of salvation through Christ. According to the Catholic faith, Jesus died on the cross for the salvation of man. Before He died, however, Jesus was paraded through the streets of Jerusalem to Golgotha, or Calvary Hill. In a similar fashion, criminals were paraded through the streets of communities before making their way to the set places of execution. Although the processions represented the same concept, the ways they were conducted varied from country to country: “in Venice the procession usually returned to the scene of the crime where the sentence was publicly read and the condemned mutilated; in London the criminal walked or was carted three miles through the center of the city from Newgate prison to the gallows; in the hangman and a priest accompanied the culprit in a chariot from prison to the town hall.”10

The procession in Bologna was similar to that of Venice. The prisoner was led first from his cell to the second floor of the Palazzo del Podestà, overlooking the Piazza Maggiore. Here his crimes were read aloud to the square below. He then began the walk to the field of Mercato del Monte. Along the way, the prisoner might stop for Mass or return to the place where his crimes were committed. Figure 1.1 shows a modern-day map of where the beginning and the end of the procession in Bologna would have been in the sixteenth century. The route taken is difficult to determine, as each criminal’s procession visited different locations depending on the nature of his crimes, but the beginning and end of the procession remained the same.11

9 Muir, 117.
10 Ibid.

Cesare d’Assaro and Giovani Vincenzo del Mauro were two men who experienced a more intricate procession. The men were from Naples and had committed capital crimes in Bologna on 27 October 1599. They robbed the bank of Ghelli of 15,000 scudi, shot an arquebus at the notary of the Procuratore, and killed a butcher by the name of Taddeo Abelli. They then tried to flee to Hungary, but the Ghelli chased and caught the two men in Vienna and brought them back to Bologna. For their punishment, the two were taken by cart through the city to the Piazza Grande and given two tanagliate, or slashes. They were then carted to the Chiavature al Banco di MS Taddeo Ghelli and cut twice more. Next, they were dragged to the Casa del Procuratore Grati, the house of the Procuratore they had shot with the arquebus, and slashed twice again. The next stop was the gate beneath the shop of the butcher they had killed, where they received two more slashes. Each stop was intended to force the men to remember the crimes they had committed and the pain they had inflicted. At the Ringhiera, their right hands were cut off as punishment for their thievery. Finally, they were hanged and quartered.12

The parades undertaken by criminals were alterations of the Catholic Stations of the Cross. Instead of walking as Jesus did, persecuted for man’s salvation, the criminal walked the footsteps of the Messiah in order to be saved. It is fitting that the Crucifixion story places Jesus between two criminals, also being executed for capital offenses. The presence of criminals reinforced the notion that even the “unworthy” could be saved in Christ. The convicted had committed a crime worthy of death, but before death his soul had to be saved by walking the path of Christ. The re-enactment of the Stations was completed when the criminal reached the place of execution, the Golgotha of his city. There he was “crucified” in Christ, cleansing him of his sins and providing him everlasting salvation.

11 Nicholas Terpstra, The Art of Executing Well: Rituals of Execution in Renaissance Italy (Kirksville, Mo.: Truman State University Press, 2008), 40.
12 Archivio di Stato di Bologna.

The two criminals in the Crucifixion story represent honour and privilege, on the right side of Christ, and debasement and condemnation on the left.13 Although the criminal was saved by the honour of Christ, he was not free of the judgement passed on him. The combination of these two understandings is what allowed for the duality of justice and salvation through one perverted individual.

The salvation of the community was also an important facet of these “executions in Christ.” Communities in early modern Italy were generally tightly knit groups in which everyone knew each other. The watchful eyes of neighbours ensured people conducted themselves appropriately. As a result, the criminal activities of individuals were often considered a reflection of the entire community. Executions were traditionally seen as a removal of the criminal from a civil society that did not recognize the actions of the prisoner as representative of the community’s conduct or honour. “Those who died were those whom the community had not rallied around to save and reintegrate through commutation, composition, informal peace agreements, acquittals, and the like.”14 The community cast out the rebel for the survival of the group. Church discipline at this time was aimed at maintaining purity amongst the community of believers. In order to uphold the virtue of the community in God’s eyes, the weak links needed to be cast out. Criminals were these weak links, seen as imperfections of the community body. The march to the gallows in the way of Christ allowed the community to cleanse itself of the impure by sending the tainted individual to ask for God’s forgiveness. The community did its duty in bringing the criminal to Christ. In this way, the soul of the community was protected and order restored in the eyes of the faithful.

13 Mitchell B. Merback, The Thief, the Cross and the Wheel: Pain and the Spectacle of Punishment in Medieval and Renaissance Europe (London: Reaktion, 2001), 23.
14 Terpstra, “Body Politics,” 37.

The execution also cleansed the community of the bad or evil residing within it. The individual himself was seen as an evil member of society, but his soul was the bigger issue. The uncleanliness of the criminal soul meant a tainting of his family and those around him, including the entire community. Tudor monarchs used the concept of God as ruler to control trials and executions, passing off public punishments as “divinely sanctioned.”15 In these sanctions, God saw the truth, preserving the innocent and convicting the guilty. Once determined guilty, the convict’s soul was targeted for cleansing through faith and acceptance that his conviction was the will of God. The body could not be cleansed by the same means as the soul, so it was dealt with instead by execution. Evil was removed from the soul, and the destructiveness of the human body was physically removed from the community. Catholic ideology professed that the goal was for all souls to go to Heaven; execution provided the means for the soul to be cleansed for Heaven and for the community to be rid of the body.

From a community perspective, the execution of criminals was beneficial in many ways. The suppression of vendettas and vengeance-seeking forced the public to rely on state-sanctioned executions for justice. The punishment inflicted upon a criminal, if properly administered, provided justice and a sense of relief for the wronged members of the community. Their revenge was carried out in a manner that reflected the pain of the victim in a vindicating way. The public nature of this justice was a method of honour restoration for the people. Once again, a community was seen as honourable, having eliminated and cleansed the dishonour from its streets. By parading the criminal through the streets before his death, the community attempted to save the soul of the individual, as well as that of the community, in the name of Christ. The removal of evil from a city returned its community to its original order and allowed it to resume its “good-natured living” in the eyes of God. These understandings of the execution were a manipulation of the community by state powers. Execution gave the people what they wanted: a way to achieve vengeance and salvation at the same time. The question then becomes one of purpose: what did state authorities aim to achieve through the publicization of execution?

15 Karen Cunningham, “Renaissance Execution and Marlovian Elocution: The Drama of Death,” PMLA 105, no. 2 (1990): 209.

Control in an Age of Disorder

European communities experienced much change through the early modern era, especially after the Reformation began in 1517. People wanted more control over their daily lives, which would in turn take control away from state authorities. In response, states began strengthening their authority through the policing of communities.16 However, policing alone was not enough to maintain control of the public: execution helped to compensate where the power of the police failed. The publicity executions gained throughout the early modern era painted the state in a positive light for many Europeans. The state became the arbiter of conflict by maintaining social order through mercy and conviction, appearing as the protector of what was important to the community, and adjusting the frequency of punishment to the circumstances of the time. It was important for state governments to project a balanced image of justice and mercy in order to retain control.

16 Steven Hughes, “Fear and Loathing in Bologna and Rome: The Papal Police in Perspective,” Journal of Social History 21, no. 1 (1989): 97.

Upholding social order was extremely important for the state if it wished to maintain control. State officials quickly became the providers of justice. As discussed previously, the phasing out of duelling and vendetta settling made way for the state to step in and concern themselves with punishment and crime. Although already an issue of the state, capital punishment and the execution theatre became the mainstage for the production of state power. The guilty verdict was often passed in private, while the execution itself remained very public.17 This relationship between private and public gave the people the perception that the state was supplying justice to the “guilty.” The people relied heavily on this system because they themselves were no longer permitted to handle their issues in the same fashion as they might have in the past. They remained unaware of the events occurring behind closed doors, subjected only to the justice being carried out in the execution.

Infanticide was one crime the state concerned itself with in the sixteenth century. Women were frequently charged, tried, and executed for infanticide if their child, unreported and born out of wedlock, died after birth. These convictions were not based on conclusive evidence, as many doctors had a difficult time determining whether children had been born dead or alive.18 The trial of Anna Maria Rauin, concerned with the death of her newborn child, was a questionable one. The trial, a private matter, showed that doctors were unable to determine whether her child had been born alive and died afterwards. Despite the inconclusiveness of the trial, her confession was enough to prove “that it was the intention of the accused to keep the child from crying by pressing on its throat so that no one would know anything about this birth.”19 Her execution was set to be by the sword, a public affair. The people were not privy to the knowledge of her inconclusive trial, only that she was a child murderess. Executions gave the state the power to be seen imposing justice upon the guilty, a process brought to resolution with the death of the convicted.20

17 Muir, 116.

In addition to providing justice, the state used executions to suppress discontent with the justice system. One of the greatest threats to public support for state authority was the possibility of a “disrupted” execution. For the state, a “good” execution was both public and peaceful.21 The power of the state was made public, and the crowd was convinced that the party being executed was guilty. To ensure these death sentences were carried out “peacefully,” state-sanctioned Comforters were appointed to tend to the criminal in the twenty-four hours before his execution. Their job was to ensure, by any means possible, that the criminal accepted he was going to die. They comforted with prayer, song, assurances about God’s plan, and exhortations to focus on praying for the soul. In Preparing for Execution: Bolognese Comforters’ Manual, section 22, the biggest fear of the state is outlined. The article reads,

18 Merry E. Wiesner-Hanks, “Death Sentence of a Woman Accused of Infanticide, Germany 1740,” in Early Modern Europe, 1450-1789 (Cambridge: Cambridge University Press, 2006).
19 Ibid.
20 P.J. Klemp, “’He That Now Speaks, Shall Speak No More For Ever’: Archbishop William Laud in the Theatre of Execution,” The Review of English Studies, New Series, 61, no. 249 (2010): 189.
21 Terpstra, The Art of Executing Well, 1.

Always make him say some prayer so that he does not think, and so that he does not listen to what is being read. And this is because if he were to hear read out some crime that he did not commit, he would get very agitated. And there are times some of them may say to the notary who is reading, ‘You are lying through your teeth,’ and this is very bad. Firstly, because rage flames up in his heart, and also because he ends up denying everything that is being read and ends up denying the truth. For it’s not possible that the notary doesn’t say or read something true. Therefore be advised to make sure, if you can, that he does not commit this sin.22

The manual asks the comforter to ensure the convict’s mind is not focused on the charges being read, because if he were to shout out in denial, there was potential for public outcry as well. As long as the execution remained free from disturbances, the public was kept at bay, and the convicted was executed, the state maintained social order. Following the execution, there was a “subsequent reinstatement of traditional hierarchies and patterns of civilised behaviour.”23

State authorities had one other tool at their disposal: mercy. Maintaining social order required a give-and-take between mercy and conviction with respect to death sentences. Not all convictions resulted in death for the guilty party; that would give state authorities a bad reputation and cause considerable unrest amongst the community. Instead, officials decided on appropriate cases where mercy could be afforded. The 1644 case of Antonia Mussona, for example, was likely dismissed. She had petitioned the courts of Parma for an acquittal of the fines she faced for getting into an argument with her neighbour.24 In this instance, it was easier for the court to absorb the fine and let Antonia off with a warning than to appear cruel. The use of violence for punishment raised the question of who had the right to take such actions and when it was legitimately warranted.25 Acts of mercy in more minor cases made it easier to legitimize punishment and death-sentence convictions. The power to condemn criminals to death rested solely with the government.

22 “Preparing for Execution: Bolognese Comforter’s Manual.”
23 Klemp, 189.
24 “Excerpts from denunciations and trials in Bologna, 1600-1700.”

This power to convict criminals and sentence them to death also meant the government had a great responsibility to the innocent in the community. The control state authorities had over determining who lived and who died meant protecting those within the community who had not been convicted of any crimes. State governments often tried to refrain from involving themselves in conflicts within the community. However, the inability to carry out private vengeance meant that the community relied on officials to settle their differences for them. In November 1494, a coup drove Piero di Lorenzo de’ Medici out of Florence. Five men attempted to return Piero to the city in 1497 but were caught before their plan could take hold. They were executed by the Florentine government, the first political execution in Florence since 1481.26 This marked the beginning of an increase in political executions, as members of the elite relied more heavily on state-sanctioned executions. Execution ensured the safety of the community and, more importantly, of the elite; it protected their persons as well as their reputations. Although states preferred to avoid involving themselves in private conflicts, they made issues of the community, like punishment and retribution, a priority.

Issues of the community also included honour. State authorities used the whole execution process as a means of restoring honour to the communities they represented. After a guilty conviction, the criminal was placed in a prison cell to await his execution date. Prisons provided answers for every party involved with the criminal: the state’s concerns for public order were put at ease by the private confinement of the criminal, away from the public eye, and families’ problems of “honour [were] no longer put in jeopardy by the deviant behaviour of a family member.”27 The final part of the process, the execution itself, physically removed the dishonour held by the individual from the community. By disposing of individuals who did not hold the honour required to partake in the community, state powers maintained their own honour in the eyes of residents. Honour thus served a double function: removing the dishonour of the individual and increasing the honour of the state, making it more favourable to the people.

25 Susan Dwyer Amussen, “Punishment, Discipline, and Power: The Social Meanings of Violence in Early Modern England,” Journal of British Studies 34, no. 1 (1995): 4.
26 Nicholas Scott Baker, “For Reasons of State: Political Executions, Republicanism, and the Medici in Florence, 1480-1560,” Renaissance Quarterly 62, no. 2 (2009): 444-445.

In the early modern era, the public perception of the government determined its overall honour. As long as the state wished to maintain authority, it needed to remain “honourable” in the eyes of its people. It was important for the people of a community both to interact with honourable people and to be governed by an honourable authority.

Maintaining social order and caring for public interests came down to how frequently the state ordered executions. The frequency of such convictions was a product of time and place, varying geographically throughout the early modern era. Initially, gallows and criminal burials were located on state boundaries and at city gate entrances, serving to demonstrate judicial status and to warn travellers and subordinates to be mindful of their conduct.28 The sight of hanged bodies was a familiar one, at least until the early seventeenth century. Figure 1.2 shows the gradual decrease in the frequency of executions in Bologna throughout the seventeenth century.29

27 Pieter Spierenburg, The Prison Experience: Disciplinary Institutions and Their Inmates in Early Modern Europe (Amsterdam: Amsterdam University Press, 2007).
28 Joris Coolen, “Places of Justice and Awe: The Topography of Gibbets and Gallows in Medieval and Early Modern North-Western and Central Europe,” World Archaeology 45, no. 5 (2013): 766.

As time progressed, popular support for public execution began to diminish. Community acceptance of public violence changed throughout the seventeenth century, gradually making it a highly contested act. This century saw a major shift, with state-administered violence regularly becoming a private, rather than public, event. The diminishing of public violence contributed to the pacification of Europe during this time,30 but it also meant a continuation of state authority. The fading popularity of executions reflected community concerns about public executions; however, the state was still able to conduct executions as it saw fit. Executions were thus used to show that the state remained in control when a criminal committed certain crimes, despite growing public discomfort. Criminals were still used as examples to maintain social order and remind the public who was in charge.

The frequency and timing of executions also represented a government’s ability to maintain power during particular times of unrest. During Queen Elizabeth I’s reign, England saw 6,160 victims hanged at Tyburn.31 Elizabeth did not do away with public executions in response to changing public perceptions, but instead used these 6,160 criminals to maintain control over her subjects. Elizabethans were fearful of taking action against the state; executions were just one way the Queen kept her subjects in line and maintained social order. Figure 1.2 also shows the ability of the Bolognese state to maintain power during some difficult years. There are a few outlying years, namely 1609,

29 Archivio di Stato di Bologna.
30 Julius R. Ruff, Violence in Early Modern Europe, 1500-1800 (Cambridge: Cambridge University Press, 2001).
31 Molly Smith, “The Theater and the Scaffold: Death as Spectacle in The Spanish Tragedy,” Studies in English Literature, 1500-1900 32, no. 2 (1992): 217-232.


1625, 1630, 1643, and 1644, with exceptionally high numbers of executions. In these years there would have been a higher crime rate than usual, requiring more state intervention. That said, it may be more revealing to consider outliers against years with lower numbers. In 1653, executions spiked to eleven from seven the previous year, presumably due to a higher-than-normal crime rate or a need to exert state power more heavily. In any case, this outlying year represents the Bolognese state’s ability to maintain control over its people at a time when they were acting in disregard of the law.

Conclusions

Although the early modern era saw many changes, one of the greatest was in the use of violence and punishment in the conviction of criminals. On the surface, execution provided many outlets for the public. The age of vendettas disappeared quickly toward the end of the sixteenth century and the beginning of the seventeenth, forcing a reliance upon the state for support in acquiring justice. The restoration of honour and the religious salvation of the community were two by-products of the pageantry of execution. The processes of conviction and death set the community right again, restoring justice and allowing the community to continue to function as it normally would. The state played a major role in facilitating the peace of the community by involving itself in conflicts between individuals and between individuals and the state. This involvement was not selfless by any means. The state used public executions to maintain its control over a community of people. It sought to use execution as a reminder of social structure and social order, of the power imbalance between it and the people. The show of death on the

mainstage of the city or rural area exhibited a sense of state superiority, especially during times of unrest or particularly high crime rates. State authorities appeared merciful in order to gain favour with the communities they ruled, but utilized execution to balance their benevolence and ensure there remained just enough fear to maintain order.

Execution was a function of the state and community that changed over time. It served to comfort the community while ensuring the dominance of state authorities. The spectacle of death that surrounded executions illustrates a dependency on public opinion to rule the population. States required the support, as well as the fear, of the communities they governed in order to maintain control over the public. Execution was one element of daily life manipulated by state powers to, in turn, maintain control over other aspects of life. Its usefulness was directly dependent upon geographical place and time within the early modern era. Execution was a messy necessity of the state system, ultimately allowing governments to maintain control of, and appease, the people they presided over.


Appendix 1

Figure 1.1: Start and Finish of Prisoner Procession

[Chart: “Execution Frequency, Bologna 1600-1700”; x-axis: Year (1600–1700); y-axis: executions per year (0–25).]

Figure 1.2: Number of Executions in Bologna, 1600-1700


Bibliography

Amussen, Susan Dwyer. “Punishment, Discipline, and Power: The Social Meanings of Violence in Early Modern England.” Journal of British Studies 34, no. 1 (1995): 1-34.
Archivio di Stato di Bologna, Gonari delle Giustizie seguite in Bologna del 1050–1730.
Baker, Nicholas Scott. “For Reasons of State: Political Executions, Republicanism, and the Medici in Florence, 1480-1560.” Renaissance Quarterly 62, no. 2 (2009): 444-478.
Coolen, Joris. “Places of Justice and Awe: The Topography of Gibbets and Gallows in Medieval and Early Modern North-Western and Central Europe.” World Archaeology 45, no. 5 (2013): 762-779.
Cunningham, Karen. “Renaissance Execution and Marlovian Elocution: The Drama of Death.” PMLA 105, no. 2 (1990): 209-222.
“Death Sentence of a Woman Accused of Infanticide, Germany 1740.” In Early Modern Europe, 1450-1789, edited by Merry E. Wiesner-Hanks. Cambridge: Cambridge University Press, 2006. http://www.cambridge.org/features/wiesnerhanks/primary_sources.html
“Excerpts from denunciations and trials in Bologna, 1600-1700.”
Harrington, Joel F. “Lord Judge, Have I Executed Well?” Slate Magazine, 30 May 2013. Accessed 24 November 2017. http://www.slate.com/articles/news_and_politics/history/2013/05/executioners_in_medieval_europe_history_of_capital_punishment.html.
Hughes, Steven. “Fear and Loathing in Bologna and Rome: The Papal Police in Perspective.” Journal of Social History 21, no. 1 (1989): 97-116.
Klemp, P.J. “‘He That Now Speaks, Shall Speak No More For Ever’: Archbishop William Laud in the Theatre of Execution.” The Review of English Studies 61, no. 249 (2010): 188-213.
Merback, Mitchell B. The Thief, the Cross and the Wheel: Pain and the Spectacle of Punishment in Medieval and Renaissance Europe. London: Reaktion, 2001.
Muir, Edward. Ritual in Early Modern Europe. Cambridge: Cambridge University Press, 2005.
“Preparing for Execution: Bolognese Comforter’s Manual.”
“Punishment of a Woman for Theft and Prostitution, Germany 16th century.” In Early Modern Europe, 1450-1789, edited by Merry E. Wiesner-Hanks. Cambridge: Cambridge University Press, 2006. http://www.cambridge.org/features/wiesnerhanks/primary_sources.html
Ruff, Julius R. Violence in Early Modern Europe, 1500-1800. Cambridge: Cambridge University Press, 2001.
Smith, Molly. “The Theater and the Scaffold: Death as Spectacle in The Spanish Tragedy.” Studies in English Literature, 1500-1900 32, no. 2 (1992): 217-232.


Schmidt, Franz. A Hangman’s Diary: Being the Journal of Master Franz Schmidt, Public Executioner of Nuremberg, 1573-1617. Montclair, NJ: Patterson Smith, 1973.
Spierenburg, Pieter. The Prison Experience: Disciplinary Institutions and Their Inmates in Early Modern Europe. Amsterdam: Amsterdam University Press, 2007.
Spierenburg, Pieter. Violence and Punishment: Civilizing the Body Through Time. Cambridge: Polity Press, 2013.
Terpstra, Nicholas. “Body Politics: The Criminal Body between Public and Private.” Journal of Medieval and Early Modern Studies 45, no. 1 (2015): 7-52.
Terpstra, Nicholas. The Art of Executing Well: Rituals of Execution in Renaissance Italy. Kirksville, MO: Truman State University Press, 2008.


“The Last Battayle is Atte Hande”: Conceptions of Death in Renaissance Italy By: Lucas Coia

Death is a universal phenomenon. All living beings eventually cease to live, whether death is welcome or not. Since time immemorial, humans have grappled with how to confront and defy this reality. In Italy, beginning in the latter half of the fourteenth century, people turned to a “strategy for eternity” that has been called a “cult of remembrance.”1

While not by any means a rejection of traditional Christian attitudes to the afterlife, interest in the cult of remembrance reflected a desire to “outlive” one’s death on earth through post-mortem memorialization of the individual. Considering the contemporaneous rise of humanistic philosophy and the cult of man, this is perhaps unsurprising. That said, the cult of remembrance and its associated death practices represent a distinct attitude toward death that can be seen as highly individualistic. When analysed in detail, this development is reflective of contemporary social realities. Indeed, this was a time of increasing competition and struggles for power between local families and individuals.2 As will be shown, funeral rites and post-mortem memorial reflected this reality by serving to augment the honour of individuals and their families. At the same time, contemporary death practice remained a highly social and public phenomenon. To describe Italian Renaissance death rites solely in terms of increasing individualism would therefore be too simplistic a narrative, for in this particular time and place the public and the private were often inseparable.

1 Samuel Cohn, The Cult of Remembrance and the Black Death: Six Renaissance Cities in Central Italy (Baltimore: Johns Hopkins University Press, 1997).
2 Sharon T. Strocchia, Death and Ritual in Renaissance Florence (Baltimore: Johns Hopkins University Press, 1992), 55.


The rise of a late fourteenth-century cult of remembrance presupposes the gradual displacement of an earlier medieval attitude toward death. From the earliest centuries of the middle ages to the beginning of eleventh-century papal reform, official church attitudes were decidedly pessimistic. In an age before mendicant preaching and the participatory scheme of the Fourth Lateran Council, monks, nuns and saints held a virtual monopoly on access to post-mortem salvation. Ordinary lay people, whose lifestyle did not measure up to the ordered and highly spiritual life of a monk, were simply deemed too sinful to stand a chance of escaping eternal damnation.3 The turn of the new millennium saw the emergence of an increasingly literate class of people in conjunction with a renewed sense of apostolic zeal. Lay preaching suddenly became popular and the Church began to express an unprecedented interest in the lives and souls of its parishioners. This new emphasis on pastoral care led to a “democratization” of Christianity in which the path to salvation opened up to ordinary lay people.

Alongside this development rose belief in Purgatory. This was an intermediary locus of pain and suffering where most lay people could expect to go after death. It was in Purgatory that sinners were “purged” of their sins until purity signalled their release and subsequent admittance into heaven. Masses for the dead, whose late medieval rise to prominence reflected popular obsessions with Purgatory, soon became crucial for a speedy pass through this intermediary stage.4 Intercessory prayer offered on behalf of the soul of the dead had its roots in the second Book of Maccabees: “For if he were not expecting that those who had fallen would rise again, it would have been superfluous and foolish to pray for the dead … Therefore he made atonement for the dead, so that they

3 Daniel E. Bornstein, A People’s History of Christianity (Minneapolis: Fortress Press, 2009), 355.
4 Kevin Madigan, Medieval Christianity (New Haven: Yale University Press, 2015), 430.

might be delivered from their sin.”5 This notion, that one’s prayers on earth could be directed to the benefit of souls in Purgatory, led to an obsession with masses for the dead, the centrality of which in late medieval and early modern lay religion has been noted.6

The other option for purgatorial relief lay in almsgiving, which enjoyed a certain prominence in late medieval wills.7 The importance of almsgiving reflects the pervasiveness of the mendicant ideal in late medieval culture. Charged with associations of Christ-like poverty, the mendicant orders exemplified the life of worldly renunciation in favour of absolute devotion to poverty and charity. Lay people expressed their commitment to this ideal in a number of ways. First, involvement in lay confraternities, whose devotional structure was often self-consciously rooted in the mendicant example, provided opportunities for charity that could benefit the member’s soul.8 Second, lay people could choose to be buried in mendicant robes, a powerful symbol of their devotion to absolute poverty in death.9 The location of their burial could also reflect contemporary mendicant piety. Indeed, the thirteenth and fourteenth centuries saw a marked increase in burials at both Dominican and Franciscan churches, which could even at times provoke the protest of local parish priests who complained of lost revenue.10

This late medieval interest in almsgiving and masses for the dead is perhaps best illustrated through contemporary wills. For example, the will of one fourteenth-century baker, Bertrucio di Giovanni (d.1337), states “First, wishing to provide for the health of his soul, he leaves 5 Bolognese pounds [in recompense] for ill-gotten gains of which 20

5 2 Maccabees 12:44-45 RSV.
6 Bornstein, 356.
7 Cohn, 39.
8 Nicholas Terpstra, Lay Confraternities and Civic Religion in Renaissance Bologna (Cambridge: Cambridge University Press, 2002), 79.
9 Strocchia, 41.
10 Ibid, 94.

solidi must be given each year, beginning from the day of death of the testator until the money runs out, to the Ospedale dei Battuti or dei Devoti and for the poor of that hospital…”11 Additionally, the will of labourer Albertino di Ser Petro (d. 1337) states

“…he leaves 20 solidi [in recompense] for ill-gotten gains … He leaves to his confessor 3 solidi for the singing of masses. He wants his confessor to arrange for his funeral and burial which he wants to be at the church of Santa Maria degli Alemmani.”12 The importance of both masses and charity in the early trecento strategy for eternity is thus evident. It is this preoccupation with the supernatural as opposed to the temporal that perhaps best exemplifies late medieval death practices. This is unsurprising at a time when one’s life on earth could be seen merely in terms of preparing for the next.13

It was against this religious background that the fourteenth-century plagues unfolded in Italy. A time of indiscriminate and severe mortality, the Black Death and subsequent plagues resulted in a surge of concern for death and the afterlife. This is seen in contemporary painting, which in Italy tended to reject earlier humanistic themes in favour of more traditional ones.14 It is also evident in the rise of the popular ars moriendi genre beginning around 1350.15 This heightened sensibility toward death should have, considering the prevailing culture of mendicant piety and belief in Purgatory, resulted in an increase in post-mortem charity and masses for the dead. This was not the case. While propitiatory masses saw a brief increase, the number of pious

11 “Four Bolognese Wills (1337),” in Medieval Italy: Texts in Translation, eds. Katherine L. Jansen, Joanna Drell and Frances Andrews (Philadelphia: University of Pennsylvania Press, 2009), 515.
12 Ibid.
13 Peter Burke, Culture and Society in Renaissance Italy 1420-1540 (Princeton: Princeton University Press, 1972), 199.
14 Millard Meiss, Painting in Florence and Siena After the Black Death: the Arts, Religion, and Society in the Mid-Fourteenth Century (Princeton: Princeton University Press, 1978), 67.
15 Bornstein, 357.

gifts per testator dropped significantly throughout Italy following the 1363 plague.16 The traditional mendicant-based piety seemed to be giving way to a different strategy for the afterlife. According to Samuel Kline Cohn, this marks the beginning of the cult of remembrance.17 Indeed, where the earlier emphasis on charity and masses for the dead may be seen as concerned primarily with the supernatural, this new attitude partly shifted focus to earth and to leaving a visible mark to outlive oneself.

The cult of remembrance needs to be understood in terms of contemporary economic and social trends. The Black Death and subsequent fourteenth-century plagues resulted in the enrichment of tradesmen and artisans that gave rise to a class of nouveaux riches in many Italian cities.18 This was also a time of intensified competition for political and social standing as “new” and “old” money jockeyed for greater reputation and esteem.19 Such an environment could lead to a great deal of social tension. Civil strife was not uncommon in the latter half of the fourteenth century in cities like Florence, which in 1378 experienced the Ciompi Revolt.20 It was this context that gave rise to the use of funeral rites as outlets for conspicuous consumption in Italy. Indeed, by the end of the fourteenth century Italian funerals were marked by flamboyant displays that contrasted visibly with the mendicant ideal. Enormous amounts of wax, luxurious bier cloths, and elaborate banners and clothing for the dead became the norm in upper class and merchant funerals.21 Even the labouring classes sought the most respectable burials

16 Cohn, 73.
17 Ibid.
18 Meiss, 69.
19 Strocchia, 56.
20 Ibid, 55.
21 Ibid, 61.

possible.22 From the moment the procession began to the end of the requiem, late fourteenth-century funerals exhibited a theatricality that functioned to display and augment individual and familial honour.23 To be sure, the old mendicant ideal had not suddenly disappeared. In fact, tension between new and old forms of death practice reflected deep cultural ambivalences about pomp and excessive displays of wealth.24 That said, the cult of remembrance marked a new attitude toward death that in many ways “individualized” its experience. At the same time, death and its associated practices could remain simultaneously a concern of the community.

The first aspect that requires our attention is the experience of the individual in the days and moments immediately before death. As will be seen, there remained throughout the early Renaissance a firm emphasis on the role of the community throughout this critical time. Two main primary sources will be instructive. The first is Boccaccio’s Decameron (1353), which tells of the plague’s disruptive effects on contemporary death rites. An examination of this narrative will therefore reveal contemporary expectations surrounding proper death practice. The second is an account of the death of Pope Alexander VI (d. 1503) by chronicler Johann Burchard, with which we will begin. Far from a simple retelling of events, Burchard’s account needs to be treated as reflective of the personal opinion of its author:

“Alexander the sixth, here I lie; Roma, rejoice thee,
Free now at last; for my death was to mean new life for you.
Alexander the sixth has smothered the world in carnage.”25

Indeed, Burchard’s distaste for the Borgia pope colors his entire work. The account tends to highlight the chaos

22 Ibid, 82.
23 Ibid, 64.
24 Ibid, 63.
25 Johann Burchard, “Pope Alexander VI and His Court,” in The Civilization of the Italian Renaissance: A Sourcebook, ed. Kenneth Bartlett (Toronto: University of Toronto Press, 2011), 220.

surrounding Pope Alexander VI’s death. This emphasis on the abnormality of the events surrounding his death serves to reinforce the negative character of the pope’s rule. Such a reading of this document is not far-fetched considering the role of contemporary papal funerals as expressions of nuance of character and biography.26 Burchard’s account will therefore reveal, like the Decameron, contemporary norms regarding death practice.

The moments before Alexander VI’s death indirectly reveal the enduring role of family and community in death. The account begins “On Friday, the 18th, between nine and ten o’clock [Alexander VI] confessed to the Bishop … After his communion he gave the Eucharist to the Pope who was sitting in bed. Then he ended the mass at which were present five cardinals … at the hour of vespers after Gamboa had given him extreme unction, he died.”27 Burchard’s account noticeably omits mention of any family or friends that surrounded Alexander VI on his deathbed. The only sense of encouragement is seen in the presence of five cardinals during his mass, but even these are not mentioned in the pope’s final moments. Only the bishop, Pedro Gamboa, is named, whose omission from the narrative would have deprived even Alexander VI of last rites. The moments before death were perhaps some of the most important in early modern societies. Indeed, death marked the greatest test of faith, which needed to be sustained until one’s very last moments to avoid damnation. 28 For this reason, norms dictated that the sick be surrounded by clergy, friends, and relatives for encouragement to remain steadfast until the end.29 With that in mind, Burchard’s silence here is deafening, especially considering the rest of the text’s painstaking attention to detail. The absence of the pope’s family at

26 Strocchia, 138.
27 Burchard, “Pope Alexander VI,” 217.
28 Bornstein, 358.
29 Ibid.

his deathbed is therefore a critical take on Alexander VI’s nepotism in life, a message from Burchard that even those who benefited most from the pope’s rule did not support him in those final crucial moments. This becomes apparent later in the text, when Burchard remarks that neither Cesare nor Lucretia appeared during the whole illness of the pope.30

The importance of having people around the deathbed for encouragement, followed by mourning in the moments after death, is well attested, particularly in Boccaccio’s plague narrative. For example, Boccaccio notes, “It was the custom, as it is again today, for the women relatives and neighbors to gather together in the house of a dead person and there to mourn … [I]n front of the [home] his male relatives would gather together with his male neighbors and other citizens, and the clergy also came…”31

The onset of the plague, according to Boccaccio, led to the breakdown of this practice.

“…[T]his custom … died out and other practices took its place. And so … there were many who passed away without having even a single witness present … most relatives were somewhere else, laughing, joking…”32 The absence of a large group of kin and friends at the deathbed clearly disturbed Boccaccio; it is the fact that people died completely alone that appals him in this excerpt.

Indeed, contemporary confraternal practices reflect Boccaccio’s concerns. The role of confraternal brethren in the death of a fellow member began from the onset of sickness. From this time members were charged with attending to the terminally ill in

30 Burchard, “Pope Alexander VI,” 217-218.
31 Giovanni Boccaccio, Decameron, trans. Mark Musa and Peter Bondanella (New York: New American Library, 1982), 11.
32 Ibid, 12.

order to provide comfort and spiritual encouragement.33 They would also read the Bible and other devotional works to the sick.34 According to Nicholas Terpstra, this practice is best understood in the context of the post-plague preoccupation with ars moriendi.35

Indeed, at a time of heightened concern for “proper” death, confraternal membership functioned to guarantee the presence of brethren at the deathbed. In the absence of familial support, confraternal brethren could “stand in” for the family as fictive kin in the days and moments before death, assuring even the most anxious of a “proper” end and passage to the next life.

Having a confessor present was also of utmost importance, considering one was necessary for the absolution of venial sins. That said, not all people could afford one.

Indeed, a 1310 Florentine statute obliged all parish rectors to confess the dying without regard for material gain.36 Evidently, this was an attempt by the city authorities to guarantee the right to a confessor for all citizens regardless of socioeconomic background. The 1337 will of the Bolognese labourer Albertino di Ser Petro illustrates the issue, as it allocates ten solidi to a certain friar Alberto, his confessor, “for the benefit of his soul…”37 Initially, the implication here appears unclear. The will continues “He leaves to his confessor 3 solidi for the singing of masses. He wants his confessor to arrange for his funeral and burial…”38 The initial ten solidi, since separate from the rest, must therefore account for something other than masses and funeral expenses. It seems then that this labourer allocated ten solidi to his confessor for the absolution of his venial

33 Terpstra, 71.
34 Ibid.
35 Ibid, 73.
36 Strocchia, 89.
37 “Four Bolognese Wills (1337),” 519.
38 Ibid.

sins. This kind of financial gain is likely the kind of abuse the 1310 Florentine statute attempted to suppress.

But not all Italian cities legislated against avaricious confessors and not all people could afford the cost of having one. In such cases the responsibility to provide these services to people extended to confraternities. In Bologna non-wealthy brethren like artisans, labourers, and small merchants could expect access to a confessor before death, a privilege that was otherwise reserved to those who could afford it.39 For some, membership in a confraternity could mean the only chance at having access to this important spiritual tool.

The role of confraternities in the last days of their members, taken with the 1310 Florentine statute, reveals a culture deeply interested in those final crucial moments before death. More importantly, these examples reveal the very public nature of that interest. In Florence, the city authorities recognized the problem caused by avaricious confessors. By demanding payment, these men barred a portion of the population from access to absolution in death. It is significant that the solution resulted in state interference, a clear indication that the Florentine government saw the deaths of its citizens as a concern of the state. In the absence of such state involvement, confraternities could reaffirm the role of the community by carrying the sick from one life to the next. This was done through spiritual encouragement and by providing a confessor to those who could not afford one. With all this in mind, Boccaccio’s remarks on the fact that during the plague some died without a single person present take on an added importance. Over a century later the same sentiment can be glimpsed in Burchard’s account of the death of Alexander VI. In a text

39 Terpstra, 81.

that pays painstaking attention to detail, no family or friends are mentioned in the Borgia pope’s final moments. Burchard’s “attack” by omission thus reveals that even by the sixteenth century, it was still better to die surrounded by people than to die alone.

The experience of the terminally ill in early modern Italy was therefore not an individual one. The stakes were too high. Both spiritual encouragement and a confessor were needed to ensure a swift passage through Purgatory. To this end the help of fellow human beings was necessary. As Erasmus poignantly states regarding the dying man,

“The last battayle is atte hande. The space is shorte. He nedethe spedye counsell.”40

After the deceased had taken their final breath, news of the event quickly reached a wide audience. To that end the importance of bells is well attested. Agnolo di Tura’s plague narrative remarks, “[a]nd none could be found to bury the dead for money or friendship. Members of a household brought their dead to a ditch as best they could, without priest, without divine offices. Nor did the death bell sound.”41 Later, during the 1374 plague, the Florentine government prohibited the customary bell ringing for the dead in the interests of minimizing its negative psychological effects.42 Clearly these bells did not go unnoticed and were an important part of the death process. The sound alone, which likely could be heard across an urban space, brought death itself into the streets and piazzas of Italian cities. A highly public phenomenon, the death of a fellow townsman warranted the attention of all citizens.

40 Desiderius Erasmus, “Preparatione to Deathe,” The English ars moriendi, ed. David William Atkinson (Bern: Peter Lang Publishing, 1992), 55.
41 Agnolo di Tura del Grasso, “The Plague in Siena.”
42 Strocchia, 62.


After Pope Alexander VI’s death, the body was washed and dressed in a brand new white coat.43 Later, Burchard relates “I returned to the city during the night … I ordered the runner Carlo … under penalty of the loss of his office, to inform the whole clergy of Rome, both regular and secular, that they should be at the Vatican on the morrow … to escort the body from the main chapel to St. Peter’s.”44 Preparations were underway for the pope’s funeral procession. As mentioned, the late trecento saw a general transition from a mendicant-based approach to death to a cult of remembrance characterized by pomp and self-memorialization. In this, funerals became a means of displaying honour in a highly public way. This began with the procession, which involved placing the body on a bier to be walked publicly through the streets, ending at the desired church to celebrate the requiem mass. A highly ritualistic event, the procession could be full of symbolism and diverse associations. Here one of the most important aspects was the appearance of the deceased, from which two general distinctions can be made. Elite individuals, such as knights, doctors of law and medicine, and persons buried at public expense, generally reserved the right to lie on the bier uncovered.45 Ordinary people, on the other hand, were required by law to cover the body.46 The privilege of having the body uncovered was therefore something to be flaunted, as it communicated a certain association of honour and social standing. It could also provide an opportunity for conspicuous consumption as the clothing of the deceased became visible to all participants.

43 Burchard, “Pope Alexander VI,” 218.
44 Ibid.
45 Andrew Butterfield, “Social Structure and the Typology of Funerary Monuments in Early Renaissance Florence,” RES: Anthropology and Aesthetics, no. 26 (1994): 60.
46 Ibid.


Bodies that were uncovered naturally needed to be dressed appropriately.

Dressing the body in the most expensive manner could serve to display honour to the public participants of the procession. It also reflected contemporary concerns with the afterlife, as the clothing worn during the burial, and thus the honour of the individual, was believed to carry into the next life with the dead.47 The specific kind of clothing worn on the body was also of great importance. It could showcase the occupation and status in life of the deceased. For someone like a pope, this was crucial. Of Pope Alexander VI, Burchard reports he was dressed in “a short fanon, a beautiful chasuble, and with stockings.”48 In 1419, Pope John XXIII wore a white miter with his cardinal’s hat resting at his feet.49 Knights could also lie on the funeral bier in the full trappings of their status.50

But dressing Pope Alexander VI for his funeral procession apparently did not go according to plan: “His ring was missing and I could not recover it … [I] covered him with an old rug…”51 Although he was dressed in his papal garments, the absence of the ring deprived the pope of the honour of his complete outfit. Further, Alexander VI’s body, according to this account, lay covered by a rug during the funeral procession, denying him the honour of remaining uncovered on the bier. Mention of the “old rug” is also significant. Indeed, the quality of the material covering the body expressed honour and was therefore of great concern at this time, even amongst people like artisans. This is seen in the fact that confraternities often loaned vermillion silk to be used in processions

47 Strocchia, 44.
48 Burchard, “Pope Alexander VI,” 218.
49 Strocchia, 138.
50 Butterfield, 51.
51 Burchard, “Pope Alexander VI,” 218.

to non-wealthy members, augmenting their honour in the absence of sufficient wealth.52

Surely a pope, if uncovered to begin with, would be covered in something more expensive than a mere “old rug.” This, in addition to the lost ring, makes Burchard’s message quite clear; this was not by any means a “normal” or honourable funeral. On the contrary, it was dishonourable.

The pope’s procession began when “he was carried from the main chapel to the center of St. Peter’s … About a hundred-and-forty torches were borne for the most part by the clerics…”53 The importance of wax in the funeral procession at this time is well attested. Indeed, the number of candles that accompanied the body functioned to markedly augment the honour of an individual. As early as the 1350s, Boccaccio lamented “Very few were the dead whose bodies were accompanied to the church by more than ten or twelve of their neighbours … not even carried on the shoulders of honoured … citizens but rather by gravediggers … accompanied by four or six churchmen with just a few candles, and often none at all.”54 Here the small number of candles worries Boccaccio, although a few are apparently better than none.

The number of candles used in processions could even be restricted under local sumptuary laws, as they were in late trecento Florence. Here the demand for large amounts of wax proved quite profitable, and in 1391 alone the city sold seventy-one sumptuary exemptions to the rule.55 Clearly upper-class people saw in wax an air of prestige that could be used within the competitive social climate of late trecento Italy.

Candles were also important to artisans and labourers. In the early Renaissance, corporate groups like confraternities often loaned wax torches to the families of deceased members in the case of inadequate wealth.56 The confraternity of Orsanmichele in Florence, for example, recorded in their inventory of goods a collection of bier cloths, cushions, and large candles mounted on staves, which were loaned to members at death.57 Confraternal membership, therefore, could guarantee labourers and artisans access to this “necessity” just as it could provide a confessor in the moments before death.

52 Strocchia, 85. 53 Ibid. 54 Boccaccio, 12. 55 Ibid, 62.

It is in this context that the ultimate fate of the pope’s hundred and forty candles needs to be understood. Burchard continues, “When the coffin was deposited in the center of the church … some soldiers of the palace-guard attempted to appropriate several torches … [the clergy] fled to the sacristy. And the Pope was left lying there almost alone.” This incident records an attempted attack on the honour of the pope. For these soldiers, the appropriation of the candles could visibly deprive the pope of his honour. For Burchard, its inclusion in the narrative reinforces the dishonourable nature of the pope’s funeral, as the theft deprived the pope of wax. For us, it serves to highlight the importance of candles as a highly public indicator of honour.

The positioning of individuals in the procession line could hold a number of associations. Burchard notes “First came the cross, then the monks of St. Onofrio, the Paulist Fathers, the Franciscans … Then came the body … it was carried by the poor who had stood around it in the chapel …”58 According to Sharon T. Strocchia there existed within funeral processions the double importance of those leading as well as those positioned nearest to the body.59 In this case, the leading cross clearly recognizes the primacy of God. Perhaps most striking is the proximity of the poor to the body. Charged with carrying the funeral bier, the poor played a starring role in this very public spectacle.

56 Strocchia, 34. 57 Ibid. 58 Burchard, “Pope Alexander VI,” 218. 59 Strocchia, 7-8.

Burchard’s mention of them might seem to contradict our reading of this text, as it could serve to highlight the pope’s commitment to Christ-like charity. On the contrary, Burchard explicitly names the poor in order to debase the honour of the pope. Indeed, by the sixteenth century the slow erosion of the mendicant ideal that began in the trecento had created an environment in which this kind of humble display of charity could in fact threaten loss of honour.60

For elite funerals, the location of people in the procession could serve to reinforce familial claims to power. For male funerals this could result in the primacy of agnates to the detriment of extended family networks through marriage.61 On the other hand, female elite funerals often gave primacy of place to members of the household in which the deceased had lived at the time of death.62 Both cases reveal attempts at displaying notions of patrilineage in funeral processions. In much the same way, the location of people in the procession could also be used to strengthen foreign political ties. For example, in ducal Milan foreign ambassadors walked in front of immediate kin as a sign of honour to Milan’s foreign friends.63 Disputes over primacy of place in the procession could even break out between ambassadors. Indeed, in the fifteenth century, the Mantuan ambassador Vincenzo Scalone complained that the ambassador of Modena had preceded him in the procession line at the funeral of Francesco Sforza’s mother.64

60 Terpstra, 62. 61 Strocchia, 13. 62 Ibid, 14. 63 Ibid, 17. 64 Ibid, 18.


Clearly the highly public nature of funeral processions facilitated the use of symbolism to communicate the values of its organizers. Processions then became opportunities for expressing the needs of individual people and families. However, they were not always neatly contained within the sphere of the needs of the individual. Indeed, processions could simultaneously carry associations of a public nature. The case of Vieri de’ Medici will be instructive here. A prominent banker, Medici received a funeral procession in 1395 with all the pomp that had become characteristic of late trecento funerals. His bier was covered in gold cloth and draperies.65 The body was dressed in an elaborate belt ornamented with silver, and he wore a dagger and gold spurs to signal his knighthood in life.66 Surrounded by eight family members dressed in black, the funeral procession began in front of San Tommaso, the Medici family church.67 This funeral at first glance appears no different than those discussed earlier. The golden bier cloth is an example of conspicuous consumption that functioned to augment honour. The accoutrements of the deceased were meant to signal Medici’s knighthood in life. Even the fact that the procession began at the Medici family church served to highlight the centrality of the individual family. This would, perhaps, be a straightforward case if it were not for one little detail. The funeral bier was covered in draperies that depicted the Medici coat of arms alongside those of the commune.68 To participants and onlookers this likely reinforced the ambiguous nature of the funeral. As an “important public figure,” Vieri de’ Medici warranted a quasi-state funeral. The occasion, while no doubt an opportunity to reinforce individual and familial honour, simultaneously concerned the city at large.

65 John T. Paoletti, “Medici Funerary Monuments in the Duomo of Florence during the Fourteenth Century: A Prologue to ‘The Early Medici,’” Renaissance Quarterly 59, no. 4 (2006): 1149. 66 Ibid. 67 Ibid. 68 Ibid.

But there is something larger to take away from the case of Vieri de’ Medici. It speaks more broadly to an important cultural shift that began in late trecento Italy: the cult of man. The rise of humanism in this period is well known. In the middle ages, if generalizations can be made, prevailing philosophical and cultural trends prioritized spiritual over temporal greatness. Trecento humanism represented a shift towards an increased appreciation for earthly accomplishment and virtue. In the area of death, this shift is related to the rise of the cult of remembrance. Indeed, as will be seen, Vieri de’ Medici’s quasi-state funeral is representative of such trends.

In the later middle ages, the lay deceased were relegated to tombs outside the church walls. Indeed, thirteenth-century canon law restricted access to sacred church grounds to the tombs of ecclesiastics and saints.69 Beginning in the quattrocento this started to change. Gradually, laymen were allowed burial within the church, though not without struggle. Here the example of Florence’s cathedral will be useful. Construction of the building began in 1296 using public funds and as a source of communal pride.70 The structure was therefore both religious and civic from its origins.71 Throughout the trecento numerous individuals sought unsuccessfully to be buried within its walls.72 The outside of the cathedral also became cluttered with the private coats of arms of many local families as individuals sought to privatize public wall space as a display of honour.73 In 1385, the resolve to keep the cathedral exterior free of private family emblems resulted in their systematic removal. Within three months this order was rescinded.74 By 1400, burial within the cathedral remained restricted to cathedral canons.75 The new century, however, marked a shift in policy. The quattrocento saw the gradual transformation of the interior of the cathedral into a civic space that increasingly allocated places of honour to defenders of the city and citizens whose talents were sources of civic pride.76 This shift is exemplified by the construction of Leonardo Bruni’s tomb in the 1440s. An important political and cultural figure, Bruni received his own wall effigy within the cathedral, an honour traditionally reserved for prelates.77

69 Butterfield, 59. 70 Paoletti, 1119. 71 Ibid. 72 Ibid.

The case of the Florence Cathedral reveals two important pieces of information. First, it shows that the cult of man had apparently triumphed by the middle of the quattrocento. In this case, the sacred ground of the Florentine cathedral, traditionally reserved for important spiritual figures, turned its attention for the first time to great men of civic and, therefore, secular importance. Indeed, it appears that man had crossed the physical threshold separating the godly from its opposite. At the same time, the cult of man needs to be understood within a public context. As early as the fourteenth century, the cathedral struggled to prevent the appropriation of this public space by private symbols and tombs. That struggle was ultimately unsuccessful, a reflection of the ambiguous relationship between the public and the private. After all, the monuments of men like Leonardo Bruni simultaneously brought honour to the individual as well as to the city. Here the two overlapped.

73 Ibid, 1126. 74 Ibid. 75 Ibid. 76 Ibid, 1156. 77 Butterfield, 61.


One further example will demonstrate this ambiguous relationship. The Colleoni Chapel in Bergamo was built in 1472-76 at the behest of the knight Bartolomeo Colleoni. It was here that the condottiere sought to be buried within an altar at the back of the building. The chapel itself is attached to the side of Santa Maria Maggiore church. The façade contains two busts of Julius Caesar and the emperor Trajan, each with the number of years they ruled, which when added together equal the years Colleoni served as army commander.78 Within the chapel lies Colleoni’s tomb. On top rests a wooden equestrian statue over sculpted lions, clearly a deliberate imitation of the statue next door of San Alessandro at Santa Maria Maggiore.79

The busts of the Roman emperors, considering the clever mathematics of the inscriptions, are clearly meant to stand for Colleoni. At this time Julius Caesar and Trajan were respectively renowned for their military greatness and justice.80 In addition, the wooden statue of Colleoni within the chapel, as a near replica of that of San Alessandro outside Santa Maria Maggiore, evidently was designed to associate the virtues of the saint with its patron.81 All these associations with virtue would have brought honour to Colleoni. At the same time, as Giles Knox notes, virtue could serve to justify the exercise of power.82 The Colleoni Chapel therefore functioned to augment the individual prestige and legitimacy of its patron.

Yet the decision to attach this chapel to the Santa Maria Maggiore church prompts a secondary reading of the building. As a highly important public space, Santa Maria Maggiore carried strong associations of civic identity. The attached Colleoni Chapel therefore reads as a separate but important component of that idea. This is perhaps unsurprising considering Colleoni repeatedly sought to legitimize his role in the political life of Bergamo.83 At the same time, it would be inaccurate to view the chapel solely in terms of Colleoni’s relationship to the city. On the contrary, the chapel itself was a public building. The busts of the Roman emperors were designed to inspire and edify onlookers. The role of funerary monuments for public edification in this period is well attested. Indeed, quattrocento humanist Leon Battista Alberti wrote that the state should build memorials of great men in order to inspire patriotism and virtue among its citizens.84 Such considerations came into play in the construction of the tomb of Leonardo Bruni. Indeed, this tomb was almost surely designed as an exemplum worthy of imitation.85 Further, contemporary tomb slabs often included inscriptions mentioning the deceased’s virtuous and exemplary character.86 Thus, the Colleoni Chapel served a dual function. It could serve to augment the individual honour of its patron and legitimize his power. At the same time, this was a building that displayed its patron’s virtue in the interest of public edification.

78 Giles Knox, “The Colleoni Chapel in Bergamo and the Politics of Urban Space,” Journal of the Society of Architectural Historians 60, no. 3 (2001): 293. 79 Ibid, 304. 80 Ibid, 293. 81 Ibid, 304. 82 Ibid, 296.

In Italy the late trecento rise of a cult of remembrance marked a general shift from a mendicant-based approach to death to one that increasingly centred on the individual. That said, death remained in this period a highly social phenomenon that concerned the community at large. Indeed, the individual experience of death was firmly rooted in the presence of people around the deathbed. In these last crucial moments people were necessary to provide spiritual encouragement and comfort. The sick also needed a confessor present in order to receive absolution. In the absence of both, governments and corporate groups could intervene to provide these “necessities” to people of all socioeconomic levels. To that end, confraternities could guarantee both the presence of people and a confessor around the deathbed. Also, city governments could provide universal access to deathbed absolution through legislating against avaricious confessors. Funeral processions reflected the competitive social environment of Renaissance Italy. Indeed, individuals and families could manipulate them to reflect their own interests and values. To that end, the clothing of the deceased proved an important display of wealth and honour. The amount of wax also functioned as a way to display and quantify the honour of the individual. In the absence of candles, confraternities could loan these for the funerals of less wealthy members. At the same time, funeral processions could evoke notions of communal identity, as the cult of great men like Leonardo Bruni often operated within a public framework. Indeed, funerary monuments that brought honour to an individual often simultaneously evoked notions of civic identity and could be designed as an exemplum for the edification of onlookers. The ambiguous nature of death in this period as both a private and public matter undoubtedly reflects a culture deeply interested in the afterlife. As a crucial transition point, death and its practices required the attention of friends, family, and the entire community. In this way death remained in this time and place much as it had always been: a truly universal concern.

83 Ibid, 297. 84 Butterfield, 56. 85 Ibid. 86 Ibid, 51.


Bibliography

Agnolo di Tura. “The Plague in Siena.” http://www.u.arizona.edu/~afutrell/w%20civ%2002/plaguereadings.html

Bartlett, Kenneth R., ed. The Civilization of the Italian Renaissance: A Sourcebook. Toronto: University of Toronto Press, 2011.

Boccaccio, Giovanni. Decameron. Translated by Mark Musa and Peter Bondanella. New York: New American Library, 1982.

Bornstein, Daniel E. A People’s History of Christianity. Minneapolis: Fortress Press, 2009.

Burchard, Johann. “Pope Alexander VI and His Court.” In The Civilization of the Italian Renaissance: A Sourcebook, edited by Kenneth Bartlett. Toronto: University of Toronto Press, 2011.

Burke, Peter. Culture and Society in Renaissance Italy 1420-1540. Princeton: Princeton University Press, 1972.

Butterfield, Andrew. “Social Structure and the Typology of Funerary Monuments in Early Renaissance Florence.” RES: Anthropology and Aesthetics, no. 26 (1994): 47-67.

Cohen, Elizabeth S. and Thomas V. Cohen. Daily Life in Renaissance Italy. Santa Barbara: Greenwood Press, 2001.

Cohn, Samuel Kline. The Cult of Remembrance and the Black Death: Six Renaissance Cities in Central Italy. Baltimore: Johns Hopkins University Press, 1997.

Duffy, Eamon. The Stripping of the Altars. New Haven: Yale University Press, 1992.

Erasmus, Desiderius. “Preparatione to Deathe.” In The English ars moriendi, edited by David Atkinson. Bern: Peter Lang Publishing, 1992.

Jansen, Katherine L., Joanna Drell and Frances Andrews, eds. Medieval Italy: Texts in Translation. Philadelphia: University of Pennsylvania Press, 2009.

Knox, Giles. “The Colleoni Chapel in Bergamo and the Politics of Urban Space.” Journal of the Society of Architectural Historians 60, no. 3 (2001): 290-309.

Korpiola, Mia and Anu Lahtinen. Cultures of Death and Dying in Medieval and Early Modern Europe. Helsinki Collegium for Advanced Studies, 2015. http://www.helsinki.fi/collegium/journal/volumes/volume_18/Death%20and%20Dying%20in%20Medieval%20and%20Early%20Modern%20Europe.pdf

Madigan, Kevin. Medieval Christianity. New Haven: Yale University Press, 2015.

Marshall, Louise. “Manipulating the Sacred: Image and Plague in Renaissance Italy.” Renaissance Quarterly 47, no. 3 (1994): 485-532.

McManamon, John M. “Marketing a Medici Regime: The Funeral Oration of Marcello Virgilio Adriani for Giuliano De’ Medici (1516).” Renaissance Quarterly 44, no. 1 (1991): 1-41.

Meiss, Millard. Painting in Florence and Siena after the Black Death: The Arts, Religion, and Society in the Mid-Fourteenth Century. Princeton: Princeton University Press, 1978.

Paoletti, John T. “Medici Funerary Monuments in the Duomo of Florence during the Fourteenth Century: A Prologue to ‘The Early Medici.’” Renaissance Quarterly 59, no. 4 (2006): 1117-1163.

Sayers, William. “Florence and Beyond: Culture, Society and Politics in Renaissance Italy. Essays in Honor of John J. Najemy.” Annali D’italianistica 27 (2009): 386.

Strocchia, Sharon T. Death and Ritual in Renaissance Florence. Baltimore: Johns Hopkins University Press, 1992.

Strocchia, Sharon T. “Remembering the Family: Women, Kin, and Commemorative Masses in Renaissance Florence.” Renaissance Quarterly 42, no. 4 (1989): 635-654.

Terpstra, Nicholas. Lay Confraternities and Civic Religion in Renaissance Bologna. Cambridge: Cambridge University Press, 2002.


The Trials of Jamestown: an Investigation of the External Factors Influencing England’s First American Colony

By: Ally Dries

As the first permanent English colony in what would become the United States of America, Jamestown necessarily faced a multitude of setbacks that produced an image of a failed attempt at colonization, an image which has been accepted by many historians as a true representation of this colony. It is commonly assumed that only “by the heroic Captain John Smith”1 was the colony eventually saved, as expressed by L.H. Roper, a theory that erases the individual actions of the settlers themselves. In response to this, Karen Kupperman declares in her book The Jamestown Project, “Jamestown makes us uncomfortable,” referring to the ambiguous ideologies surrounding this early settlement as to the extent of its success.2 Historians have attempted to find the source of the “failure” in Jamestown, yet many overlook the significance of its persistence. When considering the number of complications it faced, Jamestown should be praised as vital to the history of America, not in spite of its faults, but because of them. The social and political conditions of this early expedition set an inherently flawed model for Jamestown before the ships even left the ports of England. The fact that this colony was able to survive its first years in light of the various factors working against it suggests not only the adaptability and resolution of the settlers, but also the significance of Jamestown’s place as the first permanent American colony. Due to the number of external factors influencing this expedition, it can be argued that Jamestown was ultimately a successful colony that set a precedent for all future English colonization in America.

1 L.H. Roper, The English Empire in America, 1602-1658: Beyond Jamestown (New York: Routledge, 2009), 1. 2 Karen Kupperman, The Jamestown Project (Cambridge: The Belknap Press of Harvard University Press, 2007), 1.

The study of Jamestown has produced a diverse collection of historiographical works that attempt to either praise or condemn this early colony. Frank E. Grizzard Jr. and D. Boyd Smith focus on the failures of the settlers, claiming “this was an inverted society, wrongly selected for tasks that did not exist.”3 Idleness is one of the most commonly accepted causes for the struggles of Jamestown during its first years as a settlement. Grizzard Jr. and Smith address the issue of idleness among settlers, making the claim that they “did not adjust, did not work, became dependent and passive, lay down in large numbers, and died.”4 Edmund S. Morgan also draws attention to this standard assumption in his article “The Labour Problem at Jamestown”; however, he notes that “idleness is more of an accusation than an explanation.”5 It would be convenient to accuse the settlers of laziness, but this explanation is too simple and does little to examine the various external factors surrounding this expedition. Tony Williams explains some of these external factors in his book The Jamestown Experiment by considering the early months of this colony as “simply a struggle to survive and endure,” and mentioning the spread of diseases, the falsity of English promotional letters regarding the bountiful nature of the land, as well as tense relationships with the Native Americans.6 Williams refines this failure to focus on the economic aspect of the colony, arguing that “the Crown did not offer any direct financial support” and that “the settlers were largely on their own.”7

3 Frank E. Grizzard, Jr. and D. Boyd Smith, Jamestown Colony: A Political, Social, and Cultural History (Santa Barbara: ABC-CLIO, Inc., 2007), xxviii. 4 Ibid, xxviii. 5 Morgan, 596. 6 Tony Williams, The Jamestown Experiment (Naperville: Sourcebooks, Inc., 2011), xi.

With Williams’ economic argument in consideration, the instinct to blame the individual settlers for the failures of the colony becomes less realistic. Williams goes on to praise the early settlers for their “entrepreneurial spirit that would shape and define the American character,” as the construction and perseverance of the colony was dependent on the traits of the settlers, including “private property, individual initiative, personal incentives to seek profit, and the freedom to pursue one’s own happiness.”8 By shifting the perspective of Jamestown from negative to positive, Williams opens up the opportunity to study Jamestown as a successful colony that was able to overcome the trials of early American life and therefore eliminate the biased view of Jamestown as a complete failure. Karen Kupperman, another historian choosing to regard Jamestown in an optimistic light, argues that this colony was “an enormous accomplishment achieved in a very short period of time,” and addresses the general favour given to New England by stressing the fact that the Pilgrims were able to follow the model set for them by Jamestown’s successes and failures.9 Essentially, as Kupperman states, “all other successful English colonies followed the Jamestown model”; therefore, it cannot be classified as a complete failure as it created “the archetype of English colonization.”10

The social organization of Jamestown created several problems for the colony before the settlers even landed on American shores. Since the exact intentions for this colony were ambiguous—whether it was to be a hunt for gold, a military expedition, or a search for a passage to the Pacific—the selection of men for the journey was ill-fitted for colonial life.11 As Morgan states in his article, “The Virginia Company was loaded with noblemen,” as opposed to men of practical trades such as farming.12 The effects of this lack of skilled tradesmen can be seen in various primary accounts, including “Description of the River and Country,” presumably written by Gabriel Archer, which explains that these first settlers, in order to begin farming, “threw the seeds at random, carelessly” and also requested a “skillful man” to properly farm the produce.13 James Horn argues that the majority of settlers “would have come from London and surrounding regions,” so “they would have been struck by the absence of the familiar,” as “the colony lacked the complexity and density of English society” and “the social gradations and centuries-old traditions and customs that regulated everyday life.”14 The severe contrast between the social conditions these men experienced in England and in the colony would have influenced their motivations as well as their ability to actually perform the work required to successfully run a colony. The lack of skilled labourers contributed to the “unequivocal misery of disease, want, and fear” that occurred during the first two years of the colony, as Ed Southern describes in The Jamestown Adventures: Accounts of the Virginia Colony, 1605-1614.15

7 Ibid, xi. 8 Ibid, xii. 9 Kupperman, 2. 10 Ibid, 3.

The social structure of the colony also coincides with the social expectations placed on the men of Jamestown. By the time England began to experiment with colonization in the New World, stories and legends of Spanish success in the Americas had been circulating throughout Europe for some time. Due to the spirit of competition, as well as a variety of assumptions regarding the New World, as Kupperman explains, “the migrants had been sent over with notoriously unrealizable goals: to find a good source of wealth, preferably precious metals.”16 The riches of the Spanish conquests mythicized the entire New World, creating the false assumption that the land the English were colonizing would be “a land of abundance.”17 This expectation would have significantly influenced the mindset of the settlers as well as the preparations for this expedition. George Percy, an English explorer present during the earliest years of Jamestown, often blamed the Starving Time—the period of extreme hunger during the winter of 1609-1610—on the colony’s lack of supplies.18 Although much of this can be attributed to insufficient preparation for the winter, the expectations regarding the richness of the land in America could have led to insufficient preparation for the trip as a whole, before the settlers even set sail. Regardless of the extent to which expectations played a role in the physical deconstruction of the colony, the social and mental implications of the mythicization of the New World cannot be ignored.

11 Morgan, 607. 12 Philip L. Barbour, The Jamestown Voyages Under the First Charter 1606-1609 (Cambridge: Cambridge University Press, 1969), 10. 13 Ibid, 10. 14 Randall J. Stephens, “Jamestown Redivivus: An Interview with James Horn,” Historically Speaking 8, no. 4 (2007): 7. 15 Ed Southern, ed., The Jamestown Adventures: Accounts of the Virginia Colony, 1605-1614 (Winston-Salem: John F. Blair, Publisher, 2004), 148.

An extension of the social issues disrupting life in Jamestown during its earliest years can be examined through the irrational political structure set in place by English royalty. Due to the reliance on experimentation present during this period of colonization, those organizing the colonies needed to attempt new modes of government that they believed would be effective in a colonial state. In an essay written about the history of Virginia, William Stith presents a list of the orders and authorities granted by King James I regarding the colony. In the document, Stith analyses the political and judicial structure that was to be followed in Jamestown. According to this framework, Stith argues that King James I had “placed the whole legislative power solely on [the presidents and councils] without any representative of the people,” which, as he reasons, is an “extravagant and illegal power” that is “contrary to a noted maxim of the constitution: that all freeman are to be governed by laws, made with their own consent, either in person, or by their representatives.”19 As English citizens, the colonists would have expected to be governed according to the English constitution, so any discrepancy in governmental structure would have caused immediate feelings of isolation and alienation from the British administration.

16 Kupperman, 9. 17 Morgan, 600. 18 Southern, 35.

Although it is true that the colonization of Jamestown was an act that required much experimentation and improvisation on the part of both the settlers and the government, there is little evidence to support the benefits of this particular structure, beyond the personal gain of the governors and the suppression of potential riots. The imbalance of power present within Jamestown would have significantly influenced the lives of the settlers on an individual and social level. As Tony Williams argues, “the authoritarian model of absolute leadership and the communitarian methods of living were fundamentally at odds with each other.”20 The settlers’ lack of political voice combined with the rural community-based lifestyle would have been quite the shock for a group of gentlemen accustomed to the upper-class aristocratic lifestyle they left behind in England. The stark contrast between the cultures of England and the New World would have alienated the settlers from the colony and from each other. When combined with the foreign experience of marginalization felt due to the imbalance of power within the government, this alienation could have been a significant factor contributing to the early years of failure within the colony. Alienation would have deprived the settlers of incentive and contributed heavily to the social unrest of this dark period. The settlers’ ability to overcome the extreme political and judicial disadvantage they faced, as well as the shift in cultural climate, suggests the significance of this colony as a model of perseverance that the other colonies were able to follow. The errors of this political system allowed future colonies to establish more effective models, enabling a shorter adjustment period to life in the New World and less social disorder within the communities of early America.

19 William Stith, “The History of the First Discovery and Settlement of Virginia: Being an Essay Towards the General History of this Colony,” America’s Historical Imprints. Accessed 23 October 2017, 41. 20 Williams, xi.

While these many problems may suggest the failure of the colony, there is a greater number of implicit confirmations of the significance of this colony as a vast success for America. It can be tempting to examine first-hand accounts of the misery of Jamestown as solid evidence for the failures of the colony; however, problems arise with shallow analyses of these documents. An initial issue found in much of the evidence surrounding the errors of Jamestown is the nature of the accounts that have survived. As Kupperman points out, most surviving records “were produced by leaders on both sides of the Atlantic” and consisted primarily of “complaints, special pleading, and excuses.”21

While these accounts are viewed by historians such as Grizzard Jr. and Smith as evidence of failure, it is important to understand both the experimentation and individuality of this expedition. As Kupperman discusses, the “promoters laid plans, but the ordinary people who carried them out… were the ones who had to deal with realities on the ground and

21 Kupperman, 9.

who ultimately founded a successful colony.”22 The various external factors working against the settlers prompted the need for improvisation.

Although this process of trial and error typically resulted in error, the colonists’ later success is all the more impressive when one examines the sheer number of factors working against them. Through its many failures, Jamestown established what Kupperman describes as “the successful archetype,” essentially the effective model of colonization.23 With this model in place, other colonies could follow an example framed by the successes of Jamestown and secured against its failures by experience, a privilege Jamestown itself never received as the first successful English colony in the New World. When analyzing the value of this colony as an archetype of colonization, it can be argued that the study of Jamestown should be framed in light of its successes rather than in spite of its faults.

The discourse surrounding Jamestown remains controversial in attempting to determine the extent to which the colony was successful. An examination of the various external factors influencing this success determines the significance of the colony to the history of America. Further, the perseverance of the settlers in the face of extreme social and political disorder presents an altered image of Jamestown, not as a colony to be accepted in spite of its flaws, but as a group of individuals to be praised for overcoming them. The unfamiliar social situations and unrealistic English expectations, combined with a severe imbalance of political and judicial power, all worked against the creation of this colony. Rather than disintegrate as the previous

22 Ibid, 11. 23 Ibid, 11.


English attempts at settlement had, Jamestown was able to endure these harsh conditions and develop a reliable model for future colonization in the New World.


Bibliography

Barbour, Philip L. The Jamestown Voyages Under the First Charter 1606-1609. Cambridge: Cambridge University Press, 1969.

Grizzard Jr., Frank E., and D. Boyd Smith. Jamestown Colony: A Political, Social, and Cultural History. Santa Barbara: ABC-CLIO, Inc., 2007.

Kupperman, Karen. The Jamestown Project. Cambridge: The Belknap Press of Harvard University Press, 2007.

Morgan, Edmund S. “The Labour Problem at Jamestown, 1607-18.” The American Historical Review 76, no. 3 (1971): 595-611.

Roper, L.H. The English Empire in America, 1602-1658: Beyond Jamestown. New York: Routledge, 2009.

Southern, Ed, ed. The Jamestown Adventures: Accounts of the Virginia Colony, 1605-1614. Winston-Salem: John F. Blair, Publisher, 2004.

Stephens, Randall J. “Jamestown Redivivus: An Interview with James Horn.” Historically Speaking 8, no. 4 (2007): 6-8.

Stith, William. “The History of the First Discovery and Settlement of Virginia: Being an Essay Towards the General History of this Colony.” In America’s Historical Imprints. Accessed 23 October 2017.

Williams, Tony. The Jamestown Experiment. Naperville: Sourcebooks, Inc., 2011.


Science versus ‘Science’: Exploring the Life of Benjamin Banneker in the Context of Thomas Jefferson’s Views in Notes on the State of Virginia

By: Emma Evans

Benjamin Banneker holds significance in the history of the United States of America as a prominent scholarly figure and agent of social change because of the way his life directly and indirectly interacted with the ideology laid out by Thomas Jefferson in his book Notes on the State of Virginia. As a free African American man living in the United States during the eighteenth century, Banneker gained recognition for his work in fields such as mathematics, surveying, and astronomy.1 These achievements earned him a reputation as one of the most accomplished African Americans of the eighteenth century and brought him international recognition.2 By the time of many of Banneker’s early successes, Thomas Jefferson, then secretary of state, had authored his only published book, Notes on the State of Virginia.3 In Query XIV of the book, Jefferson provides a detailed account of his views on race, notably his belief that African Americans lacked mental capabilities equal to those of white Americans.4 Banneker’s career and accomplishments indirectly contradicted this ideology, and his 1791 correspondence with Jefferson directly addressed and argued against Jefferson’s views. Although Benjamin Banneker’s indirect influence through his career and direct interaction via correspondence with Thomas Jefferson did not instigate a new outlook on race relations or change Jefferson’s views,

1 Mark G. Spencer, ed., “Benjamin Banneker,” in The Bloomsbury Encyclopedia of the American Enlightenment, Vol. 1 (New York: Bloomsbury Publishing Inc, 2015), 111-112. 2 LaGarrett J. King, “More Than Slaves: Black Founders, Benjamin Banneker, and Critical Intellectual Agency,” Social Studies Research and Practice 9, no. 3 (2014): 94. 3 Winthrop D. Jordan, White Over Black: American Attitudes Towards the Negro 1550-1812 (Chapel Hill: University of North Carolina Press, 1968), 449-486. 4 Thomas Jefferson, Notes on the State of Virginia (Chapel Hill: University of North Carolina Press, 1955), 137-143.


Banneker’s success in the domain of science and his ability to innovatively frame discussions of race and slavery established his significance in challenging the emerging form of scientific racism supported by Thomas Jefferson in Notes on the State of Virginia, allowing others to use him as a template for social action.

To help comprehend how Banneker’s life is significant to the history of racism in the United States, one must understand the argument Jefferson makes about race and intellect in Notes on the State of Virginia. Jefferson wrote his Notes, a description of his home state of Virginia, for private circulation in European circles.5 When he published the book, the most discussed chapter became Query XIV: Laws, in which he provided a brief but detailed account of his views on African Americans as a race.6 Though this chapter did not focus solely on race, the text made a clear statement that Jefferson viewed African Americans as both physically and mentally inferior. While he states that slavery is a moral sin and suggests emancipation,7 he nonetheless reinforces the concept that the African American race is separate and inferior, seeking to support these claims under the guise of science. Jefferson highlights his view of a distinct separation of the races when he states, “whether it proceeds from the colour of the blood, the colour of the bile, or from that of some other secretion, the difference is fixed in nature.”8 He offers numerous accounts of their supposed inferiority, such as the inadequacy of their women, differences in odour, and their intellect. When discussing the intellectual abilities of African Americans, Jefferson writes “in imagination they are dull, tasteless and anomalous,”9 and “never yet

5 Frederick Binder, The Color Problem in Early National America as Viewed by John Adams, Jefferson and Jackson (The Hague, Netherlands: Mouton & Co, 1968), 48-60. 6 Jefferson, Notes on the State of Virginia. 7 Ibid, 137. 8 Ibid, 138. 9 Ibid, 139.

could I find that a black has uttered a thought above the level of plain narration.”10 These quotations show Jefferson saw African Americans as intellectually weaker than their white counterparts and wanted to spread these views to his intellectual colleagues.11

Jefferson qualifies his views of racial inferiority as “suspicion only.”12 However, his later reactions to opposing views and evidence show that he often disregarded counterarguments, suggesting he privately held firmer convictions. He emphasizes this racial difference as important, writing “the circumstance of superior beauty, is thought worthy attention in the propagation of our horses, dogs, and other domestic animals; why not in that of man?”13 Jefferson later uses this claim of racial inferiority to justify continued slavery, saying, “this unfortunate difference of colour, and perhaps of faculty, is a powerful obstacle to the emancipation of these people.”14 Together these views sculpt stereotypes and pseudo-science into an unproven thesis: that the African American race is scientifically inferior to white Americans, intellectually and physically. Banneker’s life challenges this viewpoint, as his career stands in direct opposition to Jefferson’s ideology.

The historiographical approach often taken when analysing Banneker in relation to Thomas Jefferson uses Jefferson as a representation of the typical culture of the southern states. Frederick Binder, for example, shows Jefferson as an individual sympathetic to emancipation who authored discriminatory views in Notes to appease political critics who disliked his support of abolition.15 Annette Gordon-Reed writes

10 Ibid, 140. 11 Ibid, 141. 12 Ibid, 143. 13 Ibid, 138. 14 Ibid, 143. 15 Binder, 48-60.

about Jefferson as “useful to blacks in a way that highlights the divide between black and white Americans' perceptions of the world.”16 LaGarrett J. King refers to the Jeffersonian dogma of racial inferiority which Banneker challenged as a “popular ideology” of the time.17 New research on Jefferson challenges this premise, instead portraying him as an instigator of ideas about ‘natural’ and ‘scientific’ racial inferiority, ideas that were in fact unpopular in the late eighteenth century.

New research into Jefferson has reflected upon his writings, his influence, and the cultural attitudes of the time, finding that the view of race he promoted in Notes on the State of Virginia was unique and helped encourage an emerging form of racial discrimination in the United States. Some earlier historians did promote this perspective. Winthrop Jordan notes that Jefferson, more than any other individual in early American history, framed the debate around the mental capacity of African Americans, acting as a central point of reference and influence on the topic.18 Jordan points out that the Virginian defence of slavery in the eighteenth century did not use the narrative of racial inferiority prior to Jefferson’s publication of Notes on the State of Virginia.19 After Notes, discussion of racial inferiority increased and was met with immediate criticism. Such criticism came from individuals like Gilbert Imlay, an American businessman and diplomat who refuted Jefferson’s views and wrote “it is certain that Negroes and whites are essentially the same in shape and intellect,”20 showcasing a lack of consensus on racial inferiority. Jordan concluded that Jefferson “remodelled an anti-negro diatribe

16 Annette Gordon-Reed, “Engaging Jefferson: Blacks and the Founding Father,” The William and Mary Quarterly 57, no. 1 (2000): 172. 17 King, 96. 18 Jordan, 436. 19 Ibid, 449-486. 20 Ibid, 441-442.

into a scientific hypothesis, thus effectively depersonalizing a matter which was for him obviously of some personal importance,”21 arguing that Jefferson’s writings framed the race debate in a modern way: through the lens of science rather than personal hatred.

Despite Jordan’s research, Jefferson was still often portrayed as representative of many southern slave owners.

Historians in the twenty-first century have expanded the study of Jefferson’s influence by examining his construction of a new paradigm of race. Robert Forbes describes Jefferson’s attitude toward race as a momentous shift, which he calls the ‘Jeffersonian Turn.’22 He states that Jefferson’s views created a new ideological framework which combined the interests of southern slaveholders with the rhetoric and support of Enlightenment ideals.23 Like Jordan, Forbes notes that views of racial separation and inferiority had been minimal and decreasing in the mid-eighteenth century in the face of increasing criticism of the slave trade, writing “Anglo-American opinion tended to downplay [race’s] significance, and by the time of [Jefferson’s] writing, colour prejudice was viewed as a mark of reaction, ignorance, or infidelity.”24 Bruno Carvalho argues that Jefferson acted as a precursor to the scientific racism that took hold of the United States in the nineteenth and twentieth centuries.25

Understanding how Jefferson impacted views of race and racial ability is important when comparing him to Banneker, because the conventional portrayal downplays the impact of Banneker’s challenge to Jefferson on the overall understanding of race in the United

21 Ibid, 439. 22 Robert P. Forbes, “‘The Cause of This Blackness’: The Early American Republic and Construction of Race,” American Nineteenth Century History 13, no. 1 (2012): 67. 23 Ibid, 65-94. 24 Ibid, 75. 25 Bruno Carvalho, “Writing Race in Two Americas: Blackness, Science, and Circulation of Knowledge in the Eighteenth-Century Luso-Brazilian World and the United States,” The Eighteenth Century 57, no. 3 (2016): 304.


States. Banneker’s life challenged Jefferson’s views, as his career and accomplishments in science indirectly opposed Jefferson’s central argument: that African Americans were intellectually inferior and unequal by nature. If Jefferson’s views are portrayed as just another typical voice, Banneker’s criticism in his correspondence and his indirect influence through his career are only as significant as any other early critique of race in America. But when Jefferson is understood as an instigator of an emerging and powerful form of racial discrimination, Banneker, as one of the first formal challengers of this modern ideology, set precedents for addressing similar forms of racist rhetoric. When research comparing Banneker and Jefferson is re-examined through the lens of Jefferson’s role in this paradigm shift, Banneker’s action as the first political and personal engagement of a black man with Jefferson takes on greater importance. He was the first individual to converse with one of the men shaping the racism of the future and to challenge his views. Acknowledging this significance draws attention to how important Banneker’s scientific success was and to the unique ways Banneker adapted his critiques to counter modern racism.

Banneker’s scientific achievements in astronomy formally challenged the pseudoscience of racial inferiority by demonstrating the intellectual capability of African Americans. This demonstration directly contradicted Jefferson’s thesis of African American inferiority. Beyond exhibiting his intellectual capabilities, he did so in science, the very field that served as the basis for the claims against his race. As Jordan notes in his book

White Over Black, “if evidence of intellect was sought what more likely field than science.”26 While Jefferson acknowledges that African Americans have some intellectual

26 Jordan, 455.

ability (in particular, African American poets and other creative individuals), he states that “in memory they are equal to whites; in reason much inferior.”27 In contrast,

Banneker’s six almanacs received international recognition, had numerous reprints,28 and provided a solid example of reason and intellectual thought. The public and scholars agreed that Banneker displayed some of the highest levels of intelligence and challenged common conceptions about African Americans’ scholastic capabilities. James McHenry, a surgeon and member of two presidential cabinets, wrote the introduction for Banneker’s first almanac assuring the public of its accuracy and the authenticity of Banneker’s efforts. In the introduction McHenry writes “I consider this negro as a fresh proof that the powers of the mind are disconnected with the colour of the skin… in every civilized country we shall find thousands of whites, liberally educated, and who have enjoyed greater opportunities of instruction than this negro, his inferior of those intellectual acquirements and capacities.”29 This passage asserts McHenry’s belief that Banneker challenged the emerging rhetoric of racial inferiority and that he possessed greater intellectual capability than some white individuals. Creating this distinction exhibits

Banneker as intelligent rather than merely intelligent for his race, promoting a broader equality and challenging the kinds of views proposed by Jefferson in Notes.

Public support often mirrored attitudes like McHenry’s, as when The Georgetown

Weekly Ledger wrote about Banneker’s abilities in March of 1791, saying they “already prove that Mr. Jefferson’s concluding that race of men were void of mental endowment was without foundation.”30 These attitudes and writings show that Banneker’s success

27 Jefferson, Notes on the State of Virginia, 139. 28 King, 88-105. 29 Silvio A. Bedini, The Life of Benjamin Banneker (New York: Scribner, 1972), 180. 30 Jordan, 450.

acted as evidence of African Americans’ equal capability for knowledge and provided a strong argument against the points Jefferson proposes in Notes, which makes it surprising that

Jefferson maintained his views following Banneker’s successes and correspondence.

Banneker’s accomplishments did not change the minds of all opponents and failed to reshape views on race and slavery. His correspondence and almanacs, furthermore, did not change Jefferson’s views on African American intellect. Jefferson responded to Banneker’s letter optimistically, writing “nobody wishes more than I do to see such proofs as you exhibit.”31 However, when privately writing to his friend Joel Barlow, Jefferson’s tone changed: he wrote that Banneker’s work was not “without suspicion of aid from Ellicott.”32 This attitude mirrors his views of racial inferiority from Notes and shows that his opinion remained unchanged despite Banneker’s success. Despite no formal shift in attitude, Banneker’s work and writing provided a strong basis for social action against the emerging racism of the future.

Banneker has often been invoked since his time as a role model for taking action and challenging racism. Frederick Douglass encouraged publishing Banneker’s biography, stating “my newly emancipated people… are especially in need of just such examples of mental industry and success as I believe the life of Banneker furnish.”33 African

American scholar Lerone Bennett Jr. wrote about Banneker as a man with the potential to inspire the social actions of others, saying "For men who appeal from the gutters to the stars, for men who stand up and protest, no matter what the odds, the star-gazer remains a

31 “From Thomas Jefferson to Benjamin Banneker, 30 August 1791,” Founders Online, National Archives, 2017. 32 Jordan, 454. 33 Angela G. Ray, “‘In My Own Hand Writing’: Benjamin Banneker Addresses the Slaveholder of Monticello,” Rhetoric & Public Affairs 1, no. 3 (1998): 387.

persuasive and articulate example."34 As noted by Angela G. Ray, “as a rhetor, Banneker failed to persuade in the immediate instance, but he articulated a speaking persona that would help establish the conditions for future rhetorical expressions.”35 One technique that Ray credits Banneker for innovating is invoking the language of those in power, such as

Christianity and ideas of justice, equality, and freedom.36 He aligned the push for social action with dominant religious views and the desires that sparked the revolution to create unity. In his letter to Jefferson he invokes religion as a unifying force saying, “That one universal Father hath given being to us all, and that he hath not only made us all of one flesh, but that he hath also without partiality afforded us all the Same Sensations, and endued us all with the same faculties … however diversifyed in Situation or colour, we are all of the Same Family.”37 He also evokes images of independence, writing “that you publickly held forth this true and invaluable doctrine, which is worthy to be recorded and remember’d in all Succeeding ages. ‘We hold these truths to be Self evident, that all men are created equal.’”38 Using the ideas of independence and religion, Banneker draws attention to Jefferson’s hypocrisy writing "but Sir how pitiable is it to reflect…that you should at the Same time be found guilty of that most criminal act, which you professedly detested in others, with respect to yourselves.”39 This method of challenging racism appears later in American history with Martin Luther King Jr. In his social action, King uses similar imagery of religion and the founding principles of the United States to

34 Ibid, 388. 35 Ibid, 400. 36 Ibid, 387-405. 37 “To Thomas Jefferson from Benjamin Banneker, 19 August 1791,” Founders Online, National Archives, 2017. 38 Ibid. 39 Ibid.

challenge ideas based upon notions of racial inferiority, often in arguments that mirror those made by Banneker.

In conclusion, Benjamin Banneker's career, accomplishments, and correspondence with Jefferson are more significant for understanding race in America when considered alongside Thomas Jefferson’s introduction of a modern form of racism. Banneker was an early challenger of this novel form of racial discrimination and showed how to confront it. His life story and unique rhetorical techniques became important inspirations for later social action movements. With the resurgence in the United States of racial discrimination like that spread by Jefferson, the life of Banneker serves as an inspiration for facing this rhetoric and striving for equality.


Bibliography

Bedini, Silvio A. The Life of Benjamin Banneker. New York: Scribner, 1972.

Binder, Frederick. The Color Problem in Early National America as Viewed by John Adams, Jefferson and Jackson. The Hague, Netherlands: Mouton & Co, 1968.

Carvalho, Bruno. “Writing Race in Two Americas: Blackness, Science, and Circulation of Knowledge in the Eighteenth-Century Luso-Brazilian World and the United States.” The Eighteenth Century 57, no. 3 (2016): 303-324.

Forbes, Robert P. “‘The Cause of This Blackness’: The Early American Republic and Construction of Race.” American Nineteenth Century History 13, no. 1 (2012): 65-94.

“From Thomas Jefferson to Benjamin Banneker, 30 August 1791.” Founders Online, National Archives, last modified 29 June 2017. http://founders.archives.gov/documents/Jefferson/01-22-02-0091.

Gordon-Reed, Annette. “Engaging Jefferson: Blacks and the Founding Father.” The William and Mary Quarterly 57, no. 1 (2000): 171-182.

Jefferson, Thomas. Notes on the State of Virginia. Chapel Hill: University of North Carolina Press, 1955.

Jordan, Winthrop D. White Over Black: American Attitudes Towards the Negro 1550-1812. Chapel Hill: University of North Carolina Press, 1968.

King, LaGarrett J. “More Than Slaves: Black Founders, Benjamin Banneker, and Critical Intellectual Agency.” Social Studies Research and Practice 9, no. 3 (2014): 88-105.

Ray, Angela G. “‘In My Own Hand Writing’: Benjamin Banneker Addresses the Slaveholder of Monticello.” Rhetoric & Public Affairs 1, no. 3 (1998): 387-405.

Spencer, Mark G., ed. “Benjamin Banneker.” In The Bloomsbury Encyclopedia of the American Enlightenment. Vol. 1. New York: Bloomsbury Publishing Inc, 2015. 111-112.

“To Thomas Jefferson from Benjamin Banneker, 19 August 1791.” Founders Online, National Archives, 29 June 2017. Accessed 14 March 2018. http://founders.archives.gov/documents/Jefferson/01-22-02-0049.


Art in Early Modern Italy: Artemisia Gentileschi and Caravaggio

By: Joslin Holwerda

During the early modern period, Italy became the centre of Europe’s most notable cultural movements. In the late sixteenth century, a new era of art and culture began to emerge. The Baroque period originated in Italy during a time of intense religious, social, and political turmoil in Europe.1 This new era of painting, architecture, and music gave artists and patrons the means to express their abilities and interests in increasingly extravagant ways. Unlike those of past movements, paintings and sculptures featured dramatized depictions of historical and religious events, created to provoke emotion. Many of the paintings from this time became distinguished features of Rome, displayed throughout the city’s buildings and churches.

With the Baroque era came new artists who had the passion and necessary skill to thrive as professional painters. Rome, in particular, became home to a diverse array of painters whose works and achievements came to define the Baroque period. Among the many painters from this era were two whose careers and art played pivotal roles in the prosperity and prominence of Rome’s Baroque period. Michelangelo Merisi da

Caravaggio and Artemisia Gentileschi had illustrious careers as painters in Rome, and their lives and works came to reflect broader gender relations in seventeenth-century

Italy.

Michelangelo Merisi da Caravaggio was one of Rome’s most celebrated, and controversial, artists of the seventeenth century. Though he began his career in poverty,

Caravaggio quickly developed a following in Rome’s art circles. His success is

1 Cait Caffrey, "Baroque Period," 2016, Salem Press Encyclopedia Research Starters, EBSCOhost.

demonstrated by the considerable number of commissions he began receiving from notable, wealthy figures throughout Italy. While he continued to thrive as a professional artist, Caravaggio’s personal life began to affect his art. His notoriously violent personality, which led to an extensive criminal record, influenced the mood of his paintings. Caravaggio often faced backlash for his violent and disgraceful portrayals of biblical scenes. Rather than preventing him from thriving as an artist, however, Caravaggio’s controversial paintings and violent personality established him as

Italy’s greatest painter of the Baroque period.

Caravaggio’s career in Italy differed greatly from Gentileschi’s. During her career of over forty years, Artemisia Gentileschi established herself as one of Italy’s leading female painters. Despite her talent, she was not given the same opportunities as her male contemporaries, putting her career at a clear disadvantage. However, her lack of professional art training did not stop her from pursuing a career as an artist. While she lived most of her life in poverty and in debt, Gentileschi was able to make a name for herself internationally.2 Numerous high-ranking patrons supported her art, even when most considered it drastically different from other work of the Baroque era, especially that of male artists. Gentileschi’s predominant use of female protagonists, depicted mostly as dominant figures, is often attributed to her experiences in life as a woman and as a rape survivor. Regardless of her success, Gentileschi’s reputation, among both contemporaries and scholars today, has turned on her gender rather than her accomplishments.

2 Elizabeth Cropper, "New Documents for Artemisia Gentileschi's Life in Florence," The Burlington Magazine 135, no. 1088 (1993): 760.


Although there was a distinct female presence in Roman streets and urban spaces, there was a clear dichotomy between genders in the seventeenth century. Despite being a renowned painter whose accomplishments were internationally celebrated, Gentileschi experienced professional and personal disadvantages during her career that Caravaggio never encountered as a male painter. The professional challenges that Artemisia

Gentileschi faced as a woman in the seventeenth century reflect the broader gender relations of early modern Italy.

By the early seventeenth century, Rome was a newly transformed metropolitan city with a passion for the arts.3 In years prior, the papal city had experienced processes of urban reform, led by Pope Sixtus V. Sixtus’ plan was to improve the existing urban fabric of Rome and create a transparent and modern city.4 Large new architectural spaces were created across Rome, with wide straight avenues encircling the dense urban core of the city.5 Existing palaces, churches, and monasteries were expanded and decorated in ways that conformed them to the city’s extravagant new look.6 Rome’s urban expansion, occurring in the last years of the sixteenth century, coincided with Europe’s Counter-

Reformation. This meant new churches were emerging throughout Rome, changing the overall landscape and architecture of the city.

In order to accomplish everything that was set out in Sixtus’ urban reform plan, groups of workers, builders, and sculptors were drawn to Rome to begin the immense task

3 Mary D. Garrard, Artemisia Gentileschi: The Image of the Female Hero in Italian Baroque Art (Princeton: Princeton University Press, 1988), 13. 4 Charles Burroughs, "Opacity and Transparence: Networks and Enclaves in the Rome of Sixtus V," RES: Anthropology and Aesthetics, no. 41 (2002): 57-59. 5 Garrard, Artemisia Gentileschi, 13. 6 Elizabeth S. Cohen, “To Pray, To Work, To Hear, To Speak: Women in Roman Streets c. 1600,” Journal of Early Modern History 12 (2008): 296.

of renovating Rome.7 Entire communities of artists and builders, mostly from northern

Italy, took over Rome as it became the epicentre of cultural change and opportunity.8 The architectural transformation of the city was quickly followed by an increase in paintings, on both canvas and in fresco, specifically commissioned for the palaces and churches of

Rome.9 The artistic endeavours of the early seventeenth century became part of the newly emerging Baroque period. The Baroque style originated in Italy and quickly became popular among painters, sculptors, and architects.10 Most of the art commissioned in the later years of Sixtus’ urban transformation was characterized by the Baroque style. This style largely focused on ecclesiastical art, with religious themes and religious events depicted in most paintings and sculptures.11 Scenes were created, and often commissioned, in order to reiterate the importance of the church and religious values following the Reformation.12 A rising demand from private buyers in the elite and middle classes also led to genre paintings, depicting everyday life in landscapes and still-lifes, although biblical stories remained the most desirable.13

Rome’s urban renewal created a modern environment with a much denser population, offering the city and its people increasing mobility. Roman streets were dangerous urban spaces, but they were also full of opportunity and allowed people to build social connections.14 Most notably, urban streets gave women in Rome the opportunity to increase their mobility in society. Despite males dominating urban places,

7 Stefano Zuffi, Caravaggio: Masters of Art (Munich, New York: Prestel, 2012), 14. 8 Alfred Moir, Caravaggio (New York: H.N. Abrams, 1982), 15. 9 Garrard, Artemisia Gentileschi, 13. 10 Caffrey. 11 Ibid. 12 Ibid. 13 Federico Etro, et al, "The Labor Market in the Art Sector of Baroque Rome,” Economic Inquiry 53, no. 1 (2015): 367. 14 Cohen, “To Pray, To Work, To Hear, To Speak,” 298.

women regularly made use of them as well. Working-class women spent a great deal of time in urban spaces for various reasons, but for most it was essential because it allowed them to fulfill work obligations.15 Such work included art, and the changes occurring in Rome gave artists the necessary opportunities to make art a viable career.

However, most female artists were not given the same opportunities as men to achieve success. In 1559, Annibale Caro, an Italian writer and poet, called painting “the profession of gentlemen,” expressing the general reluctance of men to acknowledge women’s artistic skills.16 Along with being labelled as incapable and essentially excluded from the profession, women were barred from guilds, academies, and formal artistic training.17 Despite the disadvantages of their gender, some women in early modern Italy did succeed as artists, although they were often overshadowed by the careers of male artists.

Caravaggio was one such artist to benefit from the urban and cultural transformations in Rome in the seventeenth century. Born Michelangelo Merisi on 29 September 1571, he grew up in Milan in a middle-class family.18 His father, Fermo Merisi, and his father’s new wife, Lucia Aratori, managed the estate and properties of the Marchese of Caravaggio, a well-known farming community north of Milan.19 In 1576, when Caravaggio was only six years old, one of Europe’s deadliest plagues made its way into Italy, causing tens of thousands to die. Fermo Merisi, along with Caravaggio’s

15 Ibid, 306-307. 16 Babette Bohn and James M. Saslow, “From Oxymoron to Virile Paintbrush: Women Artists in Early Modern Europe,” in A Companion to Renaissance and Baroque Art (Chichester, West Sussex: Wiley- Blackwell, 2013), E-book edition, chap. 11. 17 Ibid. 18 Zuffi, 10. 19 Ibid.

grandfather and uncle, were all victims of the plague in late 1577.20 Caravaggio remained in Milan, living with his mother and four siblings until November of 1590, when Lucia Merisi died.21 The now orphaned Merisi children were left deeply in debt, prompting Caravaggio to sell most of his inherited land in the following years, though he likely spent the money he received unwisely. In 1592, with the intention of moving to Rome, Caravaggio quickly divided his inherited property a final time between his two remaining siblings, leaving him with an inheritance of 393 Imperial pounds.22 This was enough to support him for the next few years, and Caravaggio left for Rome, arriving around 1592 at the age of twenty-one with the desire to become a painter.

Before this move, while growing up in Milan, Caravaggio received professional artistic training through an apprenticeship with the Milanese painter Simone Peterzano.23 The apprenticeship began when Caravaggio was twelve; he worked for four years in Peterzano’s home while being taught how to paint. During this time, Peterzano was involved in decorating many churches in the Renaissance styles of the late sixteenth century. Peterzano used his own paintings, as well as the works of Titian and Leonardo da Vinci, to teach Caravaggio the fundamentals and basic principles of painting in the Milanese style.24 Despite being a difficult student who often refused to learn techniques such as fresco painting, Caravaggio was a well-educated and gifted painter.

In 1592, after the death of his mother, Caravaggio moved to Rome, where urban renewal had made the city increasingly appealing to artists. By the late sixteenth century,

20 Howard Hibbard, Caravaggio (New York: Harper & Row, 1982), 12. 21 Zuffi, 14. 22 Hibbard, 5. 23 Zuffi, 13. 24 Hibbard, 4.

the city had established itself as the epicentre for art, but Caravaggio did not immediately find success. During his early years in Rome, he was extremely poor and worked for minor artists, doing demeaning work in order to make money.25 He also began showing his paintings in public outdoor exhibits, which was considered the lowest tier of the art market.26 His early works were painted with the sole intention of selling in large quantities, as he did not yet have commissions. The Roman streets, and those he encountered in his everyday life, often influenced these paintings. His first patron, the influential Cardinal Francesco Maria del Monte, took note of Caravaggio’s paintings, purchased several, and encouraged him to pursue a career in art. The Cardinal also offered him accommodations, giving Caravaggio access to the Roman aristocracy and other wealthy patrons.27 From this point on, Caravaggio became Rome’s newest and most impressive painter.

Caravaggio’s first public commission, in 1599, was for two paintings, including an altarpiece, for San Luigi dei Francesi, the national church of France in Rome.28 His instructions were to create pieces on the theme of St. Matthew. Caravaggio’s first depiction of St. Matthew, however, was rejected by the church for its crude realism; his second attempt was accepted, and he received four hundred scudi in return.29 The Calling of St. Matthew was Caravaggio’s final painting in the style he had developed while in Rome; from this point he devoted his career almost exclusively to religious themes and subjects.30

25 Ibid, 6-8. 26 Ibid, 16-17. 27 Zuffi, 16. 28 Ibid. 29 Hibbard, 91-92. 30 Zuffi, 16-20.


Caravaggio received almost constant commissions in the early seventeenth century, becoming well-known both in Roman art circles and across Europe. He was commissioned by several patrons to decorate chapels and churches throughout Rome.

Despite becoming a renowned painter, Caravaggio was also one of Italy’s most controversial. Around 1605, he received a commission from the Archconfraternity of the Papal Grooms to complete a painting for St. Peter’s Basilica. The painting was to show Mary, with Jesus standing on the head of a serpent, a symbol of sin, accompanied by St. Anne, the mother of Mary. Instead, Caravaggio chose to paint Mary with her foot also on the serpent’s head, to show that she had overcome Original Sin with the help of her son.31 The Archconfraternity took issue with Caravaggio’s disregard for the original message and with his realistic depiction of the holy figures. The nakedness of Jesus, the elderly portrayal of St. Anne, and the fact that Caravaggio had a prostitute model for Mary all played a part in the Archconfraternity’s rejection of the Madonna dei Palafrenieri (Figure 2.1).

Controversy was a continuous theme in Caravaggio’s career as an artist. Patrons often refused to display his work, yet this backlash, together with his unwillingness to adapt commissioned works, had no lasting effect on his career, and he continued to receive commissions from some of Italy’s most notable figures. The professional challenges Caravaggio faced were largely self-inflicted and did not derive from social restrictions.

31 Ibid.


The style and themes depicted in Caravaggio’s paintings changed throughout his years in Italy. In the early years of his career, those he encountered on the streets of Rome influenced much of his art. The Fortune Teller, The Cardsharps, and, later, Death of the Virgin all demonstrate the impact that ordinary life in Rome had on Caravaggio’s style. In The Fortune Teller, a gypsy woman is telling the fortune of a well-dressed young man (Figure 2.2).

While reading his palm, the woman slips a ring off his finger without him noticing. This painting of an event from everyday life is a reminder not to be fooled by attractive deceptions.32 Caravaggio painted The Fortune Teller in 1594, soon after arriving in Rome and experiencing the city’s streets for the first time.33 The Cardsharps, also painted around 1594, shows two young boys playing cards, one of them a cardsharp (Figure 2.3). The cardsharp attempts to trick the unsuspecting boy with the help of an older man, most likely his accomplice.34 The tear in the older man’s gloves and the feather in the cardsharp’s hat lend the scene a realistic impression. Both The Fortune Teller and The Cardsharps depict realistic scenes of Roman street life, and they demonstrate Caravaggio’s interest in the drama of everyday life on the streets of Rome, an interest reflected in several of his paintings.

In the early 1600s, Caravaggio established himself as a religious artist, focusing on biblical scenes and religious themes. However, his carefree handling of religious subjects shocked Rome and his patrons. The Death of the Virgin, for example, was a commission for the church of Santa Maria della Scala (Figure 2.4).35 The painting shows the

32 Ibid, 36. 33 Ibid. 34 Ibid, 38. 35 Ibid, 104.


Apostles and Mary Magdalene grieving together over the deathbed of Mary. Caravaggio had the ability to bring reality to his religious paintings, reflected in the figures and setting of this work. The overly realistic portrayal of Mary was one of the reasons why The Death of the Virgin was controversial: her chalk-white, bloated figure and her bare feet hanging off the bed upset the monks who had commissioned it.36 In addition, Caravaggio again used a prostitute to model for Mary, increasing the offensiveness of the painting. And yet, despite his controversial style and paintings, Caravaggio’s career continued to be a success.

Since his time in Milan, Caravaggio was known to have a violent personality, with frequent confrontations with the authorities. On 28 May 1606, he was involved in a violent altercation with a group of men over a disputed wager on a tennis match.37 During the fight, Caravaggio killed Ranuccio Tomassoni and subsequently fled Rome in order to escape execution. For the next several years, Caravaggio lived in Naples, where he continued his successful career as a commissioned painter. He painted for the Knights Hospitaller of St. John in Malta in 1608, receiving an honorary knighthood.38 After a brief imprisonment in Malta for attacking a knight, Caravaggio travelled to Syracuse and elsewhere in Sicily, and then to Naples, continuing to receive commissions while still wanted for his crimes in Rome. In early 1610, Ferdinando Gonzaga of Mantua became cardinal and began soliciting a pardon for Caravaggio.39 However, on 18 July 1610, on the shore of the Tyrrhenian Sea, while travelling back to Rome in hopes of

36 Ibid. 37 Hibbard, 206. 38 Ibid, 228. 39 Ibid, 254.

reconciliation, Caravaggio died.40 Despite his violent and controversial life, Caravaggio’s successful career and art represented the Baroque period and thoroughly transformed Italian painting. That Caravaggio, controversial as he was, faced so few social restrictions and professional challenges is reflective of the gender hierarchy in Italy at this time.

In contrast, Artemisia Gentileschi—despite her significant accomplishments and illustrious career—encountered great adversity as a female artist in seventeenth-century Italy. The personal and professional disadvantages she faced, especially when compared to male artists such as Caravaggio, had an immense impact on the course of her career.

Artemisia Gentileschi was born on 8 July 1593 in Rome to Orazio Gentileschi and his wife Prudentia Montone.41 As the only daughter of Orazio, a moderately successful painter, and growing up in Rome surrounded by artists, Gentileschi quickly developed an interest in the arts and in painting. However, women in early modern Italy were denied the normal paths to artistic careers. Along with being excluded from apprenticeships, women were not allowed to train with more than one established master, to travel, or to hold membership in guilds.42 Instead, like many aspiring women artists of the time, Gentileschi received her artistic training from her father.43 By nineteen, with her extraordinary talent and extensive informal training, Gentileschi was assisting in her father’s workshop and helping him on major projects, including the decoration of several of the new palazzi being built in early seventeenth-century Rome.44 She also began

40 Ibid. 41 Garrard, Artemisia Gentileschi, 13. 42 Ibid. 43 Bohn, chap. 11. 44 Griselda Pollock, Differencing the Canon: Feminist Desire and the Writing of Art's Histories (New York: Routledge, 1999), 103.

working on her own paintings, the first most likely being Susanna and the Elders in 1610, when she was only seventeen years old.45

Gentileschi’s exclusion from professional training, while putting her at a disadvantage, did not prevent her from continuing to learn. As a professional painter in Rome, Orazio had access to many talented colleagues who were willing to teach Gentileschi the fundamentals of painting. In 1611, Orazio arranged for his daughter to study perspective with his friend and colleague Agostino Tassi.46 In May of 1611, while working on a portrait at her home, Tassi began flirting with Gentileschi, and when she rejected his advances, he raped her.47 Though she fought back against the assault, Tassi afterwards promised marriage to mollify her, and she accepted.48 While the loss of one’s virginity before marriage did not completely destroy a girl’s life in seventeenth-century Rome, premarital sex was still religiously and socially prohibited.49 Gentileschi’s marriage prospects would have declined significantly after the forcible loss of her virginity; thus a promise of marriage, even from her rapist, would have seemed like her only option.

Tassi and Gentileschi continued a sexual relationship until March 1612, when Orazio brought a suit against Tassi for the rape of his daughter.50 The trial lasted seven months, with Tassi and several other alleged accomplices testifying against the charges. During the trial, Tassi made a series of counter-allegations against Gentileschi’s chastity in an attempt to make her seem immoral, claiming that “[Orazio] stated to me

45 Garrard, Artemisia Gentileschi, 13. 46 Pollock, 105. 47 Elizabeth S. Cohen, "The Trials of Artemisia Gentileschi: A Rape as History," The Sixteenth Century Journal 31, no. 1 (2000): 49. 48 Ibid. 49 R. Ward Bissell, Artemisia Gentileschi and the Authority of Art: Critical Reading and Catalogue Raisonné (University Park, PA.: Pennsylvania State University Press, 1999), 17. 50 Mary D. Garrard, “Testimony of the Rape Trial of 1612,” in Artemisia Gentileschi: The Image of the Female Hero in Italian Baroque Art (Princeton: Princeton University Press, 1988), 403.

that by saying that his daughter was leading a bad life he meant that she was a whore, and that he didn’t know how to remedy this.”51 Along with facing Tassi’s accusations, Gentileschi voluntarily submitted on several occasions to torture with the sibille, cords tightened around the fingers, in order to prove that she was being truthful in her testimony.52 Ultimately, it is likely that Tassi was convicted but not sentenced, and he was released after spending only nine months in prison.53 Gentileschi’s rape and trial were pivotal moments in her life. While the rape did not damage her reputation, it was a turning point in her career as an artist. Several of Gentileschi’s paintings offer portrayals of vulnerable women heroically resisting attacks, of strong and assertive women, and of resistance to the patriarchy.54 Instead of repressing the rape and trial, Gentileschi used these experiences as inspiration for her paintings.

As a professional female painter in seventeenth-century Italy, Gentileschi encountered both resistance and acceptance. After the rape trial concluded, she moved to Florence with her new husband to pursue a career separate from that of her father.55 By 1615, contemporaries described Gentileschi as a well-known artist in Florence.56 Her ready acceptance into Florentine art circles can be explained by her position as protégée of Michelangelo Buonarroti the Younger, a prominent Florentine and long-time patron.57 Gentileschi’s first documented commission was for

51 Garrard, “Testimony of the Rape Trial of 1612,” 447. 52 Ibid, 404. 53 Ibid, 406. 54 Cohen, "The Trials of Artemisia Gentileschi,” 47. 55 Keith Christiansen et al, Orazio and Artemisia Gentileschi (New York: Metropolitan Museum of Art; New Haven: Yale University Press, 2001), 313. 56 Garrard, Artemisia Gentileschi, 34. 57 Mieke Bal, The Artemisia Files: Artemisia Gentileschi for Feminists and Other Thinking People (Chicago: University of Chicago Press, 2005), 119.


Buonarroti’s home in 1615, for which she received a substantial sum.58 Throughout her career, Gentileschi received numerous commissions from prominent figures across Europe, including an extended professional relationship with the Grand Duke and Duchess of Florence, who supported her career for many years until the Duke’s death in 1621.59 By the time she returned to Rome in the 1620s, following the Duke’s death, numerous notable artists and members of Roman society had pronounced her a marvel.

For many years after 1621, Gentileschi periodically travelled throughout Italy, and briefly to England, gaining immense international recognition for her paintings. Her lack of large fresco commissions did not diminish her fame.60 In personal letters from 1635, she stated that she had previously received commissions from several European monarchs, including the kings of France and Spain and, most notably, King Charles I of England.61 In 1638, Gentileschi resided in England at the court of Charles I, where she worked alongside her father, for a final time, to complete his commission to decorate the ceiling of the Queen’s House at Greenwich.62 Gentileschi returned to Italy around 1642 and, despite facing professional limitations throughout her life, remained for nearly a decade a remarkably successful painter whose determination and skill attracted prestigious patrons from across Europe.

And yet, despite her success, Gentileschi faced financial and professional struggles throughout her life. Her years in Florence are the best documented of her career, and they show that money was a significant concern for her. Gentileschi had a constant problem with debt, primarily as a result of not being able to

58 Garrard, Artemisia Gentileschi, 34-35. 59 Keith Christiansen et al, Orazio and Artemisia Gentileschi, 228. 60 Ibid, 271. 61 Garrard, Artemisia Gentileschi, 90. 62 Ibid, 112.

pay the high costs of materials, including models for her paintings.63 These debts also led to many years of legal troubles, though the Grand Duke attempted to help her find ways to reduce them.64

On several occasions, Gentileschi sent letters to patrons asking for an advance on their commissions. In 1649, she asked for a deposit from Don Antonio Ruffo, her primary patron in the later years of her career, while also pleading with him not to reduce the price of his commission. “I cannot give it to you for less than I asked,” she wrote, “as I have already overextended myself to give the lowest price.”65 In her letter, Gentileschi attempted to prove to Ruffo that her painting was worth the price she had asked. This letter, sent near the end of her career, demonstrates how Gentileschi still had to justify her talent to patrons, despite being a well-known professional female artist in seventeenth-century Italy.

While living in Rome, Gentileschi could not become a member of the Roman Accademia di S. Luca, which prohibited women from attending any of its meetings or private instructional programs.66 After moving to Florence and emerging as a successful artist in the community, however, Gentileschi used her personal connections to gain entry into the Florentine Accademia del Disegno. In 1616, she became the first female member of the Florentine Academy, which helped artists develop and sustain their notability.67 Despite facing professional challenges in her career, Gentileschi’s

63 Ibid, 135. 64 Cropper, 760. 65 Artemisia Gentileschi to Don Antonio Ruffo, 1649, in Early Modern Europe, 1450–1789, ed. Merry E. Wiesner-Hanks (Milwaukee: Cambridge University Press, 2013). 66 Garrard, Artemisia Gentileschi, 34. 67 Bal, 119.

acceptance into the Accademia del Disegno was a major personal and professional accomplishment, both for Gentileschi herself and for female artists in early modern Italy.

The struggles that Gentileschi encountered as a woman in seventeenth-century Italy had a profound impact on her interpretations of biblical stories and on how she chose to portray women in her paintings. Throughout her works there are noticeable themes reflecting her feelings towards the patriarchy and towards violence against women. As a rape victim herself, Gentileschi portrayed Susanna in Susanna and the Elders in a way that differs greatly from the portrayals of other, primarily male, artists who worked with this subject (Figure 2.5). The painting, based on a biblical story, shows Susanna confronted by two elderly men who demand that she submit to them sexually, threatening to accuse her of adultery if she does not comply. Susanna refuses, even though the punishment for adultery is death, and is pictured visibly rejecting their advances.68 Though the painting is dated 1610, prior to Gentileschi’s rape and trial, her depiction of sexual violence and violation is a clear reflection of how she felt about society’s problems with sexual abuse and sexual vulnerability.

Gentileschi’s painting Judith and Her Maidservant demonstrates her portrayal of female protagonists as strong heroines (Figure 2.6). In the painting, Judith conveys an imposing forcefulness with her sword, showing that she is attentive and ready to fight if needed.69 Like Judith and Her Maidservant, Gentileschi’s Jael and Sisera also depicts its heroine in a strong, assertive position (Figure 2.7). Unlike other portrayals of Jael, Gentileschi’s interpretation shows Jael acting in quiet deliberation rather than

68 Bohn chap. 11. 69 Bal, 88.

with passion.70 Her positive portrayal of Jael differs greatly from that of male painters, especially as it removes any sexual appeal and unnecessary evil connotations.71 The abundance of female protagonists in Gentileschi’s paintings is unlike that in the work of any of her contemporaries. Her heroic characterizations of women and their resistance to submissiveness demonstrate how Gentileschi’s experiences in life, including her resistance to dominant gender norms, influenced her career as an artist. The professional challenges and limitations she experienced during her career are representative of the lives of women in early modern Italy. Despite her immense talent and success, Gentileschi was not considered the equal of her male contemporaries. As a woman in the public sphere, Artemisia Gentileschi encountered disadvantages solely because of her gender.

One of the most prevalent biblical figures in paintings of the Baroque period was the Israelite widow Judith. During their careers, both Caravaggio and Gentileschi painted the same biblical story of Judith, yet their interpretations differ greatly, despite the influence of Caravaggio’s version on Gentileschi’s.

Gentileschi, who had grown up in Rome at the height of Caravaggio’s career, was surrounded by his works. Additionally, Gentileschi’s father was a colleague and close follower of Caravaggio, which meant she was taught primarily in Caravaggio’s style of painting.72 By the time Gentileschi began her apprenticeship with her father, Caravaggio’s painting Judith Beheading Holofernes had been well-known in Rome since its creation in 1599 (Figure 2.8).

70 Ibid, 109. 71 Garrard, Artemisia Gentileschi, 340. 72 Ibid, 14.


However, when compared to Gentileschi’s Judith Slaying Holofernes, dated 1614, it is apparent that gender had a significant impact on their interpretations of the story (Figure 2.9). In Caravaggio’s depiction of the beheading of the Assyrian General Holofernes, Judith stands to the side, focused intently on her victim. Her face shows either a mixture of cold-bloodedness and revulsion or an awkward determination, while her servant calmly watches at her side.73 While the lighting in the painting makes Judith and her servant the focus, they are overshadowed by Holofernes, who takes up most of the scene. The women are not given equal space in the painting, a recurring issue in Caravaggio’s work.74 Caravaggio’s entire focus is on the action of the beheading and on death. While Judith is the one beheading Holofernes, she is ultimately portrayed as submissive in this depiction: the distance between her and Holofernes and the lack of emotion on her face make her seem extremely passive.

Gentileschi’s depiction of the biblical story, in her painting Judith Slaying Holofernes, differs greatly from Caravaggio’s. The most obvious and noteworthy difference between the two interpretations is the positioning of Judith and her servant. Rather than being relegated to the side, as in Caravaggio’s painting, Judith and her servant are directly on top of Holofernes, holding him down while beheading him. Even though he struggles, seen in his attempt to push the servant off him, the women successfully overpower him. Gentileschi’s interpretation suggests hostility towards men in power.75 Judith was painted in 1614, shortly after Gentileschi’s rape and the subsequent trial.76 This would account for the

73 Hibbard, 85. 74 Ibid, 85. 75 Pollock, 105. 76 Garrard, Artemisia Gentileschi, 32.

violence of the piece, as if she employed the beheading of Holofernes as a means to convey her anger towards Tassi. Gentileschi emphasizes Judith’s power in her interpretation, focusing on the strength and determination of the two women. The decision to portray Judith in this way is a result of Gentileschi’s personal experiences as a woman in Italy, and specifically as a victim of rape. Her encounter with Tassi, in which she was overpowered and forced to endure sexual assault, influenced the way she chose to depict the relationship between Judith and Holofernes. Unlike Caravaggio’s Judith, who is persistent but reserved, Gentileschi’s strong female protagonist is an explicit refusal to comply with seventeenth-century gender norms.

Throughout their careers, Caravaggio and Gentileschi had complex reputations, shaped by both personal challenges and social expectations of gender. Prejudices against women in the seventeenth century posed considerable obstacles for female artists. Flawed preconceptions about the role of women in society hindered those who tried to achieve some measure of artistic success in early modern Italy. The disadvantages in professional training and opportunity, coupled with expectations about women’s moral and social behaviour, defined the success and reputations of female artists such as Gentileschi. While Caravaggio was described as the Baroque era’s greatest artist despite his violent past, Gentileschi was often characterized by her sexuality.

There are explicit differences between the contemporary reputations of Caravaggio and Gentileschi. Caravaggio’s carefree handling of religion in his paintings shocked his patrons, establishing him as a controversial artist, yet despite critics and rivals across Italy, his career never suffered. Caravaggio received numerous public commissions during his career, and his controversial style and the subsequent rejections

did not have any great impact. His personal struggles, however, overshadowed his professional scandals: Caravaggio’s noted violent behaviour, specifically the murder he committed in 1606, damaged his reputation only briefly after his death.77 Although his reputation often centered on his controversial behaviour and art, Caravaggio never experienced restrictions as a professional artist because of his gender.

Despite being referred to as the greatest female artist of the Baroque age, Gentileschi has never been on equal standing with her male contemporaries. As she became more popular in Italy, she also experienced greater criticism. Some contemporary critics believed it was inappropriate for female artists to work in public spaces, which were seen as belonging to men.78 Gentileschi was aware of her disadvantages as a female artist in a masculine field.79 In a letter from 1649 to her patron Ruffo about not being paid for a commission she had completed, she wrote, “If I were a man, I cannot imagine it would have turned out this way.”80 The professional disadvantages Gentileschi encountered, and the fact that she was regarded as inferior because of her gender, did not go unnoticed by her.

Though her art was never as controversial as Caravaggio’s, the themes depicted in her paintings also emphasized her difference from contemporaries. Gentileschi shifted the focus away from nude models and the sexualisation of women and gave women agency.

However, many of her paintings have been lost over the years. This is largely due to the prejudices of contemporaries and the fact that a significant amount of female artwork has

77 Zuffi, 29. 78 Bohn, chap. 11. 79 Bissell, 116. 80 Artemisia Gentileschi to Don Antonio Ruffo, 1649, in Artemisia Gentileschi: The Image of the Female Hero in Italian Baroque Art (Princeton: Princeton University Press, 1988), 398.

been wrongly credited to male painters.81 This has had a significant impact on the study and commemoration of female artists from the early modern period. Nevertheless, because a considerable portion of her work has been preserved since her death, Gentileschi has become well known and celebrated for resisting male violence and gender norms throughout her career and life in seventeenth-century Italy.

Art is undoubtedly representative of broader gender relations in early modern Italy. The professional restrictions and limitations experienced by female artists in the seventeenth century are illustrative of the overall perception of women in the public sphere. Women’s inability to receive professional artistic training, and the doubts patrons held about the capabilities of female painters, exemplify the dichotomy between the genders in the early modern period. While Gentileschi and Caravaggio came from similar middle-class families, Caravaggio’s path to becoming a professional artist was significantly easier than Gentileschi’s. At a time when there were very few successful female artists in Rome, Gentileschi built a respectable and prosperous career despite the disadvantages facing women in the seventeenth century. Her career marked a critical turning point for women in early modern Italy, illustrating how a talented woman working in an almost exclusively male profession could overcome considerable obstacles to achieve prominence.

81 Bohn, chap. 11.


Appendix 2

Figure 2.1. Michelangelo Merisi da Caravaggio, Madonna dei Palafrenieri, 1605-1606. Oil on Canvas, 115 x 83″. Galleria Borghese, Rome. ©Wikimedia Commons/Web Gallery of Art.

Figure 2.2. Michelangelo Merisi da Caravaggio, The Fortune Teller, 1595. Oil on Canvas, 115 x 150 cm. Musei Capitolini, Italy. ©Flickr/Rodney (CC BY- SA 2.0)


Figure 2.3. Michelangelo Merisi da Caravaggio, The Cardsharps, 1594. Oil on Canvas, 37 x 52″. Kimbell Art Museum, Texas. ©Wikimedia Commons/Google Art Project.

Figure 2.4. Michelangelo Merisi da Caravaggio, Death of the Virgin, c. 1606. Oil on Canvas, 3.69 x 2.45 m. The Louvre, Paris. ©Wikimedia Commons/The Yorck Project/Directmedia


Figure 2.5. Artemisia Gentileschi, Susanna and the Elders, 1610. Oil on Canvas, 1.7 x 1.21 m. Pommersfelden, Germany. ©Wikimedia Commons/Web Gallery of Art.


Figure 2.6. Artemisia Gentileschi, Judith and her Maidservant, 1612-1613. Oil on Canvas, 1.14 m x 94 cm. Palazzo Pitti, Florence. ©Flickr/Jennifer Mei.

Figure 2.7. Artemisia Gentileschi, Jael and Sisera, 1620. Oil on Canvas, 86 x 125 cm. Museum of Fine Arts, Budapest. ©Wikimedia Commons/Web Gallery of Art.


Figure 2.8. Michelangelo Merisi da Caravaggio, Judith Beheading Holofernes, 1599. Oil on Canvas, 145 x 195 cm. Galleria Nazionale d'Arte Antica, Italy. ©Wikimedia Commons/All Art Painting

Figure 2.9. Artemisia Gentileschi, Judith Slaying Holofernes, 1614. Oil on Canvas, 158.8 x 125.5 cm. Museo di Capodimonte, Italy. ©Wikimedia Commons/Web Gallery of Art

Bibliography

Bal, Mieke. The Artemisia Files: Artemisia Gentileschi for Feminists and Other Thinking People. Chicago: University of Chicago Press, 2005.
Bissell, R. Ward. "Artemisia Gentileschi-A New Documented Chronology." The Art Bulletin 50, no. 2 (1968): 153-168.
Bissell, R. Ward. Artemisia Gentileschi and the Authority of Art: Critical Reading and Catalogue Raisonné. University Park, PA: Pennsylvania State University Press, 1999.
Bohn, Babette, and James M. Saslow. “From Oxymoron to Virile Paintbrush: Women Artists in Early Modern Europe.” In A Companion to Renaissance and Baroque Art, 229-249. Chichester, West Sussex: Wiley-Blackwell, 2013.
Burroughs, Charles. "Opacity and Transparence: Networks and Enclaves in the Rome of Sixtus V." RES: Anthropology and Aesthetics, no. 41 (2002): 56-71.
Caffrey, Cait. "Baroque Period." 2016. Salem Press Encyclopedia Research Starters, EBSCOhost.
Caravaggio, Michelangelo Merisi da. Death of the Virgin, c. 1606. Oil on Canvas, 3.69 x 2.45 m. The Louvre, Paris. ©Wikimedia Commons/The Yorck Project/Directmedia.
Caravaggio, Michelangelo Merisi da. Judith Beheading Holofernes, 1599. Oil on Canvas, 145 x 195 cm. Galleria Nazionale d'Arte Antica, Italy. ©Wikimedia Commons/All Art Painting.
Caravaggio, Michelangelo Merisi da. Madonna dei Palafrenieri, 1605-1606. Oil on Canvas, 115 x 83″. Galleria Borghese, Rome. ©Wikimedia Commons/Web Gallery of Art.
Caravaggio, Michelangelo Merisi da. The Cardsharps, 1594. Oil on Canvas, 37 x 52″. Kimbell Art Museum, Texas. ©Wikimedia Commons/Google Art Project.
Caravaggio, Michelangelo Merisi da. The Fortune Teller, 1595. Oil on Canvas, 115 x 150 cm. Musei Capitolini, Italy. ©Flickr/Rodney (CC BY-SA 2.0).
Christiansen, Keith, Judith Walker Mann, Orazio Gentileschi, and Artemisia Gentileschi. Orazio and Artemisia Gentileschi. New York: Metropolitan Museum of Art; New Haven: Yale University Press, 2001.
Cohen, Elizabeth S. "The Trials of Artemisia Gentileschi: A Rape as History." The Sixteenth Century Journal 31, no. 1 (2000): 47-75.
Cohen, Elizabeth S. “To Pray, To Work, To Hear, To Speak: Women in Roman Streets c. 1600.” Journal of Early Modern History 12 (2008): 289-311.
Cropper, Elizabeth. "New Documents for Artemisia Gentileschi's Life in Florence." The Burlington Magazine 135, no. 1088 (1993): 760-761.

Etro, Federico, Silvia Marchesi, and Laura Pagani. “The Labor Market in the Art Sector of Baroque Rome.” Economic Inquiry 53, no. 1 (2015): 365-387. Academic Search Complete, EBSCOhost.
Garrard, Mary D. Artemisia Gentileschi: The Image of the Female Hero in Italian Baroque Art. Princeton: Princeton University Press, 1988.
Garrard, Mary D. “Testimony of the Rape Trial of 1612.” In Artemisia Gentileschi: The Image of the Female Hero in Italian Baroque Art, 403-488. Princeton: Princeton University Press, 1988.
Gentileschi, Artemisia. Jael and Sisera, 1620. Oil on Canvas, 86 x 125 cm. Budapest, Museum of Fine Arts. ©Wikimedia Commons/Web Gallery of Art.
Gentileschi, Artemisia. Judith and her Maidservant, 1613. Oil on Canvas, 93.5 x 114 cm. Italy, Palazzo Pitti. ©Flickr/Jennifer Mei.
Gentileschi, Artemisia. Judith Slaying Holofernes, 1614. Oil on Canvas, 158.8 x 125.5 cm. Italy, Museo di Capodimonte. ©Wikimedia Commons/Web Gallery of Art.
Gentileschi, Artemisia. Susanna and the Elders, 1610. Oil on Canvas, 1.7 x 1.21 m. Pommersfelden, Germany. ©Wikimedia Commons/Web Gallery of Art.
Gentileschi, Artemisia, to Don Antonio Ruffo, 1649. In Artemisia Gentileschi: The Image of the Female Hero in Italian Baroque Art. Princeton: Princeton University Press, 1988.
Gentileschi, Artemisia, to Don Antonio Ruffo, 1649. In Early Modern Europe, 1450–1789, ed. Merry E. Wiesner-Hanks. Milwaukee: Cambridge University Press, 2013.
Hibbard, Howard. Caravaggio. New York: Harper & Row, 1982.
Moir, Alfred. Caravaggio. New York: H.N. Abrams, 1982.
Pollock, Griselda. Differencing the Canon: Feminist Desire and the Writing of Art's Histories. New York: Routledge, 1999.
Zuffi, Stefano. Caravaggio: Masters of Art. Munich, New York: Prestel, 2012.

The Trials and Significance of Nazi War Criminals and Collaborators in France

By: Gabrielle Marshall

Following the trials of Nazi war criminals and collaborators that transpired immediately after World War II, decades passed before the trials of Nazi war criminal Klaus Barbie and French collaborators Rene Bousquet and Maurice Papon. While the reasons for the delayed trials differed, their shared relevance lay in their allowance for the resurrected testimony of survivors of German occupation and the subsequent Holocaust. Barbie, otherwise known as the ‘Butcher of Lyon,’ was convicted of war crimes and sentenced to death in 1952 but, with the help of the American government, was able to evade capture until 1983. While Barbie’s defence lawyer Jacques Verges attempted to deflect his client’s crimes by arguing for the immorality of French military actions in Algeria, it was the testimony of Barbie’s victims and Holocaust survivors that remained the most prominent and significant aspect of the proceedings. Barbie’s trial served as a reminder of the incalculable suffering inflicted by the Nazi regime, even by lower-level officials, and allowed for the humanization of wartime statistics, which took on a deeper significance when compiled with the testimony of its victims.

At the same time, the indictments of French collaborators Rene Bousquet and Maurice Papon served as a reminder to the French population of the role of the Vichy government in the execution of Nazi policy, as it collaborated in the deportation and systematic killing of the Jewish population. In 1949, Bousquet, the General Secretary of the Police, was tried and convicted of national unworthiness and sentenced to five years in prison, but had his sentence commuted because of his last-minute aid to a group of Resistance fighters destined for execution. It was not until 1991, following an illustrious postwar career as a banker and businessman, that Bousquet was indicted for crimes against humanity. However, he was assassinated in 1993 before his trial began. A victim of vigilante justice, Bousquet was never properly tried for his crimes, allowing the testimony of his victims to be lost and the narrative surrounding his name to focus on his own violent death. In contrast, Papon, the Secretary General of the Gironde Prefecture from 1942 to 1944, faced his trial in 1997 for his role in the deportation of 1,560 Jews from Bordeaux to concentration camps. While Papon, like Bousquet, enjoyed decades of success in the postwar era, a 1981 exposé began the chain of indictments that culminated in his 1998 conviction for crimes against humanity. The Papon trial was significant because it garnered widespread attention from the media and brought the extent of French collaboration during World War II to the forefront of the national zeitgeist decades after the events transpired. While the trials of Barbie, Bousquet, and Papon occurred long after the initial wave of postwar convictions, their significance was compounded by the emergence of occupation and Holocaust survivors who created a legal and historical record of the horrors of the Nazi regime and the function of French collaboration in its execution.

Throughout World War II, Klaus Barbie became notorious for the extreme acts of violence he committed against members of the French Resistance and the Jewish residents of Lyon while the city was held under Nazi occupation. After joining the Nazi movement in 1934, Barbie was accepted into the SS, and by 1942 he had become the head of the Gestapo in Lyon, the centre of the Resistance in the south of France. It was reported that under Barbie’s command in the final two years of occupation about 4,000 people were executed in Lyon and another estimated 8,000 were deported to death camps.1 In this time, Barbie earned the title of ‘Butcher of Lyon’ for his routine use of torture to extract information from members of the Resistance, as well as those of the Jewish faith. At the end of the war the French government placed Barbie on its list of war criminals to be tried, but by 1947 Barbie was working with the American Counter Intelligence Corps (CIC), gathering intelligence on communist organizations.2 In spite of his work with the CIC, Barbie was tried in absentia at the Permanent Military Tribunal of Lyon in 1952 for the murder of civilians and torture of military personnel.3 While Barbie was convicted and sentenced to death, he evaded capture with the help of the American government and emigrated to Bolivia, where he lived under an assumed name. Barbie remained in hiding until 1983, when the Bolivian government complied with a French arrest warrant and extradited him back to France to await trial. Due to various legal inquiries the trial did not commence until 1987.4 The delay was largely due to the statute of limitations on war crimes and the inability to retry crimes of which he had already been convicted, which ultimately negated many of the charges laid against him. Barbie was, however, charged with crimes against humanity for certain incidents, including the raid of the Union Generale des Israelites de France (UGIF), the arrest and deportation of 44 children and their caretakers from Izieu, and a deportation of 650 people in one of the last convoys to leave Lyon bound for a concentration camp.5

1 Guyora Binder, “Representing Nazism: Advocacy and Identity at the trial of Klaus Barbie,” Yale Law Journal 98, no. 7 (1989): 1325.
2 Ibid, 1325-1326.
3 Christian Delage, “The Klaus Barbie Trial: Traces and Temporalities,” Comparative Literature Studies 48, no. 3 (2011): 320.
4 Binder, 1326.
5 Delage, 320-321.

Significant to Barbie’s trial proceedings was the defence used by his lawyer Jacques Verges, who had previously defended several known terrorists and enemies of France. In a defence strategy that Verges referred to as ‘rupture,’ he continually attempted to mitigate the crimes of Barbie by bringing up allegations of French torture and murder during the Algerian War.6 In addition to his accusations about French conduct in the Algerian War, Verges detailed the storied record of French collaboration and discussed the hypocrisy of some French Resistance heroes.7 Verges wanted to use the performative nature of the Barbie trial to “put a mirror to France's own raddled face,” to prove that the French government had no moral basis on which to judge Barbie’s actions.8 While effective in making headlines, this strategy faltered as the prosecution quickly highlighted distinctions between the actions of the French in Algeria and the crimes of the Nazi regime. As Guyora Binder notes, the prosecution indicated that it was the ideologically motivated and systematic nature of the crimes committed by the Nazis, Barbie included, that made them distinct as crimes against humanity.9 Thus, while the prosecution had to connect all of Barbie’s crimes to Nazi ideology, this allowed the case to move forward, as the French had defined their own actions in Algeria as war crimes, distinct from Barbie’s crimes against humanity. Verges went on to protest that the trial had become the persecution of an ideology and not the judgement of a single man. While Verges was able to create a media frenzy around French crimes in Algeria, it was the testimony of Barbie’s victims that consistently brought the significance of the trial back into focus. Barbie was convicted of crimes against humanity and, due to France’s recent abolishment of the death penalty, sentenced to life imprisonment, which he served until his death in 1991.

6 “Jacques Verges; Notorious French Lawyer who ‘defended the indefensible’ including the Nazi War Criminal Klaus Barbie,” The Times (17 August 2013).
7 Ibid.
8 Ibid.
9 Binder, 1338.

On 4 July 1987, a month after Barbie’s trial commenced, journalist Michael McCarthy published an article in The Times discussing the significance of putting an old man on trial decades after his crimes were committed.10 McCarthy’s article is significant as he discusses not only the need for justice for those who died at Barbie’s order, but also the necessity of humanizing Holocaust statistics as survivors entered the witness box to painfully recount their stories of torture and death.11 The courtroom presence of survivors and the families of those killed served as a reminder to the public that, decades after the conflict had ended, the wounds of occupation were still present. Additionally, as mentioned in the testimony of Léon Reifman, a survivor of the UGIF raid, the lack of regret expressed by Barbie left many in search of a larger meaning to his conviction. As Reifman noted, his hope was that “not so much out of vengeance, but as a lesson for history … through this one man the whole ideology that dishonours mankind might itself be condemned.”12 The nature of the Barbie trial indicates that a crucial aspect of postwar memory in France was the distinction between the French and the Nazi party and the removal of any implication of collaboration or similarity. This was evident in the deflection of Verges’ allegation of French crimes against humanity in Algeria, as the French court went on to define France’s own crimes as distinct for their lack of Nazi ideology. Thus, Barbie’s trial served as a symbolic education for a new generation about the incalculable suffering inflicted by the Nazi party, in addition to establishing a complete separation of Nazi activity from French nationalism.

10 “Barbie-Now I Know Why They Bothered,” The Times (1987).
11 Ibid.
12 Delage, 329.

Another significant postwar trial in the latter half of the twentieth century was that of Rene Bousquet, an ambitious politician who rose to prominence under the Laval administration as the General Secretary of the Police in 1942. Bousquet argued that collaboration was the only way to preserve some semblance of French autonomy. Bousquet indicated that the Vichy government had to permit French police to carry out German orders to avoid the increased destruction and harm to the French population that might otherwise have come at the hands of German forces.13 In his own perverse attempts to keep the peace, Bousquet went beyond the necessities of collaboration and duty, as he and Prime Minister Laval proposed the inclusion of Jews under the age of 16 in deportations.14 In an effort to reduce public outcry, Bousquet proposed the deportation of foreign Jews from both the occupied and unoccupied zones of France in exchange for leaving French Jews alone, at least for the time being.15 During the Rafle du Vel’ d’Hiv on 16 and 17 July 1942, roughly 13,000 Jews were rounded up in Paris while a reported 47,000 more were deported from the rest of the country and sent to concentration camps before being filtered into death camps.16 Bousquet was tried in 1949 for his wartime activities and sentenced to five years for collaboration, but due to his last-minute protection of Resistance officials in 1943, before he lost his position as chief of police, his sentence was suspended and Bousquet served no jail time. Throughout his trial no questions regarding Bousquet’s role in the mass Jewish deportations were explored. This was most likely due to the efforts of officials to mitigate the role of French collaboration in the war and in the mass murder of the Jewish population. This is evident in the fact that, until 1983, French history textbooks described Jewish deportations as a wholly German effort.17

13 Daniel Singer, “Death of a Collaborator,” Nation 257, no. 3 (1993): 201.
14 Ian Buruma, “The Vichy Syndrome,” Tikkun 10, no. 1 (1995): 44.
15 Ibid.
16 Eric Epstein, “Fit to be Tried: Maurice Papon and the Vichy Syndrome. Defeat and Collaboration,” Journal of Genocide Research 1, no. 1 (1999): 119.

Indicative of the postwar amnesia regarding French collaboration was Bousquet’s ability to resume his life as a notable figure, enjoying success in banking as well as in selling luxury goods before retiring to a posh Paris apartment. By 1953, the French government had passed two amnesty laws that ended legal action against officials involved in the Vichy regime, but significantly these laws still allowed for the prosecution of those who had carried out criminal Nazi policies.18 While such laws signified the end of the initial flow of prosecutions of Vichy officials, they did leave room for the trials of collaborators thought to have carried out the will of Nazi ideology. Thus, in 1991 Bousquet was set to go on trial on charges of crimes against humanity regarding the Rafle du Vel’ d’Hiv as well as his cancellation of regulations that would have protected certain classes of Jewish children from deportation. But Bousquet never made it to trial, as Christian Didier assassinated him on 8 June 1993. Thus, he was never formally tried and brought to justice for his role in the deportation of thousands of Jews.

According to Serge Klarsfeld, the attorney who filed the suit against Bousquet in 1989 for crimes against humanity and who was instrumental in the capture of Klaus Barbie, Bousquet’s trial was to have two major theses. “One thesis is that the Vichy regime was criminal, because of its persecution of the Jews,” he argued. “The other is that it was not, because under the circumstances it could not have acted otherwise.”19 But following the precedent of Barbie’s sentence of life imprisonment, of which he served only four years before his death, many French felt that their courts were too humane and that vigilante justice was the only means of retribution. One such man was Christian Didier, born in 1944 in Nazi-occupied France. In 1987 Didier pretended to be a doctor and snuck into the prison that held Barbie in an attempt to assassinate him for the murder of French Resistance leader Jean Moulin.20 While Didier failed and spent a year in prison, he succeeded in assassinating Bousquet. After disguising himself as a courier delivering official documents for the upcoming trial, Didier gained access to Bousquet’s building and shot him when he opened the door to receive the package.21 While there were undoubtedly many French who did not mourn the death of Bousquet, the loss of his trial was significant, since the trials provided more than mere vengeance against an individual; they aided in resurrecting the past as part of the collective memory of the French people. Many leaders in the French Jewish community, as well as Klarsfeld, who had worked for over a decade to bring Bousquet to trial, had hoped that such a high-profile trial would force French citizens to acknowledge the extent of French collaboration during the war.22 While Bousquet did not go unpunished for his crimes, the extent of French collaboration and the thousands deported were no longer the focus, as his name entered the collective memory as that of a victim of vigilante justice. This misappropriation of facts allowed postwar amnesia about French collaboration to continue and thus further inhibited the ability of French citizens to accept their chequered past.

17 Ibid.
18 Delage, 321.
19 Buruma, 44.

Another significant trial was that of collaborator Maurice Papon, the former Secretary General of the Gironde Prefecture, charged with crimes against humanity for the deportation of 1,560 Jews from Bordeaux, 223 of whom were children. Most of those deported were sent to Auschwitz, where the majority were murdered. In his time as Secretary General, Papon was in charge of traffic, the rationing of gas, and requisitions, as well as Jewish questions, under which he supervised the identification of Jews and the redistribution of their possessions to non-Jews.23 In spite of Papon’s collaboration during the war, his political career did not suffer, and he was able to maintain positions of power in the Fourth and Fifth Republics, serving as Charles de Gaulle’s Police Prefect of Paris from 1958 until 1966 and as budget minister from 1978 until 1981, when damning accusations were brought forth by the families of the Jewish deportees.24 Significant to Papon’s time as Prefect of the Paris Police was his connection to the savage suppression of demonstrators against the Algerian War, in which an estimated 100 people were beaten to death or drowned in the River Seine by the Paris police force.25 While it is unlikely Papon encouraged the violence that occurred, he refused to acknowledge that anything unjust had occurred in the suppression of the Algerian protestors, a fact that was used in his subsequent trial.

In 1983, following a highly publicized 1981 exposé in Le Canard Enchaîné in which he was accused of the deportation of Jews from Bordeaux, Papon was formally indicted for crimes against humanity. Investigations followed and new indictments were issued in 1991, and following the conviction of fellow Vichy collaborator Paul Touvier in 1994 there were further indictments of Papon and an increased call to see him tried.26 A major issue that stalled Papon’s trial was the establishment of his guilt in the deaths of those whom he had deported, as it was widely held that it was not through ideological agreement with the Nazi party or an entrenched anti-Semitism that Papon carried out Nazi policy. Papon’s rationale for collaboration was closer to that of Bousquet, as he argued for the need to participate in the deportation and subjugation of the Jewish population in Bordeaux in order to maintain French autonomy in government and mitigate the presence of Germans in France.27

23 Keith Colquhoun and Ann Wroe, “Maurice Papon,” Economist Book of Obituaries (2008): 286.
24 Nancy Wood, “Memory in Trial in Contemporary France: The Case of Maurice Papon,” History and Memory 11, no. 1 (1999): 50.
25 Richard J. Golsan, "Memory's Bombes à Retardement: Maurice Papon, Crimes Against Humanity, and 17 October 1961," Journal of European Studies 28, no. 1-2 (1998): 153.
26 Wood, 50.

Throughout the investigation into Papon’s crimes between 1995 and 1996, the prosecution worked to establish that, because Papon had taken the position of Secretary General of the Gironde Prefecture with the knowledge that the Germans aimed to deport Jews to the east, he would have been aware of the fate awaiting them; his acceptance of the position subsequently indicated a general approval of Nazi policy.28 During the Nuremberg trials, crimes against humanity were defined as criminal acts against a specific group of people, while the French trials placed culpability in the establishment of ideologically motivated action, as was made evident in the trial of Barbie in the defence of French actions in Algeria. Significant for Papon was a 1997 ruling of the Supreme Court of Appeal, which stated that it was no longer necessary to prove ideological complicity in order to be found guilty of crimes against humanity.29 Papon’s trial began in October of 1997 and went on for six months, the longest in French history. Because the court no longer had to establish ideological adherence, Papon was convicted of crimes against humanity and sentenced to ten years in prison. During his trial, the lawyer for the prosecution, Arno Klarsfeld, son of Serge Klarsfeld, asked Papon, “to voluntarily participate in something one knows to be a crime—doesn’t that amount to being complicit in it, even without having desired the consequences?” to which Papon replied, “in law or in morality?”30 While Papon’s potential knowledge of the Final Solution could be neither legally confirmed nor denied, it was his refusal to disobey orders and remove himself from his political post within the collaborating regime that allowed the jury to find Papon guilty. Papon’s sentencing was significant because, in spite of his simply following orders from his superiors and occupying a relatively low-ranking position in the Vichy government, he was unable to deflect individual blame for his role in the deportation and eventual death of 1,560 Jews.

27 Ibid, 51.
28 Ibid.
29 Ibid, 52.

In 1999, following the rejected appeal of his case, Papon fled to Switzerland, stating he would rather live in exile than face a French jail. Papon was quickly extradited back to France, where he served his jail time until advanced age and illness led to his release only three years into his sentence.31 Significantly, during Papon’s trial the Catholic Church, the office of the Magistrate, and the French police force apologized for their actions under the Vichy regime, indicating a massive shift in public discourse on French collaboration during World War II. Additionally, mass media coverage on television as well as in print, including several articles on Vichy and Papon in the French press throughout the six-month trial, indicated that French society was not ignoring its past.32 As Nancy Wood argues, Papon’s trial offered an opportunity for analysis of the specific function of the Vichy administration in the execution of the Nazis’ Final Solution.33 The trial of Papon, much like that of Barbie and others complicit in the realization of Nazi policy, served not only as a stage for justice but as a means of officially recording the trauma of their victims. Additionally, Papon’s trial allowed for the reinstatement of memories of occupation into the French collective consciousness. It thus educated a new generation and provided an established public record of French collaboration.

31 Craig R. Whitney, “Maurice Papon, Convicted Vichy Official, 96, Dies,” The New York Times (18 February 2007).
32 Richard Vinen, “Papon in Perspective,” History Today 48, no. 7 (1998): 6.
33 Wood, 57.

While many questioned the necessity of indicting and trying old men for crimes they had committed decades prior, it is evident in the cases of Klaus Barbie, Rene Bousquet, and Maurice Papon that the crimes they committed during the German occupation of France left wounds that remained painful and relevant to survivors and the families of those who died. The trial of Barbie allowed for public acknowledgment of the magnitude of human suffering caused by even low-level officials within the Nazi regime, as well as the resurrection of the testimonies of Holocaust survivors into the collective memory of World War II. While Bousquet was assassinated before trial, the indictment of his crimes and the publicity surrounding his death, albeit not entirely focused on his crimes or his victims, brought a rise in public awareness that the deportation of Jews was not an entirely German affair. In addition, the trial and conviction of Papon incited discussion of the role of the French government in the execution of the Final Solution. In spite of the indictments of Barbie, Bousquet, and Papon occurring decades after they committed their respective crimes against humanity, their trials established a legal and historical record of those who died under German occupation and of the role of the French government’s collaboration in the execution of the Final Solution.

Bibliography

“Barbie-Now I Know Why They Bothered.” The Times (4 July 1987).
Binder, Guyora. “Representing Nazism: Advocacy and Identity at the Trial of Klaus Barbie.” Yale Law Journal 98, no. 7 (1989): 1321-1383.
Buruma, Ian. “The Vichy Syndrome.” Tikkun 10, no. 1 (1995): 44-50.
Colquhoun, Keith, and Ann Wroe. “Maurice Papon.” Economist Book of Obituaries (2008): 286-287.
Delage, Christian. “The Klaus Barbie Trial: Traces and Temporalities.” Comparative Literature Studies 48, no. 3 (2011): 320-332.
Epstein, Eric. “Fit to be Tried: Maurice Papon and the Vichy Syndrome. Defeat and Collaboration.” Journal of Genocide Research 1, no. 1 (1999): 115-121.
Golsan, Richard J. "Memory's Bombes à Retardement: Maurice Papon, Crimes Against Humanity, and 17 October 1961." Journal of European Studies 28, no. 1-2 (1998): 153-172.
“I Was Good and He Was Evil.” Newsweek 121, no. 25 (21 June 1993), 44.
“Jacques Verges; Notorious French Lawyer Who ‘Defended the Indefensible’ Including the Nazi War Criminal Klaus Barbie.” The Times (17 August 2013).
Singer, Daniel. “Death of a Collaborator.” Nation 257, no. 3 (1993): 101-103.
Vinen, Richard. “Papon in Perspective.” History Today 48, no. 7 (1998): 6-8.
Whitney, Craig R. “Maurice Papon, Convicted Vichy Official, 96, Dies.” The New York Times (18 February 2007).
Wood, Nancy. “Memory in Trial in Contemporary France: The Case of Maurice Papon.” History and Memory 11, no. 1 (1999): 41-76.

Vietnamese Farmers That Changed the World: the Impact of the Vietnam War on the Cold War

By: Michael Martignago

The Vietnam War was the quintessential Cold War conflict between the United States and the Sino-Soviet supplied, nationalistic North Vietnamese. This war saw the world’s wealthiest and most dominant military force suffer a long, drawn-out defeat to a poverty-stricken society of farmers, armed with nothing but unyielding nationalism and outdated weaponry.

This paper examines the United States’ involvement in Vietnam throughout the Vietnam War and explores the ways in which the conflict affected the Cold War. Beginning with President Harry S. Truman in 1945 and ending with President Gerald Ford in 1975, this paper examines the motivations behind each of the six United States Presidential Administrations during the Vietnam War and gives an in-depth explanation of the crucial decisions made by the United States government over the course of the war. The effect that these foreign policy decisions and directives had on the Cold War atmosphere is also heavily analysed. The faults and failures of the United States that led to its humiliating defeat in Vietnam consequently altered the Cold War atmosphere. In order to fully understand the Cold War, it is necessary to understand the Vietnam War and its impact on United States foreign policy.

The Vietnam War was the longest war in American history, beginning with the United States’ involvement in the French colonial struggle during the early 1950s and ending with the tragic and humiliating fall of Saigon in April 1975. The United States was involved in Vietnam under the leadership of six different Presidential Administrations, and, because of the length of the war, many aspects of the Cold War changed as the conflict in Vietnam progressed and escalated; the war continued, although through significantly different approaches. Beginning with President Harry S. Truman, the United States started supplying the French colonial forces in the southern half of Vietnam with military aid in 1950.1 The war became significantly more meaningful and closer to the hearts of Americans when President Lyndon B. Johnson escalated the conflict by authorizing the deployment of American ground troops in Vietnam in 1965. President Richard Nixon ushered in a new era of the Cold War with détente and attempted to bring diplomacy to the forefront of the Vietnam War; however, he was unsuccessful in putting a concrete end to the war before he was forced to resign due to domestic scandal. Finally, the conflict ended with President Gerald Ford declaring the end of the Vietnam War as the last Americans were evacuated from the rooftop of the American embassy in South Vietnam.2

The Cold War atmosphere that was present when the Vietnam War first began in the early 1950s was vastly different from the atmosphere when the war came to an end in the spring of 1975. The Cold War evolved dynamically over the course of the two-decade-long conflict. Although six different Commanders in Chief led the United States during the Vietnam War, there were key features that all the Presidential Administrations shared, as well as major differences in how each Administration went about fighting the war in Vietnam. This essay aims to answer the question: how was the Cold War impacted by the Vietnam War? The Vietnam War impacted the Cold War by provoking change in United States foreign policy, altering the Cold War atmosphere, and creating dissent across American society and the globe against the United States government’s involvement in Vietnam, consequently initiating the rise of anti-war peace protests and the counterculture of the 1960s and 1970s. United States foreign policy went through tremendous change over the course of the Vietnam War and experienced considerable adjustments in the years following the American defeat in Vietnam. These changes made to United States foreign policy as a result of the Vietnam War were crucial to the remainder of the Cold War. Alongside United States foreign policy, the Vietnam War impacted the Cold War atmosphere globally. The conflict scarred the view the rest of the world had of the United States. America’s hegemony seemed to be slipping away after the Vietnam War, and the Soviets were rapidly approaching parity with the United States with respect to their nuclear arsenals and strength. The Vietnam War effectively demonstrated the strength and resolve of Third World nations and showcased the rise of a third global power in the formerly bilateral Cold War. Finally, as the Vietnam War continued throughout the 1960s, American citizens began to criticize their government over the decisions being made in Vietnam, and dissent spread across the country and globe, igniting the anti-war movement across the Western world. The Vietnam War was a major Cold War conflict, and in order to understand the Cold War in its entirety, the Vietnam War must be analysed and explored with respect to United States foreign policy, the changing Cold War consensus, and the dissent against United States involvement in Vietnam.

1 Mark Atwood Lawrence, The Vietnam War: A Concise International History (New York: Oxford University Press, 2008), 40. 2 Ibid, 167.

For the United States, involvement in Vietnam began in 1950 with America pledging its support to the French colonial forces in their fight against the nationalistic Democratic Republic of Vietnam.3 The French war in Vietnam began shortly after the end of World War II, when Japan surrendered and relinquished its control over Southeast Asia. After Japan left, Vietnam found itself in a power vacuum. The French desired to regain colonial control of Vietnam and re-establish themselves as a world superpower following their devastation and loss of global prestige in World War II.4 The French attempted to transform South Vietnam into an industrialized colony that would help restore economic prosperity to France. However, the gross mistreatment, harsh conditions, and suppression of national self-determination experienced by the Vietnamese at the hands of the French led the Vietnamese people to revolt against their colonizers. Under the leadership of Ho Chi Minh, the people of Vietnam, facilitated by communist doctrine and beliefs, engaged in an almost decade-long conflict, from 1946 to 1954, with the French colonial forces occupying Vietnam.5 A majority of the world criticized France for continuing its war in Vietnam, claiming that France was desperately clinging to an “outdated colonial mentality.”6 However, Britain and the United States backed France in its colonial conquest in Indochina.

Before examining America’s involvement in Vietnam, it is first important to understand what is meant by “the Cold War.” Directly after World War II, the United States and the Soviet Union emerged as the dominant global superpowers, although the United States was vastly superior to the Soviet Union in the years directly following the war. Europe had been devastated by the effects of the war, and its nations were attempting to rebuild and regain control of society, leaving the United States and the Soviet Union as the dominant global forces. The Soviet Union followed communism and its doctrine, whereas the United States subscribed to capitalism. When determining what would become of the post-World War II world, the Soviets and Americans came into constant conflict, with each side desiring the rest of the world to adhere to its respective ideology. Americans had a deep hatred for communism, and Soviets had a deep hatred for capitalism; the two ideologies had fundamental differences and were ultimately incompatible. The term “Cold War” came to be used to describe the bilateral ideological conflict between the communist Soviets and the capitalist Americans; it became a battle of East vs. West. The Americans feared that the spread of communism across the globe would limit their markets and ultimately bring down their capitalistic society. The consistent effort to subvert communism and protect capitalism formed the basis of American foreign policy for the entire Cold War, from 1945 to 1991.

3 Ibid, 40. 4 Ibid, 29. 5 Ibid, 28. 6 Ibid, 35.

The term “Cold War atmosphere” or “Cold War dynamics” refers to the way the war was being fought, not necessarily physically, at a specific period of time. It refers to the events and situations occurring in the conflict between the Soviets and the Americans at a given moment. For instance, in 1950, the Cold War atmosphere can be described by the following: America was fighting the communists in Korea, who were supported by the Soviets and Chinese; the Soviets had recently successfully tested their own atomic bomb; America was in the midst of a Red Scare in the Hollywood film industry; and the United States was economically supporting the French colonial forces and the Bao Dai regime in Vietnam, while the Soviets and Chinese were supporting Ho Chi Minh and the Vietnamese communists. Everything that was occurring for both the United States and the Soviet Union affected the Cold War atmosphere.7 When the Korean War ended, the Cold War atmosphere changed; there was no longer the tension of a physical war between the Soviets and the Americans. The Vietnam War drastically altered the Cold War atmosphere. As the largest, longest, and bloodiest conflict to take place during the Cold War, it forced the Americans, Soviets, and Chinese alike to alter the Cold War atmosphere and, as a result, drastically changed how the Cold War was waged.

The Truman Administration was the first United States presidential administration to become involved in Vietnam, as Truman’s predecessor, President Franklin D. Roosevelt, had been strongly against French involvement in Vietnam and believed that Vietnam should have the right of self-determination.8 The Truman Administration, although itself faced with the war in Korea, pledged its support to the French colonial forces by means of military equipment such as tanks, naval vessels, weapons, and other equipment required to wage war. Even while fighting its own war in Korea, the United States shouldered one third of the cost of the French war in Vietnam.9 Alongside this military investment in the French forces, the United States also economically funded the Bao Dai government in the southern half of Vietnam in an attempt to discourage communist uprising in Vietnam. The United States was not the only global superpower to become invested in the French-Vietnamese conflict. The Soviet Union and newly formed Communist China began to intervene economically, militarily, and politically, the Soviets in 1947 and the Chinese in January 1950.10 By the spring of 1950, the bilateral Cold War had been forced into Vietnam, with the communist Sino-Soviets supporting the Viet Minh and Ho Chi Minh and the Americans backing the French colonial forces and the Bao Dai regime. The United States primarily saw the Vietnam conflict as a Cold War battle against communism, whereas the French saw the conflict as a last hope for regaining and preserving their colonial power. However, on 7 May 1954, the French colonial forces, despite American economic support, were defeated by the Viet Minh in the fifty-five-day Battle of Dien Bien Phu.11 Although the Battle of Dien Bien Phu was not strategically detrimental to the war against the Viet Minh, psychologically the defeat foreshadowed France’s future in Vietnam. The day after the battle ended, a ceasefire was called and international talks regarding Vietnam began at the Geneva Conference.12 Vietnam was divided into the communist North Vietnam and non-communist South Vietnam, with the promise to hold democratic elections that would unify the country. However, the United States did not follow through with this pledge for Vietnamese self-determination, and so began the two-decade-long American conflict in Vietnam.

7 Ibid, 40. 8 Ibid, 29. 9 Ibid, 40. 10 Ibid, 36.

According to Secretary of State Dean Acheson in a ministerial meeting in Paris on 8 May 1950, the United States did not believe that national independence or democratic evolution could possibly exist in a Soviet-dominated area and thus believed it was required to aid the French and the Bao Dai regime in order to combat communism and seek peace and democracy in Vietnam.13 Acheson’s statement confirms the failure of the United States to see the nationalism behind the Viet Minh revolution; by placing the blame on Soviet influence instead, it brought the Cold War to Vietnam. American foreign policy historian and Cold War revisionist Gabriel Kolko supports the belief that the United States, by 1947, “had become wholly convinced that the Soviet Union was in some crucial manner guiding many of the political and social upheavals in the world that were in fact the outcome of poverty, colonialism, and oligarchies.”14 The failure of the United States government to distinguish between nationalism and communism in Vietnam was arguably its greatest downfall in the entire Vietnam War.

11 Ibid, 45. 12 Ibid, 47. 13 Dean Acheson, “Sponsoring French Colonialism (Acheson Statement Excerpt, May 8, 1950),” in Vietnam: History, Documents and Opinions on a Major World Crisis, ed. Marvin E. Gettleman (Greenwich CT: Fawcett Pub, 1965), 89.

In terms of United States foreign policy, it is important to examine what changes were made, but arguably more important to examine what remained constant across the six different Presidential Administrations over the course of the Vietnam War. The American conflict in Vietnam has its origins with President Harry S. Truman, who authorized economic and military aid to be sent to assist the French colonial forces in Vietnam. Truman’s Administration, as shown by Secretary of State Dean Acheson, followed the monolithic view of communism, which caused it to mistake nationalism for Soviet communist influence in Vietnam. This was the main driving factor for the American presence in Vietnam. Although the Soviets and Chinese were supplying the Viet Minh, communism was not the primary motivation for the Vietnamese revolution. Part of the Truman Administration’s blindness to nationalism in Vietnam came from its fears of the Soviets’ successful development of the atomic bomb in 1949, which the United States feared would be used as a diplomatic tool against it for its role in Third World conflicts.15 Another source of fear for the United States during the early 1950s was the Korean War. From 1950 to 1953, the Truman Administration was directly involved militarily in Korea. The threat of the North Korean communist invasion of South Korea had forced the United States to intervene militarily and deploy American ground troops to counter the communist invasion. With the Korean War still raging during the early 1950s, the primary fear for the Truman Administration was that Vietnam would turn into another Korea. The United States government wanted to prevent a communist takeover of Vietnam, and this fear validated NSC-68 and justified the provision of economic and military aid to the French forces fighting against the communist uprising in Vietnam.16 NSC-68 was a document drafted by the United States National Security Council in 1950 that recommended the United States provide military aid to any nation under the threat of communism. The fear of another Korean War did not remain exclusive to the Truman Administration, but continued under his successor. It was not only the fear of the communist uprising in Vietnam that warranted United States economic aid to Vietnam. The Truman Administration believed that Southeast Asia was crucial to Japan regaining economic stability after World War II. According to United States diplomat William Butterworth in a State Department memo from 1947, “I believe a strong case can be made for the fact that economic revival of Japan is dependent upon the economic revival of Asia as a whole and vice-versa.”17 Butterworth believed that a “Far Eastern Marshall Plan” would provide the best means not only of stimulating Japan’s economy, but also of subverting communist uprisings in Asia.18 Although President Truman was only in office until 1953, the changes and precedents his Administration set shaped the United States’ role in Vietnam for the next two decades, dramatically altered the Cold War atmosphere, and set the stage for future Cold War conflict.

14 Gabriel Kolko, Anatomy of a War: Vietnam, the United States, and the Modern Historical Experience (New York: Pantheon Books, 1985), 77. 15 William Appleman Williams et al, eds., America in Vietnam: A Documentary History (Garden City, NY: Anchor Press, 1985), 68. 16 National Security Council Report, “United States Objectives and Programs for National Security (April 14, 1950),” in America in Vietnam: A Documentary History, eds. William Appleman Williams et al (Garden City, NY: Anchor Press, 1985), 71. 17 William Butterworth, “State Department Memo on Far Eastern Marshall Plan,” in America in Vietnam: A Documentary History, eds. William Appleman Williams et al (Garden City, NY: Anchor Press, 1985), 66.

When President Dwight D. Eisenhower took office in 1953, the Korean War was still underway and continued for the next six months. Following the end of the Korean War, however, the attention of the United States turned to Southeast Asia and Vietnam. Like his predecessor, President Eisenhower and his Administration shared the monolithic view of communism and the determination to aggressively combat Sino-Soviet communist subversion across the globe. After the Geneva Accords in 1954 divided Vietnam into halves, the United States broke the agreement that prohibited the external involvement of foreign nations in Vietnamese politics. By breaking this agreement, the United States brought the Cold War directly to Vietnam. The Soviets did not sit back and allow the United States to colonize the Vietnamese people as the French had previously. Vietnam became a Cold War proxy war between the United States and the Soviet Union, but it remained a nationalistic conflict for the Vietnamese people. The reasoning behind intervention in South Vietnam can be attributed to the “Domino Theory.” The Eisenhower Administration coined the term, which means, according to Secretary of State John Foster Dulles, that if communist forces were successful in unifying Vietnam under communism, the rest of Southeast Asia would soon fall to communism as well, like falling dominoes, and the United States would lose its sphere of influence in Southeast Asia entirely.19 The “Domino Theory” is what motivated the United States to pursue more aggressive intervention in Vietnam, and it remained a staple of United States foreign policy for the remainder of the Vietnam War. The underlying fear for the United States was still communism. Understanding the “Domino Theory” makes it possible to understand the United States’ foreign policy decisions, which were made to prevent the “dominoes” from falling.20 The Korean War was very influential in setting United States foreign policy for the early years of the Vietnam War. President Eisenhower, in his “Chance for Peace” speech on 16 April 1953, claimed that the “Korean Armistice would be a fraud if it merely released aggressive armies for attack elsewhere.”21 President Eisenhower was essentially stating that the results and lessons of the Korean War would be disgraced if the United States allowed a similar communist uprising to break out in Vietnam. This also reinforces the fact that the Eisenhower Administration, like the Truman Administration, wholeheartedly subscribed to the monolithic view of communism. They believed that the communists in North Korea followed the same doctrine and beliefs as the communists in North Vietnam. The United States government again completely disregarded the possibility that the revolution in Vietnam was mainly rooted in nationalism.

18 Ibid, 66.

The Eisenhower Administration also initiated a drastic change in United States foreign policy in Vietnam. It aimed to create a robust, anti-communist South Vietnam, and, to accomplish this objective, the Administration created a puppet regime in South Vietnam in 1954 under the leadership of the nationalist and vehement anti-communist Ngo Dinh Diem.22 Diem was a devout Catholic, which immediately placed him in a small minority among the primarily Buddhist Vietnamese people. Placing Ngo Dinh Diem in the leadership role of South Vietnam turned out to be a major strategic failure for the United States. Diem was a corrupt leader, practising nepotism in his regime and further alienating the South Vietnamese peasants. Diem’s regime and policies did more harm than good in the attempt to build a strong anti-communist South Vietnam infused with American ideology. Instead, Diem unintentionally fostered anti-American beliefs among the South Vietnamese lower class, which, in turn, led to an increased hatred of the United States by the Vietnamese people.23 The Eisenhower Administration’s failure to foresee that the Diem regime would prove problematic and contradictory to what the United States had hoped to achieve is one of the major American failures of the Vietnam War. The Administration failed to understand the nature of South Vietnamese society and failed to see the true wants and desires of the Vietnamese people due to its obsession with defeating the supposed communist threat.

19 John Foster Dulles, “Speech to The Overseas Press Club (March 29, 1954),” in Vietnam: History, Documents and Opinions on a Major World Crisis, ed. Marvin E. Gettleman (Greenwich CT: Fawcett Pub, 1965), 90. 20 Dwight D. Eisenhower, “Counting the Dominoes (Eisenhower Speech Excerpt April 7, 1954)” in America in Vietnam: A Documentary History, eds. William Appleman Williams et al (Garden City, NY: Anchor Press, 1985), 156. 21 Dulles, “Speech to The Overseas Press Club (March 29, 1954),” 90.

President Eisenhower also cemented the Vietnam War as solely America’s war when he failed to recruit Britain: in a 1954 letter, he personally asked Winston Churchill to support the United States in its fight for Vietnam, but Churchill declined to pledge Britain’s support.24 Later in the Vietnam War, Britain and other American allies, including Canada, France, and West Germany, all advised the United States to stop the fighting and pull out of Vietnam.25 United States foreign policy under Eisenhower revolved around training and assisting the South Vietnamese so they could fight their own battle against the North Vietnamese. However, according to Vice-President Richard Nixon in a speech on 17 April 1954, “It is hoped the United States will not have to send troops there, but if this government cannot avoid it, the Administration must face up to the situation and dispatch forces.”26 The Cold War atmosphere at this point was beginning to intensify. The Warsaw Pact had been established, and the Soviet sphere of influence in Eastern Europe was growing. The Cuban Revolution, led by Fidel Castro, had placed communists in “America’s Backyard.” Tensions were beginning to rise for the United States with respect to the Cold War. Although Vietnam was experiencing a period of “anguished peace,” by 1959 the conflict had begun again as Ho Chi Minh declared war to unify Vietnam.27 The United States government under President Eisenhower thus set the stage for the Kennedy Administration and America’s further involvement in Vietnam, with the continued monolithic view of communism and the installation of the Diem regime in South Vietnam.

22 Lawrence, 56. 23 Ibid, 61. 24 Department of State, “Foreign Relations of the United States, 1952-1954, XIII, Indochina, Part 2,” in America in Vietnam: A Documentary History, eds. William Appleman Williams et al (Garden City, NY: Anchor Press, 1985), 153.

When President John F. Kennedy took office in January 1961, the situation in Vietnam was a major concern. President Kennedy had been victorious in the 1960 Presidential election mainly due to his campaign promise that he would wage the Cold War far more vigorously than the Eisenhower Administration had.28 By 1961, the Diem regime in Saigon was becoming more destabilized and was facing far greater resistance from the Vietnamese people. President Kennedy and his Administration shared the same monolithic view of communism as both of his predecessors. According to Kennedy’s Secretary of Defense Robert McNamara, who, like most Americans, saw communism as monolithic, “I believed the Soviets and Chinese were cooperating in trying to extend their hegemony. In hindsight, of course, it is clear that they had no unified strategy after the late 1950s.”29 This quote from Secretary McNamara provides incredible insight into the view of the United States government. It was so overcome with the fear and threat of communism that it was completely blind to the idea that communism was not uniform, but rather expressed in a variety of different forms, as seen in the Soviet Union, Red China, Korea, Cuba, and Vietnam.

25 Ibid, 153. 26 Richard Nixon, “New York Times Speech (April 17, 1954),” in Vietnam: History, Documents and Opinions on a Major World Crisis, ed. Marvin E. Gettleman (Greenwich CT: Fawcett Pub, 1965), 91. 27 Lawrence, 47. 28 Ibid, 68.

With its underlying monolithic view of communism, the Kennedy Administration, according to McNamara, followed two primary foreign policy ideas. The first was a derivative of the “Domino Theory”: the Administration believed that if South Vietnam fell to communism, it would pose a grave threat to the United States and the Western nations. Second, the Kennedy Administration believed that only the South Vietnamese were capable of defending their own nation and that the United States could play only an advisory role.30 Although the Kennedy Administration subscribed to the monolithic view of communism, it is interesting that President Kennedy was able to compromise with communists in Laos in 1961 yet was vehemently against compromise with the North Vietnamese communists.31 This hints that more than the communist threat was keeping the United States in the conflict in Vietnam, perhaps because Vietnam was a quintessential Cold War conflict and Laos was not.

29 Robert S. McNamara and Brian VanDeMark, In Retrospect: The Tragedy and Lessons of Vietnam (New York: Vintage Books, 1996), 30. 30 Ibid, 29. 31 Lawrence, 71.

The Cold War atmosphere was about to reach a critical point. By the early 1960s it was completely different from that of the previous decade. The Korean War was in the past. The nuclear arms race and the space race between the Soviets and the Americans were underway. The Cuban Missile Crisis in 1962 marked the closest the Cold War had come to a nuclear war. Both Ngo Dinh Diem and President Kennedy had been assassinated. The early 1960s were arguably the tensest years of the Cold War. The conflict in Vietnam dramatically raised tensions, primarily for the Americans, but also for the Soviets and Chinese. As the Cold War became more tense, the Vietnam War inched closer towards full-scale American intervention.

President Kennedy’s outlook on the Vietnam War is confusing and somewhat contradictory, largely because he was never able to finish his term as President, owing to his assassination in November 1963. In a National Security Action Memorandum of 13 November 1961, President Kennedy approved the recommendation that the Department of Defense stand ready with plans that would authorize the use of American military forces in South Vietnam under one of three conditions: first, to defend the South Vietnamese and boost their morale; second, to assist in the suppression of Vietcong insurgents in South Vietnam; and third, to assist in the event of a communist military intervention in South Vietnam.32 This document provides evidence that President Kennedy and his Administration were planning for the potential deployment of United States ground troops. However, directly countering that claim, Secretary McNamara states that the Kennedy Administration had been actively planning for a “phased withdrawal of United States forces in 1963.”33 McNamara claims that the Department of Defense had plans to withdraw one thousand men from Vietnam by the end of 1965 and that, according to United States foreign policy, these men were in Vietnam strictly as advisors.34 These two claims by the Kennedy Administration contradict each other: one suggests that the Administration was planning for the deployment of American military troops, and the other that it was planning for de-escalation and eventual withdrawal from Vietnam.

32 National Security Action Memorandum, “Kennedy Administration Decisions in Vietnam (November 13, 1961),” in The Cold War: A History in Documents and Eyewitness Accounts, eds. Jussi Hanhimäki and Odd Arne Westad (London: Oxford University Press, 2003), 215.

Although there is no way of definitively proving which course of action President Kennedy ultimately would have taken, in his last statement about Vietnam before his death, when asked “Are we going to give up South Vietnam?” President Kennedy responded, “The most important program, of course, is our national security, but I don’t want the United States to have to put troops there.”35 Secretary McNamara believes that, had President Kennedy lived, he would have pulled the United States out of Vietnam: “He would have concluded that the South Vietnamese were incapable of defending themselves, and that Saigon’s grave political weakness made it unwise to try to offset the limitations of the South Vietnamese forces by sending United States combat troops on a large scale.”36 However, it is important to remember that McNamara is speaking in retrospect and hypothetically. Like the Presidents before him, Kennedy and his Administration failed to see the nationalism behind the Vietnamese revolution and attributed the entire conflict to communism. Examining the facts rather than the hypotheticals: by the end of Kennedy’s presidency, there were over 16,000 military advisors in South Vietnam, the Diem regime had been overthrown in a military coup d’état, and Ngo Dinh Diem had been murdered. By the end of 1963, South Vietnam was rapidly crumbling. Newly inaugurated President Lyndon B. Johnson was forced to take up the mantle and replace President Kennedy while also dealing with a political power vacuum in South Vietnam that left the United States with few options for dealing with the Vietnam War.

33 McNamara, In Retrospect, 29. 34 Ibid, 79. 35 Ibid, 86. 36 Ibid, 96.

With the Johnson Administration, the Vietnam War drastically escalated. President Johnson shared many of the same goals for Vietnam as Kennedy and retained a similar cabinet. He was thrust into a difficult position, having to pick up where Kennedy left off with the crumbling South Vietnamese government. Johnson also had to prepare and campaign for the 1964 Presidential Election if he wanted to remain President. Fearing that any abrupt changes to United States foreign policy in Vietnam might be detrimental to his campaign, Johnson delayed major foreign policy moves until after he had won the election.37 During the election, President Johnson aggressively and publicly denied that he had plans to escalate the war, while privately considering escalation the correct route for United States foreign policy.38 Like his recent predecessors, Johnson followed the monolithic view of communism. The Johnson Administration desired an independent, free, non-communist South Vietnam; it believed that Vietnam must be free, but also able to accept American assistance to maintain security.39 The Johnson Administration believed in the “Domino Theory,” and, according to Secretary McNamara in a memo to President Johnson in 1964, “Unless we can achieve this objective in South Vietnam, almost all of Southeast Asia will probably fall under Communist dominance.”40 The Johnson Administration initiated a major change in United States foreign policy when, on 2 August 1964, the USS Maddox reported that the North Vietnamese had attacked it with torpedoes in the Gulf of Tonkin.41 Following this incident, the United States Congress overwhelmingly passed the Gulf of Tonkin Resolution, which became the legal basis for the United States’ military involvement in the Vietnam War. The resolution enabled President Johnson to circumvent Congress and allowed him to repel any attacks against the United States in South Vietnam by whatever measures were necessary.42 Johnson essentially used the Gulf of Tonkin Resolution to wage an undeclared war in Vietnam.

37 Lawrence, 85. 38 Williams et al, 239.

Under President Johnson, American troops landed in South Vietnam, and they remained there for the next decade. United States foreign policy had rapidly shifted from Americans serving as advisors and military trainers to the South Vietnamese, to Americans fighting the war for the South Vietnamese. Not only were American ground troops in Vietnam, but President Johnson also authorized Operation Rolling Thunder in 1965, a massive bombing campaign that dropped over 600,000 tons of bombs on North Vietnam.43 Secretary McNamara contends that Congress never intended the Gulf of Tonkin Resolution to be used in the manner President Johnson used it. However, by the end of Johnson’s Presidency, America was too far overextended and firmly entrenched in the Vietnam War. The monolithic view of communism had brought Americans into direct combat with the North Vietnamese. The decision to deploy ground troops to Vietnam had a monumental impact on the Cold War atmosphere. The Vietnam War was no longer a proxy war. Americans were directly involved in the combat, and American blood was being spilled by the Sino-Soviet-supplied Vietcong. The Vietnam War became a quintessential, international Cold War conflict. It became a war that the United States was fighting singlehandedly, against the advice of its allies in North America and Europe; a war directly between the United States and the Communist Bloc.

39 Robert S. McNamara to Lyndon B. Johnson, “Memorandum entitled South Vietnam (March 16, 1964)” in America in Vietnam: A Documentary History, eds. William Appleman Williams et al (Garden City, NY: Anchor Press, 1985), 234. 40 Ibid, 234. 41 Lawrence, 86. 42 “The Gulf of Tonkin Resolution, 1964” in The Cold War: A History Through Documents, eds. Edward H. Judge and John W. Langdon (Upper Saddle River, NJ: Prentice Hall, 1999), 135. 43 Lawrence, 89.

The Vietnam War also had the potential to go nuclear. According to Secretary McNamara in an off-the-record interview with the New York Times, he claims that the United States had not completely ruled out nuclear weapons and that the United States government agreed to only use nuclear weapons after completely exhausting their non-nuclear arsenal.44 Under President Johnson, the Vietnam War had become the primary conflict in the Cold War atmosphere, and the world’s eyes were locked on what was happening in

Vietnam.

President Richard Nixon was the first United States President to break with the monolithic view of communism that had haunted the United States during the first three decades of the Cold War. According to President Nixon's National Security Advisor, Henry Kissinger, the United States refused to leave Vietnam in a defeat by communism. However, Kissinger told Soviet Foreign Minister Andrei Gromyko in a meeting in Moscow in May 1972 that "We are prepared to leave so that communist victory is not excluded, though not guaranteed."45 The United States, in other words, still did not prefer communism in South Vietnam, but above all did not want to be seen by the world as "defeated" by communism. The Nixon administration was no longer as vehemently opposed to communism as the previous four presidential administrations had been. Its movement away from the monolithic view of communism came at a time in the Cold War when the Soviet Union and the United States had reached nuclear parity.46 The fear transitioned from communism itself to the Soviet Union.

44 Robert S. McNamara, "Off-the-Record Interview with Secretary of Defense McNamara by the New York Times (April 22, 1965)," in America in Vietnam: A Documentary History, eds. William Appleman Williams et al (Garden City, NY: Anchor Press, 1985), 247.

Nixon won the Presidential Election in 1968 and was inaugurated in January 1969. With the presidency came the daunting task of finding a way to de-escalate and end the Vietnam War, something Nixon had promised during his campaign.47 Once in office, he slowly began to de-escalate the war by lowering the number of American troops in Vietnam.48 Nixon hoped to end the war through a peace deal brokered with either the Soviets or the Chinese. His decision to openly acknowledge the People's Republic of China and open contact with the Chinese for the first time since their revolution in 1949 was motivated by his hope that China could pressure North Vietnam into peace talks.49 Nixon's diplomatic recognition of China drastically changed the Cold War: it was no longer a bilateral conflict, and the

45 Henry Kissinger-Andrei Gromyko, “Record of Conversation, 1972,” in The Cold War: A History in Documents and Eyewitness Accounts, eds. Jussi Hanhimäki, and Odd Arne Westad (London: Oxford University Press, 2003), 230. 46 Jussi Hanhimäki, “Conservative Goals, Revolutionary Outcomes: The Paradox of détente,” Cold War History 8, no. 4 (2008): 506. 47 Lawrence, 137. 48 Richard Nixon, “Address to the Nation (23 January 1973)” in The Cold War: A History Through Documents, eds. Edward H. Judge, and John W. Langdon (Upper Saddle River, NJ: Prentice Hall, 1999), 166. 49 Ibid, 166.


Nixon Administration utilized this connection to China in an attempt to set the Chinese and the Soviets against each other.50 China became a political tool for the United States to use against the Soviet Union.

With regard to the Cold War atmosphere, Vietnam remained a major focal point, but with the introduction of Détente the Cold War experienced a thaw, and the focus shifted away from conflict and towards negotiations and trade between the Soviet Union and the United States. The greatest change to United States foreign policy and the Cold War during the Vietnam War was Détente: the term used for the improving relations and managing of conflict between the Cold War superpowers, and for the increased number of meetings and negotiations that took place between these powers.

The superpowers were primarily focused on the signing of the Strategic Arms Limitation Treaty (SALT I) and the Anti-Ballistic Missile Treaty (ABM), both of which committed the Soviets and Americans to limiting and de-escalating their respective nuclear arsenals, which at the time were growing drastically.51 Although both sides continued to pursue the same objectives as in the earlier years of the Cold War, Détente was a new method employed by both. It arose as a strategy of the Nixon Administration to wage the Cold War through diplomacy while simultaneously seeking an honourable exit from Vietnam. The Nixon Administration dramatically changed foreign policy in regard to Vietnam with the Nixon Doctrine in 1969, in which Nixon proclaimed that the allied nations of the United States were responsible for their own security. However, the United States would

50 Lawrence, 142. 51 Jussi Hanhimäki and Odd Arne Westad, eds. The Cold War: A History in Documents and Eyewitness Accounts (Oxford University Press, 2003), 303.

continue to act as a "nuclear umbrella" when requested.52 The Nixon Doctrine, sometimes referred to as Vietnamization, gave the United States a way out of Vietnam that showed the world the United States had done all it could to help the South Vietnamese: the honourable exit Nixon was searching for. The Nixon Doctrine not only set the United States on track to finally withdraw from Vietnam; it also set a precedent for future involvement in the Third World during the Cold War, under which the United States was no longer required to intervene militarily in future conflicts. The United States nonetheless continued to use its economic and diplomatic power to influence the Third World.

Alongside Détente, during the 1960s and 1970s, peace protests and the anti-war movement began to sweep across American society. Dissent against the United States government reached new heights as American citizens criticized the political and military decisions being made with respect to the Vietnam War. The war shocked the American public, and as it progressed this dissent only grew, especially after the Tet Offensive in 1968, when the North Vietnamese offensive showed the world what the communists in North and South Vietnam were capable of strategically.53 Vietnam War critics, such as Robert F. Kennedy and Senator Eugene McCarthy, gained followings and brought the anti-war cause into American politics, urging politicians to advocate for an end to the Vietnam War.54 Civil rights leaders, such as Malcolm X, even became involved in the anti-war movement, advocating

52 Richard Nixon, “The Nixon Doctrine Speech, 1969” in America in Vietnam: A Documentary History, eds. William Appleman Williams et al (Garden City, NY: Anchor Press, 1985), 282. 53 Earle Wheeler to Lyndon B. Johnson, “Memorandum of February 27, 1968” in America in Vietnam: A Documentary History, eds. William Appleman Williams et al (Garden City, NY: Anchor Press/Doubleday, 1985), 269. 54 Robert F. Kennedy, “RFK Calls Vietnam an Unwinnable War (Feb. 8, 1968),” in Vietnam Documents: American and Vietnamese Views of the War, ed. George N. Katsiaficas (London: M.E. Sharpe, 1992), 90.

the end of the Vietnam War and global peace.55 The pinnacle of the anti-war movement came after the shooting at Kent State University on 4 May 1970, when the National Guard opened fire on a crowd of anti-war student protesters, killing four students and wounding nine others.56 The Kent State protests had occurred in response to the Nixon Administration's authorization of the military invasion of Cambodia on 30 April 1970.57 The Kent State killings sparked outrage across the nation and led to a major increase in anti-war protests across American university and college campuses.58 The protests did not solely involve students; Vietnam War veterans also took to the streets to voice their discontent with the war. The anti-war movement had a tremendous impact on the Cold War atmosphere, affecting nations across the globe, notably West Germany and its Free University.59 The protest against the United States government over its involvement in Vietnam led directly to Détente and the Nixon administration's efforts to finally end the Vietnam War. Nixon needed to respond to the public dissent with direct action if he was to be re-elected in the 1972 Presidential Election. The global anti-war movement directly influenced American foreign policy and, through Détente, the remainder of the Cold War. The anti-war protests of the Vietnam War era set a precedent for public objection and protest by the American public for future generations to follow when the people disagreed with their government's policies.

55 George N. Katsiaficas, Vietnam Documents: American and Vietnamese Views of the War (London: M.E. Sharpe, 1992), 120. 56 Williams et al, 289. 57 Ibid, 283. 58 Ibid, 291. 59 Jeremi Suri, “The Cultural Contradictions of Cold War Education: The Case of West Berlin,” Cold War History 4, no. 3 (2004): 5.


The formal end of the Vietnam War began in 1973 with the Paris Peace Accords, an agreement that laid out the procedure the United States was to follow for its withdrawal from Vietnam and provided for the release of American prisoners of war held in North Vietnam.60 Following President Nixon's resignation over the Watergate Scandal, President Gerald Ford took office and was tasked with removing the remaining American troops from Vietnam. From 1973 to 1975, the Vietnam War continued with American bombings of North Vietnam, but the United States significantly reduced the number of American troops in South Vietnam.61 Vietnamization had been implemented, and the South Vietnamese were left to defend themselves against the attacks of the North Vietnamese, with only a small number of American officials left in Saigon. Finally, on 29 April 1975, the evacuation of Saigon began, and the last Americans in Vietnam were lifted by helicopter from the roof of the American embassy. South Vietnam fell to the North, and the Vietnam War was over.62 Even though the war had come to an end, the United States supported anti-government factions in Cambodia into the 1980s and upheld a strict trade embargo against Vietnam until 1994.63 The United States had been defeated, and nations across the globe witnessed the humiliating American evacuation from Saigon.

Throughout the Vietnam War, the United States government, until President Nixon, subscribed to the monolithic view of communism. This was the underlying cause of its failure in Vietnam. The United States government did not understand the culture of Vietnam or the social, political, and nationalistic desires of the Vietnamese people. They

60 Lawrence, 162. 61 Ibid, 163. 62 Ibid, 167. 63 Craig Lockard, "Meeting Yesterday Head-on: The Vietnam War in Vietnamese, American, and World History," Journal of World History 5, no. 2 (1994): 230.

were completely ignorant of the power that nationalism held.64 The motivation that nationalism gave the Vietnamese people allowed them to fight and die for their country. The United States underestimated the strength and resolve of the Vietnamese. Under-Secretary of State George W. Ball believed that Americans held the Vietnamese enemy in contempt: the Vietnamese, considered to be unsophisticated, poorly equipped peasant farmers, were assumed to stand no chance against the might and power of the United States military, and, Ball claimed, were thought to lack the deep convictions that motivated the Americans.65 It is clear from Ball's account that the United States government vastly underestimated the Vietnamese and, as a result, suffered the same fate the French colonial forces had suffered two decades earlier. An aphorism by George Santayana summarizes the United States' involvement in Vietnam: "Those who refuse to learn from history are doomed to repeat it."66 The lack of synergy and cooperation between the civilian and military leaders of the United States also contributed heavily to the defeat in Vietnam.67 Similarly, the

United States failed to adapt to the guerrilla style of warfare the North Vietnamese employed. The Vietnam War was fought on terrain that rendered traditional styles of warfare useless.68 The Vietnamese had lived on that land for centuries and held an advantage that the United States was never able to overcome. Along with guerrilla warfare came the question asked by Kennedy, "How can we tell if we are

64 McNamara, In Retrospect, 321. 65 Gettleman, 62. 66 Ibid, 63. 67 Robert Buzzanco, Masters of War: Military Dissent and Politics in the Vietnam Era (New York: Cambridge University Press, 1997), 216. 68 George C. Herring, “America and Vietnam: The Unending War,” Foreign Affairs 70, no. 5 (1991): 112.

winning?"69 There was no way for the United States to evaluate its success in a war with no physical front; there were no battle lines in the jungle. The United States military attempted to measure its progress in Vietnam using statistics such as enemies killed, weapons seized, and prisoners taken, but these figures were often erroneous and held no real significance in determining whether the Americans were "winning" or not.70 The view that communism was all-encompassing and homogeneous, the gross ignorance of the Vietnamese people's desire for self-determination, and the inability of the United States military to wage limited, guerrilla warfare were the major reasons why the all-powerful United States could not win a war against Vietnamese farmers.

In conclusion, the Vietnam War can be considered the most influential Cold War conflict. The changes to United States foreign policy throughout the Vietnam War had drastic effects on the Cold War atmosphere. In every aspect the Vietnam War was the quintessential Cold War conflict: it included a proxy war between the United States and the Soviet Union, it had the potential to escalate into nuclear war, it resulted in the diplomatic recognition of the People's Republic of China, it led to arms reduction treaties between the major superpowers, and it brought diplomacy to the forefront of Cold War strategy. The Vietnam War impacted the Cold War by generating change in American foreign policy, consequently altering the Cold War atmosphere, and by creating dissent across the globe against the United States government for its role in the war. Of all the critical events of the Cold War era, the conflict between a hegemonic global superpower and a poverty-stricken society of farmers turned out to have the greatest influence on the Cold War.

69 Ibid, 112. 70 McNamara, In Retrospect, 48.


Bibliography

Buzzanco, Robert. Masters of War: Military Dissent and Politics in the Vietnam Era. Cambridge: Cambridge University Press, 1997.

Ellsberg, Daniel. Secrets: A Memoir of Vietnam and the Pentagon Papers. New York: Viking Press, 2002.

Gettleman, Marvin E. Vietnam: History, Documents and Opinions on a Major World Crisis. Greenwich, CT: Fawcett Pub, 1965.

Hanhimäki, Jussi and Odd Arne Westad, eds. The Cold War: A History in Documents and Eyewitness Accounts. London: Oxford University Press, 2003.

Hanhimäki, Jussi. "Conservative Goals, Revolutionary Outcomes: The Paradox of Détente." Cold War History 8, no. 4 (2008): 503-512.

Herring, George C. "America and Vietnam: The Unending War." Foreign Affairs 70, no. 5 (1991): 104-119.

Judge, Edward H., and John W. Langdon. The Cold War: A History Through Documents. Upper Saddle River, NJ: Prentice Hall, 1999.

Katsiaficas, George N. Vietnam Documents: American and Vietnamese Views of the War. Armonk, NY: M.E. Sharpe, 1992.

Kolko, Gabriel. Anatomy of a War: Vietnam, the United States, and the Modern Historical Experience. New York: Pantheon Books, 1985.

Lawrence, Mark Atwood. The Vietnam War: A Concise International History. New York: Oxford University Press, 2008.

Lockard, Craig. "Meeting Yesterday Head-on: The Vietnam War in Vietnamese, American, and World History." Journal of World History 5, no. 2 (1994): 227-270.

McNamara, Robert S., and Brian VanDeMark. In Retrospect: The Tragedy and Lessons of Vietnam. New York: Vintage Books, 1996.

Suri, Jeremi. "The Cultural Contradictions of Cold War Education: The Case of West Berlin." Cold War History 4, no. 3 (April 2004): 1-20.

Williams, William Appleman et al, eds. America in Vietnam: A Documentary History. Garden City, NY: Anchor Press, 1985.


The Formidable Widow: a Comparison of Representations and Life Accounts of

Widows in Early Seventeenth-Century England

By: Rebecca Nickerson

The portrayal of widows in seventeenth-century English ballads and comedies exaggerated and played with the cultural notions of the day surrounding the social and economic status of widows. A dichotomy existed between the portrayal of widows in popular English arts and the actual lives of many widows. This paper explores popular cultural portrayals of widows in ballads and plays, contrasting them with case studies of actual widows, in order to understand the variety of these women's experiences in seventeenth-century England.

England in the seventeenth century was a nation in which women had few property or individual rights outside of marriage. In the eyes of the law, women were subject to the closest male figure in their lives, and thus legal decisions regarding women were most often made by men.1 One of the only ways a woman could control property, have an income of her own, and make established investments was if she was widowed. A widowed woman was able to exercise many of the same rights as a male citizen if she had the property or monetary means to do so.2 Upon the death of her husband, a widow most often became the executor of her husband's estate and received a dower, which during the seventeenth century legally consisted of one-third of the deceased husband's land or a lifetime provision from the income of his estates.3 A widow who had her own money and possessions at the beginning of the

1 Amy Louise Erickson, “Common Law Versus Common Practice: The Use of Marriage Settlements in Early Modern England,” Economic History Review 43, no. 1 (1990): 24. 2 Ibid, 26. 3 Ibid, 24.

marriage often retained possession of what she owned, as well as bonds to provide for any dependent children of the marriage.4 Many of these provisions for widows were established in marriage settlements, much like modern prenuptial agreements.5 Marriage settlements in the seventeenth century were not reserved for the rich; ordinary people used them, and widows of all classes inherited their husbands' property and belongings. Thus, upon widowhood a woman could find herself in one of the highest positions of personal and legal authority a woman of this period could achieve.6

Once a woman's husband died, she could choose to remain single as the highest authority of her household, if she had the provisions to do so, or she could choose to remarry, which would legally forfeit her status to a new husband. If a widow remarried, the property and wealth in her possession from her previous marriage became the property of her new husband, although legal stipulations could be put in place to prevent a new husband from owning goods reserved as the inheritance of children from the previous marriage.7 Such legal barriers were rarely put in place, owing to the difficulty a woman faced in obtaining legal aid and the social constraints on a woman questioning the rights of her future husband.8 The expectation that widows would forfeit ownership of their deceased husbands' money and property to a new husband upon marriage was the origin of the portrayal of rich widows as naïve and desirable for

4 Ibid, 25. 5 Ibid. 6 Ibid, 24. 7 Ibid, 27. 8 Ibid.

marriage in popular plays and ballads.9 For many young men, marriage to a rich widow brought immediate social value and wealth; there was much to be gained by marrying a widow with substantial financial savings from a previous marriage.

Strike While the Iron is Hot (1625) is a ballad that encouraged young men to make their fortunes by marrying widows quickly. One of the first verses states, "for one many ther's women twenty, this time lasts but for a space, she will be wrought, though it be for nought, but wealth which her first husband got, let younge men poore, make haste therefore, tis good to strike while the iron tis hot."10 This ballad is a prime example of the sentiment present in many seventeenth-century plays and ballads about widows, which were primarily written by men: it encouraged young men to target rich widows in order to gain personal fortunes.

The next passage of Strike While the Iron is Hot states, "that old Widowes love young men, Oh then doe not spare for asking, though she's old, shele toot agen: she scornes to take for Ritches sake. Thy Money she regardeth not, with love her winne, together joyne."11 This passage informs young men that they are desirable to widows and claims that widows will care not about the money young men have but about the love young men can give them. The message conveyed is that widows were naïve, disregarded the potential fortunes they held, and were merely concerned with gaining another man in their lives to love. Strike While the Iron is Hot thus serves as an example of a ballad which created a

9 Christine Peters, "Single Women in Early Modern England: Attitudes and Expectations," Continuity and Change 12, no. 3 (1997): 341. 10 Martin Parker, “Strike While the Iron is Hot,” 1625, UCSB English Broadside Ballad Archive. 11 Ibid.

perception of widows that discounted their individuality and disregarded their ability to exercise any form of agency over their own lives. Contrary to the message of the ballads, women at this time would have had some degree of choice in whom they married and, if financially capable, would likely have refrained from remarriage, as indicated by the declining rate of remarriage in the seventeenth century.12

Strike While the Iron is Hot then goes on to state, "if a poore Young-man be matched with a Widdow stord with gold, And thereby be much inritched, though hes young and she is old, twill be no shame unto his Name."13 This passage reflects contemporary disapproval of a woman being older than her husband, because youth was associated with womanly values and desire.14 The ballad suggested that if young men sacrificed having a young wife and instead married a rich old widow, there would be no shame in it because of the riches they could gain. Strike While the Iron is Hot thus acts as social commentary on seventeenth-century English attitudes, shaming women who were old, and men who married old women, unless the women were rich. The ballad displayed ideas about wealth and age: being young and rich was viewed as the best a person could be, and old and poor the worst, especially for a woman.

Strike While the Iron is Hot ended with a direct address to the listener, calling on the young men of London to marry the supposed abundance of rich widows while the opportunity to gain a fortune through marriage was available: "Young-men all who hear this Ditty, in your memories keep it well, Chiefely you in London City, where so many Widowes dwe[ll] the Season now doth well alow Your practise, therefore loose it not, fall toot apace, while you have space, And strike the Iron while tis hot."15 Widows were the main subject of this ballad, yet they were given no direct role and were described as nothing but rich and old, conveying to the listener that a widow's value lay only in her wealth; that wealth was worth a young man's sacrifice of not having a young wife. This portrayal of widows as naïve, easy to deceive, and desirable only for their potential fortunes was not uncommon, and it exemplifies the social expectation that widows be compliant individuals, assumed ready to give up their fortunes to the next suitor who presented himself.

12 Patricia Crawford and Laura Gowing, eds., Women's Worlds in Seventeenth-Century England (London: Psychology Press, 2000), 169. 13 Parker, “Strike While the Iron is Hot.” 14 Crawford and Gowing, 170.

Plays were another popular form of media in the seventeenth century. Plays circulated ideas about culture, politics, custom, and desire, exaggerated and placed before the eyes of the urban public.16 Plays offer a unique historical insight into contemporaries' ideas about the composition of their society and the different phenomena that occurred in the lives of people at that time. In the comedy The Widow's Tears (1606), the widow Eudora gained a fortune after the death of her husband and was deceived and wooed by the young, poor Tharsalio, who sought her riches. Tharsalio described her first in the following passage: "know you (as who knows not) the exquisite lady of the palace, the late governor's admired widow, the rich and haunty countess Eudora? Were not she a jewel worth the wearing, if a man knew how to win her?"17 This description of Eudora set the tone for how she was portrayed in the rest of the play: as a desirable object of courtship owing to her wealth and social status as a widow.

15 Parker, “Strike While the Iron is Hot.” 16 Jennifer Panek, Widows and Suitors in Early Modern English Comedy (New York: Cambridge University Press, 2004), 54. 17 Ibid, 58.


Throughout the play, the character Eudora lacked personality and detail beyond her wealth and stature. Indeed, her role appeared to be only that of an object of desire and gain for the leading character Tharsalio, and her role ended once Tharsalio obtained her as a bride. Throughout the play Eudora was portrayed as naïve and easy to entice into marriage. Her character appeared not to care about the fortune she was left or her ability to manage it, but rather sought the affection of a young man and the contentment of marriage, happy to give away her wealth. In one scene, she calls out after her husband: "'Husband,' my countess cried, 'take more, more yet.'"18 The use of Eudora's character in this play, as a widow complicit in a young man's desire for her riches in exchange for "marital pleasure," is representative of other seventeenth-century popular media that displayed accepted notions of widows as easy targets for money-hungry young men.

An examination of seventeenth-century ballads and plays about widows reveals themes of male anxiety over widows possessing some degree of power over their potential mates. Male anxieties over powerful widows were a key theme in the ballad A Batchelers Resolution (1629).19 In this ballad, the character of the bachelor contemplated the best choice of woman to be his wife, categorizing the women by their age and previous marital status, i.e. maid or widow. When contemplating a marriage with a widow, the bachelor stated that "Widows will not be controlled," which displayed male anxieties about widows holding power over, and expressing independence from, their male counterparts.20 A Batchelers Resolution additionally highlighted some of the distinct categorizations of widows at the time, specifically based on their age. The

18 Ibid, 58. 19 “A Batchelers Resolution,” 1629? UCSB English Broadside Ballad Archive. 20 Ibid.

bachelor states, "If I should wed a widow old, I had better take a younger."21 In context, the bachelor was discussing the probability of a widow exercising independence and power over him as a partner, concluding that if he were to marry a widow, she would need to be young, because a younger widow would be more submissive.22 This ballad served as an example of the male anxieties surrounding the courtship of widows and of the social categorizations and values placed on widows according to their age: a young, rich widow was thought to be a naïve target for an advantageous marriage, while an old widow was thought too independent to control and thus undesirable for marriage.

Complementary to the analysis of A Batchelers Resolution, Jennifer Panek's Widows and Suitors in Early Modern English Comedy examines the different ways widows were perceived in media throughout early modern England. By investigating the roles widows played in comedies, Panek uncovers cultural ideas and anxieties about the social mobility of widows through marriage, economics, and sex.23 Panek argues that seventeenth-century comedies portrayed male anxieties about a widow exercising independent authority separate from any male partner or authority figure.24 She states that "Age and wealth along with sexual, legal and economic experience all factored into men's anxious fantasies about the remarried widow as wife, making her simultaneously desirable and deeply threatening, a figure who, as I will argue, had the potential to both establish her husband's manhood and undermine it."25 Elizabeth Hanson makes a similar argument in her analysis of seventeenth-century English dramas. In her article, "There's

21 Ibid. 22 Ibid. 23 Panek, 54. 24 Ibid, 47. 25 Ibid, 48.


Meat and Money Too: Rich Widows and Allegories of Wealth in Jacobean City Comedy," Hanson argues that "an anxiety that emerges in marriage manuals and other descriptions of married life present[s] marriage to a widow as demanding an especially forceful assertion of masculine authority."26 The presence of these anxieties among men in the artistic culture of seventeenth-century English life brought to light the idea of widows resisting the categorical representations and laws surrounding their existence.

Widows did not always fit neatly into specific representations based on wealth and age; they were individuals who learned to navigate their new status as widows according to their personality, age, wealth, and social status. In an effort to bring attention to widows whose lives pushed back against contemporary depictions of widows, two case studies stand out: the life of Katharine Austen and the life of Bess Hardwick.

Barbara J. Todd examines the life of seventeenth-century widow Katharine Austen in her article "Property and a Woman's Place in Restoration London."27 Austen's life serves as a case study contrasting how widows in the seventeenth century actually lived with their portrayal in ballads and plays. She rejected the character of the desirable, naïve, rich widow without a care for her money and with only the desire for another marriage. Upon becoming a widow, Austen rejected the popular cultural categorizations of rich widows seen in ballads and plays; instead, she took advantage of her newfound legal agency, through which she thrived and eventually pushed beyond the limits set before her.

26 Elizabeth Hanson, “’There's Meat and Money Too’: Rich Widows and Allegories of Wealth in Jacobean City Comedy,” ELH 72, no. 1 (2005). 27 Barbara J. Todd, “Property and a Woman's Place in Restoration London,” Women's History Review 19, no. 2 (2010): 183.

Austen never married again after her husband's death. Instead, she took on the role of head of her family and began to engage heavily in economic sectors in order to provide for her children.28 She initially took over the family properties, but soon became more aggressively involved in property and economic investment.29 It was in this period that stocks were growing in popularity, and Austen became a stockholder in large companies including the East India Company, which amassed her and her children a large amount of wealth while she was still relatively young.30 Austen went on to engage in highly controversial matters concerning land ownership and investment, including one instance in which she defended a bill involving her son's claim to land in the House of Commons in Parliament, an act drastically outside the role of a woman at this time.31

Todd noted that, “When Austen began buying and developing London real estate, she was doing something distinctly uncommon.”32 Though Austen was not the norm at the time, her life serves as a counterexample to the mediated portrayals of widows of her financial status and age. Austen displayed immense economic competence and independence upon taking on the role of widow.33 She worked within the legal rights she had been given and set out to be the head of her household as its sole economic provider and mother.34

28 Ibid, 185. 29 Ibid. 30 Ibid. 31 Ibid, 195. 32 Ibid, 185. 33 Ibid, 192. 34 Ibid, 198.


Austen stood out as a highly successful woman of this era. The success of her economic involvement could indicate that, although it was not the norm at the time, other women were able to engage in economic sectors after widowhood and likely did so.

Though little historical evidence from this period survives from women themselves, and financial records especially are scarce, widows were indeed capable of exercising their legal rights. Thus, some widows’ lives may have stood in contrast to the image of the wealthy, naïve widow, showing instead that these were formidable women who gave substance to the male anxieties about female economic independence examined in the ballads and plays circulating in popular media at the time.

In Allison Levy’s book, Widowhood and Visual Culture in Early Modern Europe, there is a case study on the life of Bess Hardwick, who was widowed four times. Over the course of her life, Hardwick experienced and adapted to widowhood very differently each time.35 Hardwick was first married at the age of fourteen and left a widow by the age of sixteen, when her husband died very young at the age of fifteen.36 Hardwick married again three years later, to William Cavendish, who was much older and of a higher social rank than her, as he had formerly been on the royal Privy Council.37 Hardwick’s first remarriage served to largely enhance her social and financial standing and, as Sarah French notes, appears to have been a loving relationship.38 It was through the marriage to Cavendish that Hardwick had her six children.39 French comments in her study that it is during her marriage to Cavendish that Hardwick likely began to be involved with owning and managing lands.40 After Cavendish's death, Hardwick was left with a large debt.41 Even though her social standing had risen, she was left with six children to raise and no money to raise them with.42 After her second widowhood, she remarried, this time to a rich man, Sir William St Loe, the Grand Butler to Elizabeth I, in order to increase her wealth and pay off her debts.43 It was in the course of this marriage that Hardwick began to build and manage her own estates, separate from her husband's, though he financed them.44 Six years after their marriage, Sir William died and Hardwick remarried yet again, though this time not to increase her monetary value, as she then held several independent resources that could provide for her and her children. French notes that in her final marriage Hardwick likely married in order to increase her and her children's social standing.45 Hardwick’s final marriage was to George Talbot, who held earldoms in Britain and the custody of Mary, Queen of Scots.46 By entering into this marriage, Hardwick secured politically advantageous marriages for several of her children, eventually placing her granddaughter Arbella Stuart within the legal line of succession to the English and Scottish thrones.47

35 Sarah French, “A Widow Building in Elizabethan England: Bess of Hardwick at Hardwick Hall,” Widowhood and Visual Culture in Early Modern Europe, ed. Allison M. Levy (Burlington: Ashgate, 2003), 164. 36 Ibid. 37 Ibid. 38 Ibid.

39 Ibid, 165. 40 Ibid, 164. 41 Ibid. 42 Ibid. 43 Ibid, 165. 44 Ibid, 164. 45 Ibid, 165. 46 Ibid. 47 Ibid, 166.

Hardwick was a widow who stepped out of the stereotyped role portrayed in ballads and plays; though she was rich and did remarry, she seemingly did so on her own terms and through decisive and independent choices. Each of her marriages proved advantageous for her, raising her from relatively poor beginnings to the household of royalty by the end of her life. Over the course of Hardwick’s life, she took advantage of the economic capabilities that came with being a widow. Hardwick began to purchase, manage, and build properties during her second marriage, then went on to invest and make advantageous arrangements for herself and her kin following her last marriage.48

French’s study of Hardwick’s life reveals the great amount of detail and individual attention that Hardwick put into the houses she built and managed. Hardwick herself took responsibility for these projects, attending to even the most minute details and carefully selecting the interior designs in order to reflect, as French argues, moralistic and intellectual values.49

Though she began poor, Hardwick became the ideal of the rich widow portrayed in early modern England’s plays and ballads; but a further study of Hardwick’s life reveals that she was not as she would have been portrayed in a play. Hardwick took control of her own decisions and worked within the confines of her social standing in order to meet the needs of herself and her children as well as increase the standing of her family.

48 Ibid, 168. 49 Ibid.

Though Hardwick would not have been an ordinary woman during this period, her life serves as an example of a woman who worked within the confining legal system of early modern English society and stepped out of the expectations governing her. She was an anomaly, but her life demonstrates that popular media portrayed widows untruthfully, through a biased lens clouded by male anxieties and the need to exert authority over women who held the potential for power within the difficult legal system and society they inhabited. England at this time was not in a gender war of women against men but, as it stood, there was a large divide between the economic abilities of men and those of women. Through the tragedy of death, some women were able to break out of their expected roles and surprise the world around them with their success.

The portrayal of widows in seventeenth-century popular culture invoked ideas that undermined widows’ abilities and were not representative of their real situations. The gap between widows and their representations in the media portrayed them as naïve, easy targets for their money, discounting their individual experiences. A narrative of male anxiety emerged in plays about widows, in which men feared that widows, in their newfound legal status, would step out of their traditional roles as submissive women. Some of these anxieties proved true, as is seen in the case studies of Hardwick and Austen. Some widows in seventeenth-century England maneuvered through their legal situations, remarrying or not, in order to make conscious decisions for themselves and their families, not always being subject to the men in their lives.

When it comes to widows in seventeenth-century England, there is still much to be studied and discovered about their lives and experiences. This essay reviewed a wide variety of literature concerning widows, but there was a lack of information available on topics other than sexuality, children, and land rights. Questions such as how widows perceived themselves could be further studied. As well, examinations of the lives of widows of lower classes would add valuable information to the academic community.

Resources such as diaries and personal letters should be reviewed in order to bring a voice to widows from the seventeenth century, rather than having to rely on sources that others wrote about them. A greater understanding of the ways in which seventeenth-century English life was experienced could be achieved by examining much of the yet to be discovered information about the personal experiences of many of these women from all areas of life such as culture, class, personal identity, occupation, and interests.


Bibliography

“A Batchelers Resolution.” UCSB English Broadside Ballad Archive. 1629, accessed 18 December 2016. https://ebba.english.ucsb.edu/ballad/20105/xml.

Chapman, George and Akihiro Yamada. The Widow's Tears (1634). London: Methuen, 1975.

Crawford, Patricia, and Laura Gowing, eds. Women's Worlds in Seventeenth-Century England. London: Psychology Press, 2000.

Erickson, Amy Louise. "Common Law Versus Common Practice: The Use of Marriage Settlements in Early Modern England." Economic History Review 43, no. 1 (1990): 21-39.

French, Sarah. “A Widow Building in Elizabethan England: Bess of Hardwick at Hardwick Hall.” Widowhood and Visual Culture in Early Modern Europe, ed. Allison M. Levy. Burlington: Ashgate, 2003. 161-176.

Hanson, Elizabeth. “’There's Meat and Money Too’: Rich Widows and Allegories of Wealth in Jacobean City Comedy.” ELH 72, no. 1 (2005): 209-238.

Panek, Jennifer. Widows and Suitors in Early Modern English Comedy. New York: Cambridge University Press, 2004.

Parker, Martin. “Strike While the Iron is Hot.” UCSB English Broadside Ballad Archive. 1625, accessed 18 December 2016. https://ebba.english.ucsb.edu/ballad/20179/xml.

Peters, Christine. "Single Women in Early Modern England: Attitudes and Expectations." Continuity and Change 12, no. 3 (1997): 325-345.

Todd, Barbara J. “Property and a Woman's Place in Restoration London.” Women's History Review 19, no. 2 (2010): 181-200.


Enemies Within or Without Enemies: “Enemy Aliens” and Internment in Canada during the Second World War

By: Tricia Nowicki

Tricia Nowicki (1985 – 2017) took two classes with me, but she taught me more than I taught her. Being in a mechanized wheelchair, she was a notable presence in the classroom. What really made Tricia stand out, though, was her powerfully articulate presence in seminar. For any who imagined a small woman in a mechanized wheelchair to need polite support and gentle handling, Tricia quickly put such notions to rest. She was fearless and suffered no fools. She spoke directly to issues, highlighting debates, pulling at evidence, ably comparing perspectives, and tackling debate with gusto. Her strength and intelligence stood out far more than any wheelchair.

The following essay was from her second-year course with Professor Maureen Lux. It's not a research essay, but rather an exercise in historiography - that is, the history of what is written about a particular subject. One of the great challenges history students face is making sense of complex debates in history, not simply what happened but different interpretations of what happened and what meaning we take from it. This essay looks at four different historians all addressing the same story: the internment of "enemy aliens" - people born in or ethnically identified with the enemy countries of Germany, Italy, or Japan during the Second World War. While the basics of the story are agreed upon, each historian has a different assessment of why and how the Canadian government detained these people, whether and how it was justified, and what meaning we should take from this important moment in our country's history. Tricia's essay takes us through each article, examining its evidence and assessing not only what is said, but also situating the political and ethical dimensions of the story. She understood the disciplinary issues of being a historian; she understood the ethical importance of this story and the debate around it. She was a fine historian.

- Dr. Daniel Samson

During the Second World War (1939-1945), the nations involved in hostilities were split between the Allied and Axis powers. As part of the Allies and a country built on immigration, Canada suddenly found itself having to manage hundreds of thousands of residents whose nationalities were of Axis states, including but not limited to men and women of German, Italian, and Japanese descent. As a result, the Canadian government implemented the Defence of Canada Regulations (DOCR) to restrict distribution of dissident information and monitor or ban the activities of people and groups deemed a threat to the safety of the country, such as fascists, pro-Nazis, communists, or conscientious objectors.1 The DOCR forced persons of enemy origin, whether immigrants, naturalized citizens, or even Canadian-born nationals, to register as “enemy aliens” with the state and authorized government security forces to arrest and detain anyone suspected of subversion or aiding the enemy without a trial.2 Moreover, many “enemy aliens,” particularly Japanese families, were subjected to property confiscation, home evacuation, relocation to labour or internment camps, or even deportation or denaturalization.3 Historians have treated the Canadian government’s internment of persons of enemy origin during the Second World War differently, since some hold a more traditionalist view, arguing “enemy aliens” were prejudicially interned based on ethnicity, while others apply a revisionist perspective, claiming belief in seditious and radical politics or ideologies led to the justifiable internment of “enemy aliens.” An examination of several articles about “enemy aliens” in Canada during the Second World War shows that these differences are not only reflected in the backgrounds of the authors, their choices of subject, and their dates of publication, but are also revealed in the analysis and interpretation of primary and secondary sources used to draw conclusions on why and how the government interned persons of enemy origin identified as subversive or threats to the country.

1 Marcel Martel, “World War II and the Internment of Enemy Aliens: Circumscribing Personal Freedoms,” in Visions: The Canadian History Modules Project, Post-Confederation, eds. P.E. Bryden et al (Toronto: Nelson, 2017), 237-238. 2 Ibid, 237. 3 Ibid, 238. See also Bill Waiser, s.v. “Wartime Internment,” The Oxford Companion to Canadian History (New York: Oxford University Press, 2004), accessed 23 March 2017.

Since the topic is the focus of many books and articles, the following analysed articles present only a small sample of the varying arguments and perspectives published on the Second World War internment of “enemy aliens” in Canada. In “’Agents Within the Gates’: the Search for Nazi Subversives in Canada during World War II,” historian Robert H. Keyserlingk argues the threat of persons of enemy origin “turning against the adopted country” was a political myth, and the Canadian government arbitrarily interned peoples from the German community to show control and assuage public fear of a Nazi insurgency.4 As it was not only published amidst campaigns lobbying the Canadian government for Japanese Canadian redress but also included as part of On Guard for Thee: War, Ethnicity, and the Canadian State, 1939-1945, a collection of papers presented at a Queen’s University symposium with a traditionalist view on how the government dealt with ethnic minorities during the Second World War,5 “’Agents Within the Gates’” was an opportunity to call attention to the discrimination German Canadians experienced during the war. Furthermore, as a German-born Canadian himself, there is no doubt Keyserlingk’s familial background inspired him to focus on “enemy aliens” of German descent, but his knowledge and seasoned past, including service in the Royal Canadian Navy and careers as a Canadian foreign service officer in Germany and history professor at the University of Ottawa,6 add to the scholarship of his article.

4 Robert H. Keyserlingk, “’Agents Within the Gates’: the Search for Nazi Subversives in Canada during World War II,” Canadian Historical Review 66, no. 2 (June 1985): 211-212, 237-238. 5 Norman Hillmer, Bohdan Kordan, and Lubomyr Luciuk, eds., On Guard for Thee: War, Ethnicity, and the Canadian State, 1939-1945 (Ottawa: Canadian Committee for the History of the Second World War, 1988). 6 “Keyserlingk, Robert H.,” Encyclopedia.com, accessed 23 March 2017.

Similarly, Pamela Sugiman, a Canadian woman of Japanese descent, identifies with the subjects of her article, “Passing Time, Moving Memories: Interpreting Wartime Narratives of Japanese Canadian Women,” but she acknowledges the influential impact her own memories and emotions had on her research.7 Nonetheless, as a historian and sociology professor at Ryerson University, her expertise in oral history, memory, racism and racialization, and women’s history in Canada comes across in “Passing Time, Moving Memories” and definitely strengthens her argument, both from a traditionalist and revisionist perspective.8 Like the German Canadian subjects in “’Agents Within the Gates,’” Sugiman claims the Canadian government violated Japanese Canadians due to their ethnicity and interned them without evidence of subversion during the Second World War, “[resulting] in the destruction of a community and trauma to the individuals within it.”9 In addition, “Passing Time, Moving Memories,” originally presented at a conference for the Centre for Feminist Research, is a revisionist work to a certain extent, since it focuses on the experiences of female Japanese Canadians during the Second World War, reflecting the growing recognition of feminist and women’s history when it was published, and argues against scholars’ previous depiction of Japanese Canadian women as voiceless, powerless, and without agency.10

7 Pamela Sugiman, “Passing Time, Moving Memories: Interpreting Wartime Narratives of Japanese Canadian Women,” in Visions: the Canadian History Modules Project, Post-Confederation, eds. P.E. Bryden et al (Toronto: Nelson, 2017), 252-253. 8 “Pamela Sugiman,” Research & Innovation Spotlight, Ryerson University: Faculty of Arts, accessed 23 March 2017. 9 Sugiman, 250-251. 10 Ibid, 250, 264.

In contrast, the articles by historians Michelle McBride, “The Curious Case of Female Internees,” and Reg Whitaker and Gregory S. Kealey, “A War on Ethnicity? The RCMP and Internment,” are principally revisionist, seeing as they were both featured in Enemies Within: Italian and Other Internees in Canada and Abroad, a book compiled after a 1995 conference debating the fascist associations of Italian Canadians interned during the Second World War and arguing “enemy aliens” with political beliefs diverging from those of the Canadian government posed a genuine threat to national security.11 In “Curious Case of Female Internees,” McBride claims literature on female internees is virtually non-existent, even though women were a significant presence in Canadian fascist and pro-Nazi organizations before and during the war, and her intention is to gender the history of internment by writing them into it.12 Moreover, her interest in the Royal Canadian Mounted Police (RCMP), having written her History Master’s thesis on how they dealt with Nazism and fascism in Canada from 1934 to 1941,13 is evident, as she suggests the arbitrary confinement of women was due to the irresponsible Inter-Departmental Committee on Internment (IDC) rather than the RCMP, who were vital in investigating and tracking subversive activities.14 Likewise, Whitaker and Kealey, history professors at York University and the University of New Brunswick, respectively, and specialists on Canadian security and intelligence history,15 show a considerable amount of affinity for the RCMP in their article, “War on Ethnicity?” Having formerly collaborated on RCMP Security Bulletins, a multi-volume work of assembled and edited security dispatches from before and during the Second World War,16 it is not surprising that their article argues the RCMP proficiently separated politics or ideologies from ethnic backgrounds, stating they “were more remarkable for their relative selectivity than for putting ‘ethnicity on trial,’ striking not at the ethnic communities in general but at the ideologically suspect minority.”17 Furthermore, Whitaker and Kealey suggest the RCMP’s surveillance intelligence prevented any subversive plots in Canada by quickly identifying and detaining “enemy aliens” whose activities in fascist or pro-Nazi organizations posed a threat to both the nation and war effort.18 Although, in 1990, Prime Minister Brian Mulroney publicly apologized on behalf of the Canadian government for the wrongful treatment and internment of Italian Canadians during the Second World War,19 this sentiment is clearly not shared by McBride, Whitaker, and Kealey in their articles, published after his statement, which view “enemy aliens” of Italian (or German) descent not as victims of their ethnic origin, like the Japanese Canadians, but as people with questionable political loyalties.

11 Franca Iacovetta, Roberto Perin, and Angelo Principe, eds., Enemies Within: Italian and Other Internees in Canada and Abroad (Toronto: University of Toronto Press, 2000), vii-viii. 12 Michelle McBride, “The Curious Case of Female Internees,” in Enemies Within: Italian and Other Internees in Canada and Abroad, eds. Franca Iacovetta, Roberto Perin, and Angelo Principe (Toronto: University of Toronto Press, 2000), 148-152. 13 Iacovetta, Perin, and Principe, eds., Enemies Within, 414. 14 McBride, 148-152. 15 “Inventory of Reg Whitaker Fonds,” Fonds 0225, York University Archives and Special Collections, Toronto ON, last modified 20 November 2003; “Greg Kealey,” History University of New Brunswick: Faculty of Arts, accessed 23 March 2017. 16 Ibid; Iacovetta, Perin, and Principe, eds., 414 and 416.

17 Reg Whitaker and Gregory S. Kealey, “A War on Ethnicity? The RCMP and Internment,” in Visions: the Canadian History Modules Project, Post-Confederation, eds. P.E. Bryden et al (Toronto: Nelson, 2017), 270, 273, 274, 277. 18 Ibid, 274. 19 Iacovetta, Perin, and Principe, eds., vii. 20 Keyserlingk, 214-215.

Each of the articles analysed in this paper combines primary and secondary sources in different ways to show how the Canadian government and its security forces classified and interned supposedly subversive persons of enemy origin during the Second World War. In “’Agents Within the Gates,’” Keyserlingk uses immigration statistics and records from the Department of External Affairs to de-mythologize the idea that many German immigrants had strong ties to Nazi Germany, showing most German Canadians were either refugees from Nazism or not exposed to the Hitler regime (1933-1945) because they came to Canada before the Depression closed immigration in 1931.20 While his article is well-researched, Keyserlingk relies heavily on documents from the William Lyon Mackenzie King Diary and Public Archives of Canada,21 only examining the situation through the eyes of public officials and not the German Canadian internees, and he is relatively non-comparative by not relating his findings to those of other historians, either Canadian or from abroad, possibly since his study was one of the first of its kind.

Although Whitaker and Kealey similarly view “enemy alien” internment through a bureaucratic or RCMP lens, using letters and reports from the National Archives or Canadian Security Intelligence Services (CSIS), they also juxtapose their conclusions with those of several other historians, including Keyserlingk and McBride.22 While Keyserlingk cites newspaper articles, specifically from the Hamilton Spectator and Globe and Mail, claiming no German threat existed as the RCMP found little or no evidence of Nazi activity,23 Whitaker and Kealey disagree, using biographies and autobiographies written by security service officials to argue that, at the beginning of the war, the RCMP “broke the back of potential Nazi [subversion or sabotage] against the war effort.”24 Furthermore, Keyserlingk indicates that even if a Nazi threat had existed, the RCMP had neither the personnel capacity nor the intelligence capabilities to effectively consider the problem, as illustrated in official memorandums and reports, a history of the RCMP, and an officer’s autobiography.25 However, Whitaker and Kealey use sources unavailable when “’Agents Within the Gates’” was published, including their own work, RCMP Bulletins, compiling formerly restricted RCMP documents, and McBride’s Master’s thesis about the RCMP, to claim the force had a network of undercover agents and informants within the Italian and German communities long before the war and obtained evidence suitable for internment through surveillance and confiscation of property.26

21 Ibid, 211. 22 See Whitaker and Kealey, 272, 274-275. 23 Keyserlingk, 237; Whitaker and Kealey, 274. 24 Whitaker and Kealey, 274. 25 Keyserlingk, 216-220.

Likewise, in “Curious Case of Female Internees,” McBride takes advantage of Whitaker and Kealey’s RCMP Bulletins and her own Master’s thesis, as well as newly released documents in the National Archives of Canada, including records from the RCMP, Norman A. Robertson, an advisor to Prime Minister Mackenzie King, and the Department of the Solicitor General.27 Unfortunately, she had to interpret fragments of these redacted documents to obtain any details on female internees, a similar dilemma faced by Sugiman, as Japanese-Canadian correspondence discovered in the National Archives had been “intercepted and censored by government officials during [the Second World War].”28 In the same way as Keyserlingk and Whitaker and Kealey, McBride uses sources that view only the circumstances of the arrest and internment of “enemy aliens,” relying on communications between the Department of National Defence, the RCMP, and the prison warden of the Kingston Penitentiary. However, through the IDC personal history files and RCMP reports, she provides interesting personal details about the lives of specific female internees and uncovers their associations with fascist or pro-Nazi organizations.29 Nonetheless, statistics from a list of internees on the RCMP Central Registry Classification Sheet illustrate that females constituted a very small percentage of interned “enemy aliens,” not only making the cases seem exceptional rather than the norm, but also leading McBride to conclude that internment policies set by the IDC were “haphazard” and that more women with connections to threatening political organizations should have been interned.30 Although McBride claims her work is an attempt to gender the history of internment, her failure to use any written or oral testimony from female internees weakens this goal. Sugiman, on the other hand, as if answering McBride’s call, makes very effective use of letters to and from interned Japanese Canadian women, while also adding oral accounts she herself developed. Thus, Sugiman is the only author to use the perspective of the Japanese internees themselves. Most of the others guide readers to Ken Adachi’s The Enemy that Never Was: a History of Japanese Canadians.

26 McBride, 148; Whitaker and Kealey, 272, 274, 278. 27 McBride, 148-149. 28 Sugiman, 253. 29 McBride, 149-165.

The letters and oral testimonies used by Sugiman verify that government policies subjected Japanese Canadians to ethnic discrimination and suggest there was no evidence of any threat that might justify internment. While offering a strong contrast to McBride and Whitaker and Kealey, Sugiman acknowledges the problems with relying on written accounts that were both censored and translated from Japanese. Oral testimony also has limits as memories change over time, sometimes people alter how they recall traumatic events, and sometimes responses are influenced by how the question is asked.31 Every historian uses various primary sources, including letters, official documents, newspapers, or oral history, but Sugiman offers a richer account of female “enemy aliens.”

30 Ibid, 154-166. 31 Sugiman, 253-267 and 267.

In general, each article is strengthened by using a range of primary and secondary sources to support its thesis, but each author’s narrow focus sometimes weakens the argument. Keyserlingk, McBride, and Whitaker and Kealey each rely on one major source and often reject or ignore other findings to the contrary. In its 29 pages, for example, Keyserlingk’s “’Agents within the Gates’” dedicates two short sentences to the capture of three German spies on Canadian soil, but this evidence is quickly dismissed since they were not from Canada. If he is arguing there were no Nazi collaborators in Canada, then he should have paid more attention to this case. Did they, for example, have connections to people of German descent in Canada? Sugiman, similarly, would benefit from oral testimony from people outside the Japanese-Canadian community, particularly security service officers. Would it not be of interest to know what they thought of the internment and specifically of women’s internment? Lastly, McBride and Whitaker and Kealey gloss over the role of the RCMP in the relocation, confiscation of property, and restrictions on the freedom of Japanese Canadians. They emphasize the role of the DOCR, arguing that other security agencies were more responsible and that the “actual conduct and conditions of internment were not an RCMP responsibility.”32 Yet the articles do not contain any primary sources by persons of enemy origin or in unofficial positions as evidence either confirming or denying the RCMP’s involvement. How can one really know what the RCMP’s involvement was when the evidence base is so narrow?

Undoubtedly these authors know the government and the RCMP acted in a discriminatory manner, and no amount of interpretation or evidence can change that fact.

While these considerations may have helped form more impartial and informed arguments in these articles, these suggestions and questions could also act as a starting point for further historical research on “enemy” internment during the Second World War.

32 Whitaker and Kealey, 271 and 279.

In conclusion, historians differ in their treatment of “enemy alien” internment during the Second World War. All agree that the Canadian government singled out and interned persons of specific ethnic backgrounds, despite no evidence that they posed any legitimate danger to the country. These people offered no real threat to society; they espoused no extreme or subversive ideologies. In all the articles analysed, the personal or academic background of the authors, their subjective points of view, and their own historical contexts influenced how they interpreted this story. The conclusions of Keyserlingk, McBride, and Whitaker and Kealey demonstrate how historians can analyse and interpret primary sources about one event in several, often opposing, ways.

Ultimately, while Keyserlingk provides adequate evidence of government prejudice towards German Canadians during the Second World War, his assertions of injustice are somewhat exaggerated when compared with the experiences of those of

Japanese descent, as Sugiman illustrates. Nevertheless, both authors at least make clear that the Canadian government later attempted to remedy its mistakes: admitting wrongdoing, negotiating redress settlements, declining to reinstate the DOCR, and acknowledging that internment was not representative of Canadian laws and ideals. McBride, and Whitaker and Kealey, on the other hand, seem to misinterpret the situation altogether by suggesting that the detainment was duly justified. In effect, these authors allow that, in times of war, even a democratic government has the right to strip minorities or nonconformist citizens of their civil liberties. They forget that Canadian soldiers were fighting precisely to protect those rights and freedoms.

Therefore, it is important for historians to convey how the Canadian government and its security services subjected persons of enemy origin to discriminatory treatment under the extreme circumstances of the Second World War. But it is equally important for them to show how Canada differed from its enemies by admitting its faults, making reparations, and ensuring the future protection of the democratic rights and principles of its citizens, regardless of their backgrounds or beliefs.


Bibliography

“Greg Kealey.” Department of History, University of New Brunswick. Accessed 23 March 2017. http://www.unb.ca/fredericton/arts/departments/history/people/gkealey.html

Hillmer, Norman, Bohdan Kordan, and Lubomyr Luciuk, eds. On Guard for Thee: War, Ethnicity, and the Canadian State, 1939-1945. Ottawa: Canadian Committee for the History of the Second World War, 1988.

Keyserlingk, Robert H. “‘Agents Within the Gates’: The Search for Nazi Subversion in Canada During World War II.” Canadian Historical Review 66, no. 2 (June 1985): 211-239.

“Keyserlingk, Robert H.” Encyclopedia.com. Accessed 23 March 2017. http://www.encyclopedia.com/arts/culture-magazine/keyserlingk-robert-h.html

Martel, Marcel. “Introduction: World War II and the Internment of Enemy Aliens: Circumscribing Personal Freedoms.” In Visions: The Canadian History Modules Project, Post-Confederation, edited by P.E. Bryden, Colin M. Coates, Maureen Lux, Lynne Marks, Marcel Martel, and Daniel Samson, 237-238. Toronto: Nelson, 2017.

McBride, Michelle. “The Curious Case of Female Internees.” In Enemies Within: Italian and Other Internees in Canada and Abroad, edited by Franca Iacovetta, Roberto Perin, and Angelo Principe, 148-170. Toronto: University of Toronto Press, 2000.

“Pamela Sugiman.” Research & Innovation Spotlight. Ryerson University, Faculty of Arts. Accessed 23 March 2017. http://www.ryerson.ca/arts/research-innovation/research-and-innovation-spotlight/sugiman-pamela/

Sugiman, Pamela. “Passing Time, Moving Memories: Interpreting Wartime Narratives of Japanese Canadian Women.” In Visions: The Canadian History Modules Project, Post-Confederation, edited by P.E. Bryden, Colin M. Coates, Maureen Lux, Lynne Marks, Marcel Martel, and Daniel Samson, 250-270. Toronto: Nelson, 2017.

Waiser, Bill. “Wartime Internment.” In The Oxford Companion to Canadian History. Oxford University Press, 2004. Accessed 22 March 2017. http://www.oxfordreference.com/view/10.1093/acref/9780195415599.001.0001/acref-9780195415599-e-1636

Whitaker, Reg, and Gregory S. Kealey. “A War on Ethnicity? The RCMP and Internment.” In Visions: The Canadian History Modules Project, Post-Confederation, edited by P.E. Bryden, Colin M. Coates, Maureen Lux, Lynne Marks, Marcel Martel, and Daniel Samson, 237-238. Toronto: Nelson, 2017.
