
THE AMERICAN WAY OF BOMBING

CHANGING ETHICAL AND LEGAL NORMS, FROM FLYING FORTRESSES TO DRONES

EDITED BY MATTHEW EVANGELISTA AND HENRY SHUE

CORNELL UNIVERSITY PRESS, Ithaca and London


CHAPTER 12

Banning Autonomous Killing

The Legal and Ethical Requirement That Humans Make Near-Time Lethal Decisions

MARY ELLEN O'CONNELL

Long before the computerization of weapons technology, humanity debated the normative acceptability of new weapons.1 The invention of the long bow, gunpowder, airplanes, weapons of mass destruction, and so on have all raised moral and legal concerns.2 Unmanned aerial combat vehicles, or drones,3 became the focus of debate when the United States used a drone to launch a missile attack that killed several people in November 2001 in Afghanistan.4 It was the first known use of a drone, operated from a great distance, to kill. As the debate over drones grew, another debate, on the legality and morality of autonomous weapons, intensified.5 In certain respects, autonomous weapons are as old as any weapon if they are defined as weapons that may be triggered by a target rather than by the user of the weapon. A camouflaged pit, a spring gun, a land or maritime mine, or an improvised explosive device can be triggered by the target without the user being in the vicinity. Such weapons have long been the focus of philosophers and legal scholars. The advent of robots with computer programs that can learn has renewed the debate and deepened the concerns. Advances in artificial intelligence mean that once a robot is constructed and programmed, it will be able to make the decision to attack without additional human intervention.6 Such an attack could occur at great distance from the time and place of the robot's origin.

In response to these developments, discussion is building toward a norm against the use of fully autonomous robotic decisions to deploy lethal force. This developing norm is reflected in documents of such diverse origin as the United States Department of Defense7 and Human Rights Watch.8 While consensus is building toward the norm, consensus does not yet exist about how to move beyond establishing the norm to winning global acceptance of it. While it is possible to find support in existing law for such a norm, the legal case is based on inference and analogy. Such support may be sufficient to gain wide acceptance, but there is really no denying that the norm would be strengthened by preliminary discussion and eventual negotiation of an affirmative treaty ban on fully autonomous killing. The discussion beginning in the context of a new Protocol to the Convention on Certain Conventional Weapons is a promising start.9 In April 2013, United Nations Special Rapporteur Christof Heyns called for a moratorium on moving beyond the design stage in the development of fully autonomous weapons pending the formation of a panel of experts to "articulate a policy for the international community on the issue."10 Wendell Wallach, chair of Yale's technology and ethics study group, also called for a moratorium in 2013 and for the U.S. president to "sign an executive order declaring that a deliberate attack with lethal and nonlethal force by fully autonomous weaponry violates the Law of War."11 This chapter proposes a treaty that will say much the same as Wallach's executive order. The negotiation of such a treaty will require the expert input and global debate needed to develop a strong express rule requiring that human beings make any decision for the near-time application of offensive lethal force to other human beings and property. Even without the successful conclusion of a treaty, the negotiation alone could go a long way toward creating a principle of customary international law that bans autonomous killing.

The research for this chapter began in 2010. It is likely that as soon as the chapter is published some of the information contained in it will be inaccurate, given the pace of technological development. Nevertheless, the best time to consider the law and morality applicable to new technology is, arguably, before it is fully operational. Once technology is in use, restrictions tend to be more difficult to obtain.12 On the other hand, regulating technology still under development poses obvious problems. Every attempt will be made here to focus on the most likely future scenarios. The proposal made in the final section, titled "Norm Building," is designed to apply to certain existing weapons systems, as well as future ones. Thus, the proposal should be relevant regardless of what scientists invent next.

The remainder of this chapter is divided into three parts: It will begin with a brief overview of what we know at the time of writing about autonomous weapons. The discussion will move on to the law governing killing, including the international law regulating lethal weapons. The final part will introduce a proposal for a treaty ban on removing humans too far from any decision to kill. The precise details of such a ban will require international negotiation; this chapter will focus on why such negotiation is imperative.

Autonomous Weapons Technology

In 2010, the scientific community did not yet have a consensus definition of what constitutes a fully autonomous weapons system.13 By mid-2013, a common definition had emerged. In November 2012 the United States Department of Defense (DoD) issued a directive titled "Autonomy in Weapons Systems." The purpose of the directive is to establish "DoD policy and assigns [sic] responsibilities for the development and use of autonomous and semi-autonomous functions in weapon systems." The directive defines an "autonomous weapons system" as a weapons system that "once activated, can select and engage targets without further interventions by a human operator. This includes human-supervised autonomous weapon systems that are designed to allow human operators to override operation of the weapon system, but can select and engage targets without further human input after activation."14

UN Special Rapporteur Christof Heyns refers to "lethal autonomous robotics" as weapons systems "that, once activated, can select and engage targets without further human intervention."15 Military ethicist Deanna Neal uses a similar definition of a fully autonomous weapon. She calls it a robot "which uses some form of artificial intelligence to guide its decision-making and that is capable of target discrimination and regulating its use of force independently of human 'eyes on target' verification or authorization before it kills someone."16 The Human Rights Watch definition is also similar: "If a weapon were fully autonomous, it would 'identify targets and ... trigger itself.'"17

These definitions exclude many autonomous weapon systems by specifying that the weapon have the ability to "select and engage targets." The general view is that states do not yet possess fully autonomous weapons systems but are definitely seeking them. Although a few commentators continue to raise doubts about whether scientists can or will develop fully autonomous systems, the weight of opinion indicates it is only a matter of time. How much time is disputed. Current estimates predict that fully autonomous weapons will emerge as early as 2015 or as late as 2050.18 At a meeting of the International Society of Military Ethics in January 2011, Neal cautioned that "there is disagreement among the engineering community as to whether an autonomous robot ... can actually be created."19 Yet a few weeks later at another meeting, a former U.S. Air Force research scientist expressed his confidence that there is "nothing holding us back. The technology is a 'slam-dunk.'"20 He argued that the United States is not yet fully automating drones because of "cultural resistance," not because of technological hurdles.21

Despite any "cultural resistance," there is no doubt that scientists are hard at work on the relevant technology for fully autonomous systems.22 The 2012 DoD directive shows the level of U.S. involvement in planning for autonomous weapons systems. In the United States, the Committee on Autonomous Vehicles in Support of Naval Operations wrote in 2005 that "the Navy and Marine Corps should aggressively exploit the considerable warfighting benefits offered by autonomous vehicles (AVs) by acquiring operational experience with current systems and using lessons learned from that experience to develop future AV technologies, operational requirements, and systems concepts."23

At a press briefing in 2007, a spokesperson for the U.S. Department of Defense's Unmanned Aerial Systems Task Force, Dyke Weatherington, spoke of the need to go beyond the remotely controlled technology in existence today that requires human intervention. For example, in "air-to-air combat--there's really no way that a system that's remotely controlled can effectively operate in an offensive or defensive air combat environment. That [requires] ... a fully autonomous system."24 Finally, a Human Rights Watch report released just a few days before the DoD directive was released in November 2012 states, "Some military and robotics experts have predicted that 'killer robots'--fully autonomous weapons that could select and engage targets without human intervention--could be developed within 20 to 30 years."25

The breakthrough to fully autonomous weapons has either already happened or will occur in the foreseeable future. As will be argued in the section on norm building, the time to clarify the applicable legal and moral principles is now.26 We already have a variety of semi-autonomous and passive autonomous weapons such as land mines:

    Indeed, several military robotic-automation systems already operate at the level where the human is still in charge and responsible for the deployment of lethal force, but not in a directly supervisory manner. Examples include: (i) the Phalanx system for Aegis-class cruisers in the Navy "capable of autonomously performing its own search, detect, evaluation, track, engage and kill assessment functions" ...; (ii) the MK-60 encapsulated torpedo (CAPTOR) sea mine system--one of the Navy's primary antisubmarine weapons capable of autonomously firing a torpedo and cruise missiles ...; (iii) the Patriot anti-aircraft missile batteries; (iv) "fire and forget" missile systems generally; and (v) anti-personnel mines or alternatively other more discriminating classes of mines (e.g. anti-tank). These devices can each be considered to be robotic by some definitions, as they all are capable of sensing their environment and actuating, in these cases through the application of lethal force.27

Efforts to build norms against fully automated killing should begin before the technology is in wide use. A recent UK Ministry of Defence report stated of unmanned aerial vehicles: "Most of the legal issues surrounding the use of existing and planned systems are well understood and are simply a variation of those associated with manned systems."28 The same could be said of fully automated systems. Yet it is the thesis of this chapter that humanity has not yet taken into account the impact of increasing physical and temporal distance on our legal and moral principles on killing. Development of the capacity to deploy robotic target selection should be accompanied by both an audit of applicable legal and ethical norms and affirmative action toward a central, treaty-based principle to restrict the removal of human beings from the offensive near-time kill decision.

Several arguments already exist against drafting a new rule. One posits that computers will be able to make target selections better than human beings.29 Another is offered by Schmitt and Thurnher, who argue that by definition human beings will not be taken out of actual kill decisions because human beings will build and program robotic weapon systems. Thus, no new rule is needed to ensure that humans are in the loop. They admit that human input could occur long before a robot resorts to lethal force, but then they fail to grapple with the implications of this fact.30 Temporal distance from the kill decision is the critical issue. Most would agree that a computer programmed to kill months or years before an actual operation no longer has meaningful human involvement in the deployment of lethal force. The current law and system of accountability for targeting decisions is built around human involvement, as Schmitt and Thurnher acknowledge.31 Yet, with no additional evidence, they assert that the system will continue to work adequately even when humans are far removed from the kill decision. It would be irresponsible at best to base legal and moral standards for killing on such an unproven assertion that defies common sense.

Jakob Kellenberger, former president of the International Committee of the Red Cross, has said: "The deployment of such [fully autonomous] systems would reflect a paradigm shift and a major qualitative change in the conduct of hostilities .... It would also raise a range of fundamental legal, ethical and societal issues, which need to be considered before such systems are developed or deployed."32 Kellenberger may or may not be correct that the deployment of fully autonomous systems will be a paradigm shift over semi-autonomous ones. That issue is not as important, however, as his second point about the fundamental legal and ethical principles we have established over centuries. These principles are premised on a closer association between the decision to use a weapon and the death or destruction resulting from that use than is often the case with today's weapon systems. Without legal intervention, scientists may continue to develop robots that take the kill decision ever farther from the human beings who should bear responsibility for making it.

Lawful and Ethical Killing

International law prohibits the resort to lethal force except in limited circumstances.33 In peacetime, international law permits police forces and other government authorities acting under police rules to use lethal force to save lives immediately. No innocent bystanders may be killed in such operations. Governments may resort to military force when challenged by organized armed insurgents on their territory or when attacked with significant force from abroad. The United Nations Security Council may also authorize the use of force by states to restore "international peace and security."

A state's response to a significant attack from beyond its territory is regulated under the international law of self-defense. The law of self-defense is comprised of Article 51 of the United Nations Charter and additional rules found in the law of state responsibility and the general principles of law.34 The International Court of Justice has identified these additional rules in a number of important decisions, starting with the 1949 Corfu Channel case between the United Kingdom and Albania.35 Military force may be exercised by a state that is the victim of a significant armed attack on the territory of a state responsible for the attack. The exercise of such force must also conform to the general principles of necessity and proportionality.36 States may, of course, take defensive action in other circumstances, but such action may not include major military force on another state's territory. States may also resort to force when challenged by an organized armed group within the state that is attempting to overthrow the government or to secede. States appear to tolerate outside intervention by states assisting a government in ending insurgent or secessionist military challenges.

Fighting, whether for internal control, in self-defense, or as authorized by the Security Council, amounts to armed conflict if organized armed groups fight each other with a certain amount of intensity. Missiles and bombs, regardless of how they are deployed, are lawful for use only within the actual fighting of an armed conflict. In other words, missiles and bombs are permissible for use in armed-conflict hostilities only. If a police force were to use bombs or missiles, it would generally be resorting to excessive force. Beyond armed-conflict hostilities, authorities may use only that force necessary to save a human life immediately. Bombs and missiles risk killing bystanders, which means that the current generation of drones deploys too much firepower for lawful use outside of actual armed conflict or in response to a significant armed attack by a state that triggers the right of self-defense under Article 51 of the UN Charter.

This law governing resort to military force is a subfield of international law still referred to as the jus ad bellum. The law was reconfirmed by a consensus of all UN members at the 2005 World Summit in New York.37 The rules on how military force may be used during an armed conflict are found in the Hague Conventions, the Geneva Conventions of 1949 and their Additional Protocols of 1977, customary international law, and, again, general principles (collectively the jus in bello). The rules on conduct of force are the subject of regular review and comment by the International Committee of the Red Cross (ICRC). Also in 2005, the ICRC published a comprehensive review of customary international humanitarian law (IHL) for the two types of armed conflict for which there are well-developed sets of rules: international armed conflict and non-international armed conflict.38 It is important to also emphasize that certain human rights principles apply even during an armed conflict.39 The European Court of Human Rights, the Inter-American Court of Human Rights, and the Inter-American Commission have investigated whether governments have used excessive force and have thus violated the right to life.40

These contemporary rules on the resort to and conduct of lethal force, whether in peace or armed conflict, developed most directly from the Just War Theory of Augustine and Aquinas. Augustine drew on Aristotle and Cicero for the concept that peace is the normal state and that violence is justified only to restore peace. Moral philosophers continue to teach that the taking of human life may be justified to protect human life.41 In other words, the exceptional right to resort to lethal force rests squarely on a justification of necessity. Current law reflects the understanding of what necessity permits as a moral and ethical matter.

Within the in bello context, it is becoming increasingly apparent that necessity is the most important guide to regulating the use of force, given the rise of irregular forces. Necessity determines what level of force may be used: military force where enemy combatants may be killed without warning, or police-level force necessary to save a human life. Thus, other battlefield targeting rules (proportionality, distinction, precaution, and humanity) are relevant after a necessity decision is made. In the mid-2000s, the ICRC sought to broaden the category of persons subject to intentional targeting because of their status as persons in a "continuous combat function." The ICRC insists, however, that such persons may be targeted only when it is necessary to do so. The standard of necessity depends on the circumstances, whether the situation constitutes armed conflict or not. In the words of the ICRC's Interpretative Guidance on the Notion of Direct Participation in Hostilities under International Humanitarian Law:

    In classic large-scale confrontations between well-equipped and organized armed forces or groups, the principles of military necessity and of humanity are unlikely to restrict the use of force against legitimate military targets beyond what is already required by specific provisions of IHL. The practical importance of their restraining function will increase with the ability of a party to the conflict to control the circumstances and area in which its military operations are conducted, and may become decisive where armed forces operate against selected individuals in situations comparable to peacetime policing. In practice, such considerations are likely to become particularly relevant where a party to the conflict exercises effective territorial control, most notably in occupied territories and non-international armed conflicts.42

This greater emphasis on necessity is, in effect, a new restriction on the use of lethal force under the in bello rules. It is consistent with the overriding obligation to respect human rights, which compels that any close case be decided in favor of peacetime standards for the resort to lethal force.

The law in mid-2013 reflects an ever-greater restriction on resort to lethal force, whether by state against state in self-defense or during an armed conflict by combatants against each other. Still, the lawful use of lethal force requires an exercise of conscience. Even where a president or soldier has the legal right to kill, the decision to do so will ultimately be an act of moral judgment. Moreover, legal scholars know that in rare circumstances, individual conscience may compel action in defiance of law. This chapter argues that the ultimate decision to kill must be made, therefore, by a human being at or very near the time of the lethal impact. Even if scientists develop a computer that can replicate the human conscience, the decision must not be given up to a machine that cannot be held accountable.

However, John Arquilla, executive director of the Information Operations Center at the Naval Postgraduate School, has said, "I will stand my artificial intelligence against your human any day of the week and tell you that my A.I. will pay more attention to the rules of engagement and create fewer ethical lapses than a human force."43 Similarly, Ronald Arkin, author of "Governing Lethal Behavior in Autonomous Robots," a study funded by the Army Research Office, believes computer software can be developed to incorporate proportionality, recognition of surrender, uncertainty, and other concepts fundamental to lawful conduct on the battlefield.44 Anderson and Waxman have also argued that robots will make better, more accurate decisions in life-and-death matters, whether the context is health care or war-fighting.45

Others insist that scientists will not be capable of designing a computer sophisticated enough to make lethal force judgments reflecting the principle of necessity. They argue that these judgments will always be subjective decisions that only a human being, not a computer, can make.46 A similar argument was made when computer engineers first began to predict that a computer could defeat a human being at chess. The counterargument was once that a computer would not be able to make the subjective decisions necessary to outwit a human adversary.47 Today it appears well within the realm of the possible that computers will be programmed to be capable of doing what experienced battlefield lawyers currently do.

What seems unprogrammable is conscience, common sense, intuition, and other essential human qualities. Accountability is another challenge that seems impossible to overcome in the case of autonomous killing.48 Current systems for holding individuals accountable for killing require a certain mens rea (mental intention), something a computer does not have. Without accountability, the importance of norms about the use of force would likely diminish.

It is already proving too easy to kill with robots.49 Giving up the decision entirely to a computer program will truly lower the barrier and remove, literally, the humanity that should come to bear in all cases of justifiable killing. From the perspective of law, morality, and strategy, it seems essential that a human being who has training and a conscience and who may be held accountable should always make the awesome, ultimate decision to kill.

Norm Building

Even if consensus is reached that the decision to kill must be made by a human being, the question arises of how such a norm is to be created. Given the history of technological development respecting armaments, it seems unlikely that we can prevent the wide availability of fully autonomous weapons. The more promising approach is indicated by the strategies used to create legal control of weapons that we already have. These include outright bans on certain types of weapons, restrictions on how and where certain weapons may be used, and limits on who may use certain weapons. Practitioners and scholars are aware of the need to engage the challenges autonomous systems pose to current law on weapons.50 Much can be found in existing law to guide the development and use of autonomous systems. Nevertheless, the essential, core norm will be a new one: a ban on removing humans too far from the "kill chain."

In international law the most common way new norms are built is through multilateral treaty negotiation. Think only of the new norms that emerged in the course of the law of the sea or International Criminal Court negotiations. This form of norm development is also evident in the area of arms control. Some of the first multinational treaties concerned weapons bans, including the dum-dum bullet and asphyxiating gases. Some of the most recent successful treaties have also concerned weapons, including the Rome Statute of the International Criminal Court, the 1997 Ottawa Convention banning landmines, and the 2008 Dublin Convention banning cluster munitions.

On the other hand, the ICRC has employed a different approach to law development in recent decades, finding and compiling in written form principles of customary international law. In 2005, as mentioned above, the ICRC published its study of customary IHL rules; in 2009, it published its Interpretative Guidance on the Notion of Direct Participation in Hostilities. The Guidance includes the heightened necessity standard discussed in the section above on lawful and ethical killing. Developing a new express norm against autonomous killing is probably best achieved by drawing on all categories of international legal norms: treaties and customary international law, as just described, but also general principles and jus cogens.

International arms control agreements such as the conventions on land mines or cluster munitions indicate some of the difficulties that lie ahead in banning fully autonomous weapons. The requirement that a human being make the kill decision within a certain time of the killing is a novel form of arms control. Moreover, the technology of autonomous killing is under development, leaving the design of rules a matter of prediction. Richard Jackson, a civilian Pentagon lawyer, has asserted that it is not possible to develop a treaty on a technology that is not yet in use.51 In fact, international law does prohibit certain future technologies. The ban on blinding laser weapons was developed before the technology came into use.52 There are analogous bans, such as the ban on all forms of human cloning, despite the fact that the technology for human cloning did not exist when the ban was adopted.53 Both of these bans rest on fundamental moral and ethical views of the technology.

Despite the challenge of regulating a developing weapons technology, there are advantages to acting before the technology is widely available. It may be possible, as in the case of blinding laser weapons, to get agreement in part because states do not yet have the technology. Additionally, the norm can be developed from existing principles used to regulate existing near-autonomous weapons systems. An incremental step in law development can enhance the perceived legitimacy of the new rule. Building on current rules overcomes the problem of regulating a hypothetical and may have a positive impact in developing new, more appropriate ethical/legal norms beyond the case of future robotic weapons. For example, land mines are a type of automatic weapon that operates by detonating under certain conditions rather than when a human being presses a button or pulls a trigger. A ban on fully autonomous killing could reinforce the existing legal and moral prohibition on land mines, even leading states not party to the land mines treaty to consider themselves bound. A ban on autonomous killing may also lead us to revisit other weapons technology, such as the intercontinental ballistic missile (ICBM), which apparently has no failsafe to interrupt it after launch. A norm against autonomous killing might necessitate retrofitting ICBMs and similar weapons. Currently a human being makes the decision to launch an ICBM, but after that the missile locks on its target, making it impossible to abort the strike. A norm against autonomous killing could drive the development of new technology that would allow a human being to abort an attack prior to the moment of impact that is now many minutes after the decision to launch.

NOTES


27. Gary E. Marchant, "International Governance of Autonomous Military Robots," Columbia Science & Technology Law Review 12 (2011): 276. The Israeli Iron Dome missile defense system is another autonomous system that does not require human intervention but allows for it. See Michael N. Schmitt and Jeffrey S. Thurnher, "Out of the Loop: Autonomous Weapon Systems and the Law of Armed Conflict," Harvard National Security Journal 4 (2013): 231.
28. United Kingdom Ministry of Defence, The UK Approach to Unmanned Aircraft Systems.
30. "While autonomous weapon systems will increasingly be capable of solving complex problems, absent dramatic improvements in artificial intelligence, humans will decide when and where to deploy the system and what parameters to embed within it. Although the subjective decisions may sometimes have to be made earlier in the targeting cycle than has traditionally been the case, this neither precludes the lawfulness of the decisions, nor represents an impediment to the lawful deployment of the systems." Schmitt and Thurnher, "Out of the Loop," 33.
31. Ibid.
32. Finn, "A Future for Drones."
33. For a general overview of the law applicable to resort to lethal force, see O'Connell, "Unlawful Killing with Combat Drones."
34. Article 51 provides: "Nothing in the present Charter shall impair the inherent right of individual or collective self-defence if an armed attack occurs against a Member of the United Nations, until the Security Council has taken measures necessary to maintain international peace and security. Measures taken by members in the exercise of this right of self-defence shall be immediately reported to the Security Council and shall not in any way affect the authority and responsibility of the Security Council under the present Charter to take at any time such action as it deems necessary in order to maintain or restore international peace and security." Charter of the United Nations, Chapter VII, http://www.un.org/en/documents/charter/chapter7.shtml.
35. Corfu Channel (United Kingdom of Great Britain and Northern Ireland v. Albania), Judgment, 1949 I.C.J. 4 (Apr. 9), http://www.icj-cij.org/docket/index.php?p1=3&p2=3&k=cd&case=1&code=cc&p3=4.
36. Judith Gardam, Necessity, Proportionality and the Use of Force by States (Cambridge: Cambridge University Press, 2004).
38. Jean-Marie Henckaerts and Louise Doswald-Beck, eds., Customary International Humanitarian Law (Cambridge: ICRC, 2005), 13. See also two other initiatives sponsored by or associated with the ICRC: International Committee of the Red Cross, "Interpretative Guidance on the Notion of Direct Participation in Hostilities under International Humanitarian Law," International Review of the Red Cross 90, no. 872 (2009); and Program on Humanitarian Policy and Conflict Research at Harvard University, HPCR Manual on International Law Applicable to Air and Missile Warfare.
40. See ... cases/europeancourtofhumanrights/nr/2615; and Juan Carlos Abella v. Argentina, Case 11.137, Report No. 55/97, Inter-American Commission on Human Rights, OEA/Ser.L/V/II.98, Doc. 6 rev., 18 November 1997, paras. 149-51, http://www1.umn.edu/humanrts/cases/1997/argentina55-97a.html (distinguishing "internal disturbances" from armed conflict on the basis of the nature and level of violence).
41. See Germain G. Grisez, "Toward a Consistent Natural Law Ethics of Killing," American Journal of Jurisprudence 15 (1970): 76, cited in David Hollenbach, Nuclear Ethics: A Christian Moral Argument (Mahwah, NJ: Paulist Press, 1983), 18-19. Hollenbach describes how the just war tradition evolved from Aquinas's position presuming that war is sinful to one presuming war is just so long as it is waged by legitimate authorities. Hollenbach argues in favor of returning to the presumption that violent warfare is morally wrong and that resort to war is justifiable only in exceptional situations. Hollenbach, Nuclear Ethics, 14-16. Hollenbach's position is consistent with current international law on the use of force, as reviewed above. See Jeremy Waldron, "Can Targeted Killing Work as a Neutral Principle?" Public Law Research Paper No. 11-20, NYU School of Law, March 24, 2011, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1788226.
42. International Committee of the Red Cross, "Interpretative Guidance on the Notion of Direct Participation in Hostilities under International Humanitarian Law," 80-81.
43. Markoff, "War Machines."
44. Finn, "A Future for Drones."
45. Anderson and Waxman, "Law and Ethics for Robot Soldiers."
49. For the ease of killing with drones, see Mary Ellen O'Connell, "Seductive Drones: Learning from a Decade of Lethal Operations," Journal of Law, Information & Science 21, no. 2 (2011): 116-39.
50. Justin McClelland, "The Review of Weapons in Accordance with Article 36 of Additional Protocol I," International Review of the Red Cross 85, no. 850 (June 2003): 408, http://www.icrc.org/eng/assets/files/other/irrc_850_mcclelland.pdf; Patrick Worsnip, "U.N. Official Calls for Study of Ethics, Legality of Unmanned Weapons," Washington Post, October 24, 2010. See "The Statement of the 2010 Expert Workshop on Limiting Armed Tele-Operated and Autonomous Systems," International Committee for Robot Arms Control (ICRAC) statement, Berlin, 2010, at http://icrac.net/statements/.
51. Jackson, "Remarks."
52. See Convention on Certain Conventional Weapons, Protocol IV, Blinding Laser Weapons (May 3, 1996).
53. "Member States are called upon to prohibit all forms of human cloning inasmuch as they are incompatible with human dignity and the protection of human life"; United Nations Declaration on Human Cloning, UN General Assembly Resolution 59/280, March 23, 2005.
54. Ann Florini, "The Evolution of International Norms," International Studies Quarterly 40 (1996): 363-89. For another view on why some norms enter the agenda of norm-promoting groups, see Charli Carpenter, Sirin Duygulu, and Anna Tomaskovic-Devey, "Explaining the Advocacy Agenda: Insights from the Human Security Network," presentation at the University of California-Irvine, Irvine, CA, May 2011, http://www.socsci.uci.edu/files/internationalstudies/docs/carpenter2011.pdf.
55. Florini, "The Evolution of International Norms," 374.
56. Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, "Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions, Philip Alston, Addendum, Study on Targeted Killings," UN Doc. A/HRC/14/24/Add.6, at 24 (May 28, 2010).
57. A 2012 poll of Americans on their attitudes toward killing with drones found that 76 percent of likely U.S. voters "approve of the use of the unmanned aircraft to kill terrorists." "Voters Are Gung-Ho for Use of Drones But Not over the United States," Rasmussen Reports Web site (survey conducted 10 February 2012), http://www.rasmussenreports.com/public_content/politics/current_events/afghanistan/voters_are_gung_ho_for_use_of_drones_but_not_over_the_united_states. This figure has declined following Senator Rand Paul's thirteen-hour filibuster against the nomination of John Brennan as director of the CIA and other consciousness-raising events, such as protests outside drone bases and media reports of young children and rescue workers being killed.
58. A recommitment to the ban on torture in the United States gained an essential boost when the National Religious Campaign Against Torture built a coalition of religious organizations and persons to protest U.S. use of torture. See http://www.nrcat.org/.
60. Ibid., 377.
61. For details on the treaty, see Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and Their Destruction (known as the Ottawa Convention), http://legal.un.org/avl/ha/cpusptam/cpusptam.html.