The Ethics & Morality of Robotic Warfare: Assessing the Debate over Autonomous Weapons

Michael C. Horowitz

Abstract: There is growing concern in some quarters that the drones used by the United States and others represent precursors to the further automation of force through the use of lethal autonomous weapon systems (LAWS). These weapons, though they do not generally exist today, have already been the subject of multiple discussions at the United Nations. Do autonomous weapons raise unique ethical questions for warfare, with implications for just war theory? This essay describes and assesses the ongoing debate, focusing on the ethical implications of whether autonomous weapons can operate effectively, whether human accountability and responsibility for autonomous systems are possible, and whether delegating life and death decisions to machines inherently undermines human dignity. The concept of LAWS is extremely broad, and this essay considers LAWS in three categories: munitions, platforms, and operational systems.

MICHAEL C. HOROWITZ is Associate Professor of Political Science at the University of Pennsylvania and Associate Director of Penn's Perry World House. He formerly worked for the U.S. Department of Defense. His publications include Why Leaders Fight (2015) and The Diffusion of Military Power: Causes and Consequences for International Politics (2010). You can follow him on Twitter @mchorowitz.

The growing use of drones on today's battlefields raises important questions about targeting and the threshold for using military force. Over ninety militaries and nonstate actors have drones of some kind, and almost a dozen of these have armed drones. In 2015, Pakistan shot down an Indian drone in the disputed Kashmir region, Turkey shot down a drone near its border with Syria, and both Nigeria and Pakistan acquired armed drones.1

The use of drones by the United States and others has led to an array of questions about the appropriateness of so-called remote-controlled warfare. Yet on the horizon is something that many fear even more: the rise of lethal autonomous weapon systems (LAWS).2 At the 2016 Convention on Certain Conventional Weapons in Geneva, over one hundred countries and nongovernmental organizations (NGOs) spent a week discussing the potential development and use of autonomous weapon systems.

© 2016 by Michael C. Horowitz
doi:10.1162/DAED_a_00409

An NGO, The Future of Life Institute, broke into the public consciousness in 2015 with a call, signed by luminaries Elon Musk and Stephen Hawking, as well as scientists around the world, to prohibit the creation of autonomous weapons.3 Two essential questions underlie the debate about autonomous weapons: first, would autonomous weapons be more or less effective than nonautonomous weapon systems? Second, does the nature of autonomous weapons raise ethical and/or moral considerations that either recommend their development or justify their prohibition? Ultimately, the unique facet distinguishing LAWS from non-LAWS is that the weapon system, not a person, selects and engages targets. Therefore, it's critical to consider whether the use of LAWS could comply broadly with the protection of life in war, a core ethical responsibility for the use of force; whether LAWS can be used in ways that guarantee accountability and responsibility for the use of force; and whether there is something about machines selecting and engaging targets that makes them ethically problematic. Given the centrality of these issues in debates about just war theory, LAWS are relevant to that body of thought as well.

This essay examines the potentially unique ethical and moral issues surrounding LAWS, as opposed to nonautonomous weapon systems, especially as they relate to just war theory, in an attempt to lay out some of the key topics for thinking about LAWS moving forward. It does not engage, however, with certain legal arguments surrounding LAWS, such as whether international humanitarian law implies that humans must make every individual life-or-death decision, or whether LAWS violate the Martens Clause of the Hague Convention by violating the dictates of the human conscience.4 Moreover, different opponents of LAWS make different arguments, as do different critics of those opponents, so there are undoubtedly subcomponents of each issue not discussed here. Most generally, this essay finds that the ethical challenges associated with autonomous weapons may vary significantly depending on the type of weapon. LAWS could fall into three categories: munitions, platforms, and operational systems. While concerns may be overstated for LAWS that will be most akin to next-generation munitions, when thinking about autonomous weapon platforms or operational systems for managing wars, LAWS raise more important questions. Caution and a realistic focus on maintaining the centrality of the human in decisions about war will be critical.

Given the use of drones by the United States and others against terrorists and insurgents around the world, there is a tendency to conflate the entire category of military robotics with specific cases of drone strikes. However, it is a mistake to focus solely on the trees and miss the vast military robotics forest. For example, as current platforms, like the RQ-4 Global Hawk, and next-generation experimental technologies, like the X-47B (United States) and Sharp Sword (China), demonstrate, drones are potentially useful for much more than simply targeted strikes, and in the future could engage in an even larger category of military missions. Moreover, the focus on drone strikes presumes that military robotics are only useful in the air. But there are a variety of missions–from uninhabited truck convoys to the Knifefish sea mine detection system to Israel's unmanned surface patrol vehicle, the Protector–in which robotic systems can play a significant role outside the context of airborne targeted killings.5

Within the realm of military robotics, autonomy is already extensively used, including in autopilot, identifying and tracking potential targets, guidance, and weapons detonation.6 Though simple autonomous weapons are already possible, there is vast uncertainty about the state of the possible when it comes to artificial intelligence and its application to war. While robots that could discriminate between a person holding a rifle and a person holding a stick still seem to be on the horizon, technology is advancing quickly. How quickly and how prepared society will be for it, though, are open questions.7 A small number of weapon systems currently have human-supervised autonomy. Many variants of the close-in weapon systems (CIWS) deployed by the U.S. military and more than two dozen militaries around the world, for example, have an automatic mode.8 Normally, the system works by having a human operator identify and target enemy missiles or planes and fire at them. However, if the number of incoming threats is so large that a human operator cannot target and fire against them effectively, the operator can activate an automatic mode whereby the computer targets and fires against the incoming threats. There is also an override switch the human can use to stop the system.

Nearly all those discussing autonomous weapons–from international organizations to governments to the Campaign to Stop Killer Robots–agree that LAWS differ fundamentally from the weapons that militaries employ today.9 While simple at first glance, this point is critical: when considering the ethical and moral challenges associated with autonomous weapons, the category only includes weapons that operate in ways appreciably different from the weapons of today.10

From a common sense perspective, defining an autonomous weapon as a weapon system that selects and engages targets on its own makes intuitive sense. Moreover, it is easy to describe, at the extremes, what constitutes an autonomous weapon. While a "dumb" bomb launched by a B-29 in World War II is not an autonomous weapon, a hunter-killer drone making decisions about who to target and when to fire weapons via algorithm clearly is. In between these extremes, however, is a vast and murky gulf–from incremental advances on the precision-guided weapons of today to humanoid robots stalking the earth–that complicates our thinking about the ethical and moral challenges associated with LAWS and the implications for just war theory.

In 2012, the U.S. Department of Defense (DOD) defined an autonomous weapon as "A weapon system that, once activated, can select and engage targets without further intervention by a human operator."11 The DOD further distinguished between autonomous weapons, human-supervised autonomous weapons (that is, autonomous weapons that feature a human "on the loop" who possesses an override switch), and semiautonomous weapons, or "a weapon system that, once activated, is intended to only engage individual targets or specific target groups that have been selected by a human operator."12 NGOs, such as Human Rights Watch, have generally adopted similar definitions.13 This essay does as well, considering lethal autonomous weapon systems as weapon systems that, once activated, are designed to select and engage targets not previously designated by a human.14 Defining what it means to select and engage targets is complicated, however. For example, if homing munitions are considered to "select and engage" targets, then autonomous weapons have existed since World War II.

Resolving the definitional debate is beyond the scope of this essay. But even if there is not clear agreement on exactly what constitutes an autonomous weapon, breaking down LAWS into three "types" of potential autonomous weapons–munitions, platforms, and operational systems–can potentially help move the discussion forward, revealing the ethical, moral, and strategic issues that might exist for each.15
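The definitional distinctions above lend themselves to a compact illustration. The Python sketch below is purely a reader's aid, not a description of any real weapon system: the mode names paraphrase the DOD directive's categories, while the function, parameters, and decision rule are hypothetical assumptions introduced here for clarity.

    from enum import Enum, auto

    class ControlMode(Enum):
        # Paraphrasing the DOD Directive 3000.09 categories (illustrative only).
        SEMIAUTONOMOUS = auto()    # human "in the loop": engages only human-selected targets
        HUMAN_SUPERVISED = auto()  # human "on the loop": selects targets itself, override available
        AUTONOMOUS = auto()        # selects and engages without further human intervention

    def may_engage(mode, target_selected_by_human, override_engaged):
        """Hypothetical decision rule showing where the human sits in each mode."""
        if override_engaged:
            # A human "on the loop" can always halt the system.
            return False
        if mode is ControlMode.SEMIAUTONOMOUS:
            # Engages only individual targets or target groups a human operator selected.
            return target_selected_by_human
        # Supervised and fully autonomous modes may select targets themselves; the
        # difference between them is whether the override above exists in practice.
        return True

    # A CIWS-like system switched into automatic mode behaves like human-supervised
    # autonomy: it engages incoming threats it selects itself, unless overridden.
    assert may_engage(ControlMode.HUMAN_SUPERVISED, False, False)
    assert not may_engage(ControlMode.SEMIAUTONOMOUS, False, False)

On this toy rendering, the ethically salient variable is not the sophistication of the software but which branch conditions an engagement on a prior human choice.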

At the munitions level, there are already many semiautonomous weapons today. The advanced medium-range air-to-air missile (AMRAAM), for example, deployed by the United States and several militaries around the world, is a "fire and forget" missile: after it is launched, it uses internal navigation and radar to find and destroy a target. AMRAAM engagements generally happen beyond visual range, with the pilot making the decision to launch an AMRAAM based on long-range radar data, not visual cues. The AMRAAM is not considered inherently problematic from an ethical perspective, nor is it considered an autonomous weapon.16 Some fully autonomous weapons at the munitions level arguably already do exist, though, including the Israeli Harpy, a loitering cruise missile designed to detect and destroy a certain type of radar.17

The next level of military system aggregation is the platform. An example of an autonomous weapon system platform would be a ship or plane capable of selecting targets and firing munitions at those targets on its own. There are almost no platform-level LAWS currently deployed, but the CIWS systems that protect ships and military bases from attack are arguably an exception. Like the AMRAAM, countries have used these weapon systems for decades without opposition. However, an example of a platform-level LAWS that does not currently exist–and which no military appears to be planning to build–is an autonomous version of the MQ-9 Reaper (United States) or the CH-4 (China) drones. Imagine a drone identical from the exterior, but with software that allows it, after activation by a human operator, to fly around the world and target a particular individual or groups of individuals and fire missiles at them, much as human-piloted drones do today.18

The broadest type of LAWS would be a military operations planning system in which a machine learning system would substitute, in a way, for military leaders and their staffs in planning operations. No LAWS at the operational level appear to exist, even in research and development, though it is possible to imagine militaries wanting to leverage potential insights from machine learning models as they conduct planning. In this scenario, upon deciding to fight a war–or perhaps even deciding whether to fight a war–a human would activate an autonomous system that could assess the probability of winning a war, decide whether to attack, plan an operation, and then direct other systems–whether human or robotic–to engage in particular attacks. This category is the furthest away from reality in terms of technology and is the one that most invokes images of robotic weapon systems in movies such as The Terminator or The Matrix.

Some worry that autonomous weapons will be inherently difficult to use in ways that discriminate between combatants and noncombatants and only take life when necessary. An inability to discriminate would violate just war theory as well as the law of war. Consequently, some worry that autonomous weapons will be uncontrollable–prone to errors and unable to operate predictably.19 Moreover, even if LAWS meet basic law of war requirements, they could create safety and control problems. Their very strength–the reliability of their programming relative to humans–could make them fragile when facing operating environments outside of their programming. At the extreme, unpredictable algorithms interacting as multiple countries deploy autonomous weapons could risk the military version of the 2010 stock market "flash crash" caused by high-frequency trading algorithms.20
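The flash-crash analogy can be made concrete with a deliberately toy model. The sketch below assumes nothing about real systems; the alert scale, the one-step escalation rule, and the initial sensor misread are all invented for illustration.

    def react(own_alert, observed_alert):
        # Hypothetical doctrine: match and slightly exceed a threatening posture.
        if observed_alert > own_alert:
            return min(observed_alert + 1, 10)
        return own_alert

    a = b = 0  # two automated systems, alert levels on a 0-10 scale
    for step in range(8):
        observed = b + (1 if step == 0 else 0)  # a single sensor misread at the outset
        a = react(a, observed)
        b = react(b, a)
        print(step, a, b)
    # Alert levels climb 2,3 -> 4,5 -> ... -> 10,10: one spurious reading,
    # amplified by two interacting decision rules, drives both sides to maximum
    # readiness without any human choosing to escalate.

Each rule is individually simple and perfectly predictable in isolation; the unpredictability the paragraph above warns about emerges only from the interaction.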

Additionally, opponents of LAWS argue that autonomous weapons will necessarily struggle with judgment calls because they are not human.21 For example, a human soldier might have empathy and use judgment to decide not to kill a lawful combatant putting down a weapon or who looks like they are about to give up, while a robotic soldier would follow its order, killing the combatant. This could make it harder to use LAWS justly.22

Autonomous weapons also potentially raise jus in bello questions concerning conduct in war from a just war perspective. For example, LAWS that are unable to respect benevolent quarantine for prisoners would violate core just war principles, though their inability to comply means responsible militaries would not deploy them in those situations. This is precisely why it makes the most sense to think about autonomous weapons in comparison with existing weapons in realistic scenarios.

These are also empirical questions, though convincing evidence is difficult to gather because these weapon systems generally do not yet exist. Moreover, even beyond the uncertainty about the technological range of the possible, many of these arguments can be made in both directions. For example, those less worried about LAWS could contend that the arguments above consider improbable scenarios, because militaries are unlikely to deploy inherently unpredictable weapons that would be less likely to accomplish missions than non-LAWS.23 In this sense, it's possible that militaries would purposefully decide not to deploy LAWS unless they believed those LAWS could operate with the ability to discriminate and follow the law of war. LAWS might also be more effective and ethical on the battlefield than other nonautonomous alternatives. Human soldiers kill unnecessarily on the battlefield, up to and including war crimes, for a variety of reasons, including rage, revenge, and errors from fatigue. One theoretical benefit of LAWS is that, as machines that do not get tired or (presumably) experience emotion, LAWS would almost certainly fire more accurately and discriminate perfectly according to their programming. According to scholars like Ronald Arkin, this could make these types of war crimes and the killing of civilians by human soldiers less likely.24

How would these theoretical benefits and drawbacks stack up? Given the current state of the technology in question, we can only speculate about the extent to which these matters are likely to be more or less serious for the three possible categories of autonomous weapon systems described above.

For munitions, most imaginable LAWS are less likely to create inherent effectiveness challenges beyond those of current weapons in terms of controllability. There is still a human operator launching the munition and making a decision about the necessity of firing upon a target or set of targets. Autonomy may help ensure that the weapon hits the correct target–or gets to the target, if autonomy enables a munition to avoid countermeasures. In this case, there is not a significant difference, from an ethical perspective, between an autonomous weapon, a semiautonomous weapon, or arguably even a bullet, because a person is making the choice to launch the munition based on what is presumably sufficient information. For example, Israel's Harpy may be problematic because the system will destroy its target whether that target is on top of a school or on a military base, but it is not executing a complicated algorithm that makes it inherently unpredictable. Practically, militaries are very unlikely to use LAWS at the munitions level unless they are demonstrably better than semiautonomous weapons, precisely for reasons of controllability.

It is, of course, possible to imagine futuristic versions of munitions that would be more complicated. Autonomous cruise missiles that can loiter for days, instead of hours, and travel around the world, programmed to target particular individuals or ships when they meet certain criteria, could raise other questions. This is one example of how context based on geography and time may influence the appropriateness and desirability of autonomous weapon systems in a given situation.

It is at the platform and the operational levels that disquiet about discrimination and controllability becomes more complex. A LAWS platform deployed in a confined geographical space in a clear war zone may not (depending on the programming) be inherently problematic, but there are other mission sets–like patrolling autonomous drones searching for insurgents–that would lead to much greater risk from a controllability perspective. Essentially, complications, and thus the potential for fragility, will increase as the machine has to do more "work" in the area of discrimination.

At the operational battle-management level, it is difficult to imagine militaries having enough trust to delegate fundamental operational planning roles to algorithms, though they could become supplemental sources of information. Delegating those roles, however, could create large-scale ethical concerns from the consequences of those actions, in part because they might be harder to predict. Operational planning LAWS could make choices or calculate risks in novel ways, leading to actions that are logical according to their programming, but are not predictable to the humans carrying out those orders. Operational planning LAWS also connect most directly to the types of existential risks raised by Hawking and others.

One of the key arguments made by opponents of LAWS is that, because LAWS lack meaningful human control, they create a moral (and legal) accountability gap.25 If they malfunction or commit war crimes, there is no single person to hold accountable the way a drone operator, pilot in the cockpit, or ground team would be accountable today. This is potentially unique to LAWS. Remotely piloted military robotics do not appear to create excessive moral distance from war at the operator level. For example, new research shows that drone pilots actually suffer from posttraumatic stress disorder at similar rates to pilots in the cockpit.26

There is still nervousness, however, that drones already make war too "easy" for political leaders. Autonomous weapons raise similar fears, just as indirect fire and manned airpower did in the past.27 The core fear is that LAWS will allow leaders and soldiers not to feel ethically responsible for using military force because they do not understand how the machine makes decisions and they are not accountable for what the machine does.

LAWS may substitute for a human soldier, but they cannot be held accountable the way a human soldier is held accountable.28 Imagine, for example, deploying a robot soldier in a ground combat mission to clear a building that is suspected to house insurgents. If that robotic soldier commits a war crime, indiscriminately executing noncombatants, who is responsible? The responsible party could be the programmer, but what if the programmer never imagined that particular situation? The responsible party could be the commander who ordered the activation of the weapon, but what if the weapon behaved in a way that the commander could not have reasonably predicted?

On the other side of the debate, part of the problem is imagining LAWS as agents, rather than tools. The human operator that fires a LAWS munition or activates a LAWS platform still has an obligation to ensure the system will perform in an ethically appropriate fashion to the best of anyone's ability to predict, just as with today's weapons.29 Thus, planning and training become critical to avoiding a responsibility gap. By ensuring that potential operators of LAWS understand how they operate–and feel personally accountable for their use–militaries can theoretically avoid offloading moral responsibility for the use of force.

Formal rules could ensure accountability. One solution in the case of the ground combat situation described above is to hold the commander accountable for war crimes committed by the robotic soldier, just as commanders today are generally held accountable for war crimes committed by their unit.30 This leads to fairness considerations, though: if the robotic soldier malfunctions, and it is not the fault of the commander, is it fair to hold the commander accountable? Arguably not, though commander accountability for LAWS would create a strong incentive for commanders only to use LAWS when they have a high degree of confidence in their situational appropriateness. Analogies from legal regimes, such as vicarious liability, could also prove useful. Thus, while accountability and responsibility issues are relevant topics, it is not clear that they are irresolvable. Additionally, accidents with nonautonomous and semiautonomous weapons happen today and raise accountability questions. In a 2003 incident in which a U.S. Patriot missile battery shot down allied aircraft, no one was personally held accountable for the system malfunction. Should the accountability requirements for LAWS be higher than for other weapon systems?

Considering this argument in both directions, it makes sense again to see how these concerns might vary across different types of LAWS. At the munitions level, ensuring legal accountability and moral responsibility should be relatively close, if not identical, to the use of semiautonomous weapons today. There will still be human operators firing the munitions in ways that they believe are legitimate; the guidance systems for the munitions would just operate somewhat differently. Adaptations of existing accountability regimes therefore seem plausible.

The platform level will place the largest amount of stress on potential training and planning to avoid offloading accountability when using LAWS. While there is still a person that will have to activate and launch an autonomous weapons platform, if that person lacks sufficient understanding of the mission or how the LAWS will operate to complete the mission, it could lead to a responsibility gap. Such a gap does not seem inevitable, however, presuming the construction of clear rules and training.

At the operational system level, the use of LAWS creates a real and significant risk of moral offloading. Operational planning conducted by an algorithm–rather than the algorithm being one input into human judgment–is precisely the type of situation in which human accountability for war would decline and humans might cease to feel responsible for the casualties caused by war. This is a significant ethical concern on its own and would raise large questions in terms of just war theory.

Establishing the line at which the human is so removed from the targeting decision that it makes the use of force a priori unjust is complex from a just war perspective, however. Imagine a case in which the human is entirely removed from the targeting and firing process, but the outcome is a more precise military engagement. On the one hand, such an engagement would almost certainly meet basic jus in bello requirements, but one might also argue that the removal of human agency from the process is ethically defective. This is a tricky question, and one worth further consideration.

The last major ethical argument about LAWS is whether they might be inherently problematic because they dehumanize their targets. All human life is precious and has intrinsic value, so having machines select and engage targets arguably violates fundamental human dignity–people have the right to be killed by someone who made the choice to kill them. Since machines are not moral actors, automating the process of killing through LAWS is also by definition unethical, or as technology philosopher Peter Asaro has put it: "justice itself cannot be delegated to automated processes."31 LAWS might therefore be thought of as mala in se, or evil in themselves, under just war theory.

If a machine without intentions or morality makes the decision to kill, it makes us question why the victim died.32 This argument has moral force. As human rights legal scholar Christof Heyns argues: "Decisions over life and death in armed conflict may require compassion and intuition."33 There is something unnerving about the idea of machines making the decision to kill. The United Nations Institute for Disarmament Research describes it as "an instinctual revulsion against the idea of machines 'deciding' to kill humans."34 The concern of opponents of LAWS is that machines making decisions about killing leads to a "vacuum of moral responsibility": the military necessity of killing someone is a subjective decision that should inherently be made by humans.35

On the other side, all who enter the military understand the risks involved, including the potential to die; what difference does the how make once you are dead? In an esoteric sense, there may be something undignified about dying at the hands of a machine, but why is being shot through the head or heart and instantly killed by a machine necessarily worse than being bludgeoned by a person, lit on fire, or killed by a cruise missile strike? The dignity argument has emotional resonance, but it may romanticize warfare. Humans have engaged in war on an impersonal and industrial scale since at least the nineteenth century: from the nearly sixty thousand British casualties on the first day of the Battle of the Somme to the firebombing of Tokyo and beyond.

Looking at the three categories of possible LAWS again reveals potential differences between them with regard to the question of human dignity. At the munitions level, LAWS seem unlikely to generate significant human dignity questions beyond those posed by existing weapon systems, at least based on the current technological world of the possible. Since the decision-making process for the use of force would be similar, if not identical, to the use of force today, the connection between the individual firing the weapon and those affected would not change.36

At the platform level, LAWS again require deeper consideration, because it is with LAWS platforms that the system begins calculating whether to use force. The extent to which they may be problematic from a human dignity perspective may again depend on how they are used. The use of platform-level LAWS in an antimateriel role against adversary ships or planes on a clear battlefield would be different than in an urban environment. Moreover, as the sophistication of LAWS grows, they could increase the risk of dehumanizing targets. Returning to the case of the Harpy: at present, it is clearly up to the person launching the missile to make sure there is a lawful radar target that the Harpy can engage. A platform with the ability to make choices about whether the radar is a lawful target (for example, is the radar on top of a hospital?) would be better at discrimination, making it ethically preferable in some ways, but also raising questions from the perspective of the human dignity argument; it is the machine, rather than a person, making the targeting decision.37

The human dignity argument arguably also applies less to platforms that defend a fixed position from attack. Electric fences are not ethically problematic as a category if labeled clearly and used in areas where any intrusion is almost by definition a hostile action.38 Or to take another example, South Korea deploys a system called the SGR-1 pointed at the demilitarized zone with North Korea. The system has some automatic targeting features, though the specifics are unclear. However, since the system is deployed in a conflict zone and can only aim at targets that would almost certainly be lawful combatants, this is arguably less problematic than LAWS platforms employed as part of an assault operation.

LAWS pose the largest challenges to human dignity at the operational system level, though the relationship to just war theory is more ambiguous. An operational-level LAWS making decisions about whether and how to conduct a military campaign certainly involves offloading moral responsibility for the use of force to a machine. Oddly, though, imagine a case in which an operational-level LAWS designed a battle plan implemented by humans. In that case, the machine is taking the place of a high-level military commander, but humans are selecting and engaging targets on the ground. Would this be less problematic, ethically, than a hunter-killer drone searching for individuals or groups of insurgents? It sounds odd, but this example points to the complexities of assessing these issues.

The debate is just beginning, and this essay attempts to address the broad ethical issues potentially associated with the development of autonomous weapons, a class of weapons that, with a few exceptions, do not yet exist. While technological trends suggest that artificial intelligence is rapidly advancing, we are far from the realm of dystopian science fiction scenarios. Of course, how quickly the technology will develop remains to be seen.

Do autonomous weapons create novel issues from an ethical perspective, especially regarding just war theory? Excluding technologically implausible scenarios of autonomous operational battle systems deciding to go to war, autonomous weapons are unlikely to lead to jus ad bellum problems from a traditional just war perspective, setting aside the risk that LAWS will make going to war so easy that political leaders will view unjust wars as costless and desirable. One could argue that since machines cannot have intentions, they cannot satisfy the jus ad bellum requirement for right intentions. Yet this interpretation would also mean that broad swaths of precision-guided modern semiautonomous weapons that dramatically reduce civilian suffering in war arguably violate the individual intentionality proposition, given the use of computerized targeting and guidance. Presumably no one would rather the world return to the age of the "dumb bombs" used in World War II. Overall, it is critical to understand that there is the possibility for significant diversity within the subset of autonomous weapons, in particular, whether one is discussing a munition with greater autonomy in engaging a target versus a platform or operational system.

At the level of the munition, where LAWS might represent missiles programmed to attack particular classes of targets (such as an amphibious landing craft) in a given geographic space, the relevant ethical issues appear similar to those regarding today's weapons. The process of using force–and responsibility for using force–would likely look much the same as it does today for drone strikes or the use of other platforms that launch precision-guided munitions. The key will be how munitions-based LAWS are used.

It is at the platform level that the ethical challenges of LAWS begin to come into focus. Autonomous planes, for example, flying for thousands of miles and deciding for themselves whom to target, could risk the moral offloading of responsibility and undermine human dignity in some scenarios, even if they behave in ways that comply with the law of war. While it is possible to address this issue through training, accountability rules, and restricting the scenarios for using autonomous weapon platforms, this area requires further investigation.

Autonomous operational systems using algorithms to decide whether to fight and how to conduct operations, besides being closest to the robotic weapon systems of movies and television, could create more significant moral quandaries. Given full authority (as opposed to supplementing human judgment), operational system LAWS would make humans less relevant, from an ethical perspective, in major wartime decision-making. Fortunately, these types of systems are far from the technological range of the possible, and humans are quite unlikely to want to relinquish that level of control over war, meaning the real-world systems that require deeper thought over the next several years are LAWS at the munition and platform levels. Finally, just war theory provides an interesting lens through which to view LAWS: could they lead to a world in which humans are more removed from the process of warfare than ever before, while warfare itself becomes more precise and involves less unnecessary suffering? These are complicated questions regarding the appropriate role for humans in war, informed by how we balance evaluating LAWS based on a logic of consequences versus evaluating them based on a logic of morality. It will be critical to ensure in any case that the human element remains a central part of warfare.

ENDNOTES

Author's Note: Thank you to Michael Simon and all the workshop participants at West Point, along with Paul Scharre, for their feedback. All errors are the sole responsibility of the author.

1 Michael C. Horowitz, Sarah E. Kreps, and Matthew Fuhrmann, "The Consequences of Drone Proliferation: Separating Fact from Fiction," working paper (Philadelphia: University of Pennsylvania, 2016).

2 For the purposes of this essay, I use the phrases autonomous weapon, autonomous weapon system, and lethal autonomous weapon system interchangeably.

3 See the Campaign to Stop Killer Robots, http://www.stopkillerrobots.org/; and the Future of Life Institute, "Autonomous Weapons: An Open Letter from AI and Robotics Researchers," http://futureoflife.org/AI/open_letter_autonomous_weapons.

4 For example, see the discussion in Peter Asaro, "On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making," International Review of the Red Cross 94 (886) (2012): 687–709; Charli Carpenter, "How Do Americans Feel About Fully Autonomous Weapons?" Duck of Minerva, June 10, 2013, http://duckofminerva.com/2013/06/how-do-americans-feel-about-fully-autonomous-weapons.html; Michael N. Schmitt, "Autonomous Weapon Systems and International Humanitarian Law: A Reply to the Critics," Harvard National Security Journal (2013); and Michael C. Horowitz, "Public Opinion and the Politics of the Killer Robots Debate," Research & Politics (forthcoming).

5 Michael C. Horowitz, "The Looming Robotics Gap: Why America's Global Dominance in Military Technology Is Starting to Crumble," Foreign Policy Magazine (May/June 2014), http://foreignpolicy.com/2014/05/05/the-looming-robotics-gap/. Put another way, discussions of banning drones because they are used for targeted killing conflate the act of concern (targeted killings) with the means (drones), when other means exist. It would be like banning the airplane in the early twentieth century because of targeted killing.

6 Paul Scharre and Michael C. Horowitz, "An Introduction to Autonomy in Weapon Systems," working paper (Washington, D.C.: Center for a New American Security, 2015), 7, http://www.cnas.org/sites/default/files/publications-pdf/Ethical%20Autonomy%20Working%20Paper_021015_v02.pdf.

7 For one example, see Stuart Russell, "Artificial Intelligence: Implications for Autonomous Weapons," presentation at the Convention on Certain Conventional Weapons, Geneva, 2015.

8 U.S. military examples include the Phalanx and C-RAM.

9 Human Rights Watch, Losing Humanity: The Case against Killer Robots (New York: Human Rights Watch, 2012), http://www.hrw.org/sites/default/files/reports/arms1112_ForUpload.pdf.

10 It is possible, of course, to use today's weapons in ethically problematic ways, but that is beyond the scope of this essay.

11 U.S. Department of Defense, Directive on Autonomy in Weapons Systems, Number 3000.09 (Washington, D.C.: U.S. Department of Defense, 2012), 13, http://www.dtic.mil/whs/directives/corres/pdf/300009p.pdf.

12 Ibid., 14.

13 Human Rights Watch, Losing Humanity.

14 This builds on the definition in Scharre and Horowitz, "An Introduction to Autonomy in Weapon Systems," 16. The phrase "not previously designated by a human" helps reconcile the fact that the use of weapons sometimes involves firing multiple munitions at multiple targets.

15 Another interesting possibility is to classify LAWS based on the types of autonomy they possess. See Heather Roff, "The Forest for the Trees: Autonomous Weapons and 'Autonomy' in Weapons Systems," working paper, June 2016.

16 This discussion is similar to ibid., 11.

17 Peter J. Spielmann, "Israeli Killer Robots Could Be Banned under UN Proposal," The Times of Israel, May 3, 2013, http://www.timesofisrael.com/israeli-killer-robots-could-be-banned-under-un-proposal/.

18 The X-47B, a U.S. Navy experimental drone, has autonomous piloting, but not automated weapon systems.

19 Human Rights Watch, Losing Humanity.

20 Michael C. Horowitz and Paul Scharre, "The Morality of Robotic War," The New York Times, May 26, 2015, http://www.nytimes.com/2015/05/27/opinion/the-morality-of-robotic-war.html. Also see Paul Scharre, Autonomous Weapons and Operational Risk (Washington, D.C.: Center for a New American Security, February 2016), http://www.cnas.org/sites/default/files/publications-pdf/CNAS_Autonomous-weapons-operational-risk.pdf.

21 Aaron M. Johnson and Sidney Axinn, "The Morality of Autonomous Robots," Journal of Military Ethics 12 (2) (2013): 137.

22 Michael Walzer, Just and Unjust Wars: A Moral Argument with Historical Illustrations, 4th ed. (New York: Basic Books, 1977), 142–143.

23 This is particularly true given that drones and other remotely piloted military robotics options exist.

24 Ronald C. Arkin, Governing Lethal Behavior in Autonomous Robots (Boca Raton, Fla.: CRC Press, 2009).

25 Wendell Wallach, A Dangerous Master: How to Keep Technology from Slipping Beyond Our Control (New York: Basic Books, 2015); and Human Rights Watch, Mind the Gap: The Lack of Accountability for Killer Robots (New York: Human Rights Watch, 2015), https://www.hrw.org/report/2015/04/09/mind-gap/lack-accountability-killer-robots.

26 James Dao, "Drone Pilots Are Found to Get Stress Disorders Much as Those in Combat Do," The New York Times, February 22, 2013, http://www.nytimes.com/2013/02/23/us/drone-pilots-found-to-get-stress-disorders-much-as-those-in-combat-do.html.

27 Kenneth Anderson, Daniel Reisner, and Matthew C. Waxman, "Adapting the Law of Armed Conflict to Autonomous Weapon Systems," International Law Studies 90 (2014): 391–393; and Kenneth Anderson and Matthew C. Waxman, "Law and Ethics for Autonomous Weapon Systems: Why a Ban Won't Work and How the Laws of War Can," Jean Perkins Task Force on National Security and Law Essay Series (Stanford, Calif.: Stanford University, The Hoover Institution, April 10, 2013).

28 Robert Sparrow, "Killer Robots," Journal of Applied Philosophy 24 (1) (2007): 62–77.

29 Horowitz and Scharre, "The Morality of Robotic War."

30 This can vary depending on the specific situation, but the general point is clear.

31 Asaro, "On Banning Autonomous Weapon Systems," 701.

32 United Nations Institute for Disarmament Research, The Weaponization of Increasingly Autonomous Technologies: Considering Ethics and Social Values (Geneva: United Nations Institute for Disarmament Research, 2015), 9, http://www.unidir.org/files/publications/pdfs/considering-ethics-and-social-values-en-624.pdf.

33 United Nations General Assembly, Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions (A/HRC/23/47), April 9, 2013, http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.

34 United Nations Institute for Disarmament Research, Weaponization of Increasingly Autonomous Technologies, 7–8.

35 United Nations General Assembly, Report of the Special Rapporteur, 17.

36 This is arguably why munitions-based LAWS may not really be LAWS at all, depending on the definition.

37 Thanks to Paul Scharre for making this point clear to me in a personal conversation.

38 Johnson and Axinn, "The Morality of Autonomous Robots," 131.
