The Ethics & Morality of Robotic Warfare: Assessing the Debate over Autonomous Weapons
Michael C. Horowitz

Abstract: There is growing concern in some quarters that the drones used by the United States and others represent precursors to the further automation of military force through the use of lethal autonomous weapon systems (LAWS). These weapons, though they do not generally exist today, have already been the subject of multiple discussions at the United Nations. Do autonomous weapons raise unique ethical questions for warfare, with implications for just war theory? This essay describes and assesses the ongoing debate, focusing on the ethical implications of whether autonomous weapons can operate effectively, whether human accountability and responsibility for autonomous weapon systems are possible, and whether delegating life and death decisions to machines inherently undermines human dignity. The concept of LAWS is extremely broad and this essay considers LAWS in three categories: munition, platforms, and operational systems.

MICHAEL C. HOROWITZ is Associate Professor of Political Science at the University of Pennsylvania and Associate Director of Penn's Perry World House. He formerly worked for the U.S. Department of Defense. His publications include Why Leaders Fight (2015) and The Diffusion of Military Power: Causes and Consequences for International Politics (2010). You can follow him on Twitter @mchorowitz.

© 2016 by Michael C. Horowitz. doi:10.1162/DAED_a_00409

The growing use of drones on today's battlefields raises important questions about targeting and the threshold for using military force. Over ninety militaries and nonstate actors have drones of some kind and almost a dozen of these have armed drones. In 2015, Pakistan shot down an Indian drone in the disputed Kashmir region, Turkey shot down a drone near its border with Syria, and both Nigeria and Pakistan acquired armed drones.1

The use of drones by the United States and others has led to an array of questions about the appropriateness of so-called remote-controlled warfare. Yet on the horizon is something that many fear even more: the rise of lethal autonomous weapon systems (LAWS).2 At the 2016 Convention on Certain Conventional Weapons in Geneva, over one hundred countries and nongovernmental organizations (NGOs) spent a week discussing the potential development and use of autonomous weapon systems. An NGO, The Future of Life Institute, broke into the public consciousness in 2015 with a call, signed by luminaries Elon Musk and Stephen Hawking, as well as scientists around the world, to prohibit the creation of autonomous weapons.3
Two essential questions underlie the debate about autonomous weapons: first, would autonomous weapons be more or less effective than nonautonomous weapon systems? Second, does the nature of autonomous weapons raise ethical and/or moral considerations that either recommend their development or justify their prohibition? Ultimately, the unique facet distinguishing LAWS from non-LAWS is that the weapon system, not a person, selects and engages targets. Therefore, it is critical to consider whether the use of LAWS could comply broadly with the protection of life in war, a core ethical responsibility for the use of force; whether LAWS can be used in ways that guarantee accountability and responsibility for the use of force; and whether there is something about machines selecting and engaging targets that makes them ethically problematic. The centrality of these issues in debates about just war theory makes the issue of LAWS relevant to just war theory as well.

This essay examines the potentially unique ethical and moral issues surrounding LAWS, as opposed to nonautonomous weapon systems, especially as they relate to just war theory, in an attempt to lay out some of the key topics for thinking about LAWS moving forward. It does not engage, however, with certain legal arguments surrounding LAWS, such as whether international humanitarian law implies that humans must make every individual life-or-death decision, or whether LAWS violate the Martens Clause of the Hague Convention by violating the dictates of the human conscience.4 Moreover, different opponents of LAWS make different arguments, as do different critics of those opponents, so there are undoubtedly subcomponents of each issue not discussed here. Most generally, this essay finds that the ethical challenges associated with autonomous weapons may vary significantly depending on the type of weapon. LAWS could fall into three categories: munition, platforms, and operational systems. While concerns may be overstated for LAWS that will be most akin to next-generation munitions, when thinking about autonomous weapon platforms or operational systems for managing wars, LAWS raise more important questions. Caution and a realistic focus on maintaining the centrality of the human in decisions about war will be critical.

Given the use of drones by the United States and others against terrorists and insurgents around the world, there is a tendency to conflate the entire category of military robotics with specific cases of drone strikes. However, it is a mistake to focus solely on the drone strike trees and miss the vast military robotics forest. For example, as current platforms, like the RQ-4 Global Hawk, and next-generation experimental technologies, like the X-47B (United States) and Sharp Sword (China), demonstrate, drones are potentially useful for much more than simply targeted strikes, and in the future could engage in an even larger category of military missions. Moreover, the focus on drone strikes presumes that military robotics are only useful in the air. But there are a variety of missions–from uninhabited truck convoys to the Knifefish sea mine detection system to Israel's unmanned surface patrol vehicle, the Protector–in which robotic systems can play a significant role outside the context of airborne targeted killings.5
Within the realm of military robotics, autonomy is already extensively used, including in autopilot, identifying and tracking potential targets, guidance, and weapons detonation.6 Though simple autonomous weapons are already possible, there is vast uncertainty about the state of the possible when it comes to artificial intelligence and its application to militaries. While robots that could discriminate between a person holding a rifle and a person holding a stick still seem to be on the horizon, technology is advancing quickly. How quickly and how prepared society will be for it, though, are open questions.7

A small number of weapon systems currently have human-supervised autonomy. Many variants of the close-in weapon systems (CIWS) deployed by the U.S. military and more than two dozen militaries around the world, for example, have an automatic mode.8 Normally, the system works by having a human operator identify and target enemy missiles or planes and fire at them. However, if the number of incoming threats is so large that a human operator cannot target and fire against them effectively, the operator can activate an automatic mode whereby the computer targets and fires against the incoming threats. There is also an override switch the human can use to stop the system.

Nearly all those discussing autonomous weapons–from international organizations to governments to the Campaign to Stop Killer Robots–agree that LAWS differ fundamentally from the weapons that militaries employ today.9 While simple at first glance, this point is critical: when considering the ethical and moral challenges associated with autonomous weapons, the category only includes weapons that operate in ways appreciably different from the weapons of today.10

From a common sense perspective, defining an autonomous weapon as a weapon system that selects and engages targets on its own makes intuitive sense. Moreover, it is easy to describe, at the extremes, what constitutes an autonomous weapon. [...] making decisions about who to target and when to fire weapons via algorithm clearly is. In between these extremes, however, is a vast and murky gulf–from incremental advances on the precision-guided weapons of today to humanoid robots stalking the earth–that complicates our thinking about the ethical and moral challenges associated with LAWS and the implications for just war theory.

In 2012, the U.S. Department of Defense (DOD) defined an autonomous weapon as "A weapon system that, once activated, can select and engage targets without further intervention by a human operator."11 The DOD further distinguished between autonomous weapons, human-supervised autonomous weapons (that is, autonomous weapons that feature a human "on the loop" who possesses an override switch), and semiautonomous weapons, or "a weapon system that, once activated, is intended to only engage individual targets or specific target groups that have been selected by a human operator."12 NGO groups, such as Human Rights Watch, have generally adopted similar definitions.13 This essay does as well, considering lethal autonomous weapon systems as weapon systems that, once activated, are designed to select and engage targets not previously designated by a human.14 Defining what it means to select and engage targets is complicated, however. For example, if homing munitions are considered to "select and engage" targets, then autonomous weapons have existed since World War II.

Resolving the definitional debate is beyond the scope of this essay. But even if there is not a clear agreement on exactly what constitutes an autonomous weapon, breaking down LAWS into three "types" of potential autonomous weapons–munition, platforms, and operational systems–