Lethal Autonomous Systems and the Plight of the Non-combatant

Ronald Arkin (Georgia Institute of Technology)

It seems a safe assumption, unfortunately, that humanity will persist in conducting warfare, as evidenced over all recorded history. New technology has historically made killing more efficient, for example with the invention of the longbow, artillery, armored vehicles, aircraft carriers, or nuclear weapons. Many view that each of these new technologies has produced a Revolution in Military Affairs (RMA), as they have fundamentally changed the ways in which war is waged. Many now consider robotics technology a potentially new RMA, especially as we move towards more and more autonomous1 systems in the battlefield.

Robotic systems are now widely present in the modern battlefield, providing intelligence gathering, surveillance, reconnaissance, target acquisition, designation and engagement capabilities. Limited autonomy is also present or under development in many systems as well, ranging from the Phalanx system "capable of autonomously performing its own search, detect, evaluation, track, engage and kill assessment functions"2, to fire-and-forget munitions, loitering torpedoes, and intelligent antisubmarine or anti-tank mines, among numerous other examples. Continued advances in autonomy will result in changes involving tactics, precision, and just perhaps, if done correctly, a reduction in atrocities, as outlined in research conducted at the Georgia Tech Mobile Robot Laboratory (GT-MRL)3.

This paper asserts that it may be possible to ultimately create intelligent autonomous robotic military systems capable of reducing civilian casualties and property damage when compared to the performance of human warfighters. Thus, it is a contention that calling for an outright ban on this technology is premature, as some groups already are doing4. Nonetheless, if this technology is to be deployed, then restricted, careful and graded introduction into the battlefield of lethal autonomous systems must be standard policy as opposed to haphazard deployments, which I believe is consistent with existing International Humanitarian Law (IHL).

Multiple potential benefits of intelligent war machines have already been declared by the military, including: a reduction in friendly casualties; force multiplication; expanding the battlespace; extending the warfighter's reach; the ability to respond faster given the pressure of an ever increasing battlefield tempo; and greater precision due to persistent stare [constant video surveillance that enables more time for decision making and more eyes on target]. This argues for the inevitability of development and deployment of lethal autonomous systems from a military efficiency and economic standpoint, unless limited by IHL.

It must be noted that past and present trends in human behavior in the battlefield regarding adherence to legal and ethical requirements are questionable at best. Unfortunately, humanity has a rather dismal record of ethical behavior in the battlefield. Potential explanations for the persistence of war crimes include5: high friendly losses leading to a tendency to seek revenge; high turnover in the chain of command leading to weakened leadership; dehumanisation of the enemy through the use of derogatory names and epithets; poorly trained or inexperienced troops; no clearly defined enemy; unclear orders where the intent of the order may be interpreted incorrectly as unlawful; youth and immaturity of troops; external pressure, e.g., a need to produce a high body count of the enemy; and pleasure from the power of killing or an overwhelming sense of frustration. There is clear room for improvement, and autonomous systems may help address some of these problems.

Robotics technology, suitably deployed, may assist with the plight of the innocent noncombatant caught in the battlefield. If used without suitable precautions, however, it could potentially exacerbate the already existing violations by human soldiers. While I have the utmost respect for our young men and women warfighters, they are placed into conditions in modern warfare under which no human being was ever designed to function. In such a context, expecting strict adherence to the Laws of War (LOW) seems unreasonable and unattainable for a significant number of soldiers6. Battlefield atrocities have been present since the beginnings of warfare, and despite the introduction of International Humanitarian Law over the last 150 years or so, these tendencies persist and are well documented7, even more so in the days of CNN and the Internet. 'Armies, armed groups, political and religious movements have been killing civilians since time immemorial.'8 'Atrocity. . . is the most repulsive aspect of war, and that which resides within man and permits him to perform these acts is the most repulsive aspect of mankind'.9 The dangers of abuse of unmanned robotic systems in war, such as the Predator and Reaper drones, are well documented; they occur even when a human operator is directly in the loop.10

Given this, questions then arise regarding if and how these new robotic systems can conform as well as, or better than, our soldiers with respect to adherence to existing IHL. If achievable, this would result in a reduction in collateral damage, i.e., noncombatant casualties and damage to civilian property, which translates into saving innocent lives. If so, this could result in a moral requirement necessitating the use of these systems. Research conducted in our laboratory11 focuses on this issue directly from a design perspective. No claim is made that our research provides a fieldable solution to the problem, far from it. Rather, these are baby steps towards achieving such a goal, including the development of a prototype proof-of-concept system tested in simulation. Indeed, there may be far better approaches than the one we currently employ, if the research community can focus on the plight of the noncombatant and how technology may possibly ameliorate the situation.

As robots are already faster, stronger, and in certain cases (e.g., Deep Blue, Watson12) smarter than humans, is it really that difficult to believe they will be able to ultimately treat us more humanely in the battlefield than we do each other, given the persistent existence of atrocious behaviors by a significant subset of human warfighters?

Why technology can lead to a reduction in casualties on the battlefield

Is there any cause for optimism that this form of technology can lead to a reduction in non-combatant deaths and casualties? I believe so, for the following reasons (a purely illustrative sketch of the first and fifth points follows this list).

– The ability to act conservatively: i.e., they do not need to protect themselves in cases of low certainty of target identification. Autonomous armed robotic vehicles do not need to have self-preservation as a foremost drive, if at all. They can be used in a self-sacrificing manner if needed and appropriate, without reservation, by a commanding officer. There is no need for a 'shoot first, ask questions later' approach; rather, a 'first-do-no-harm' strategy can be utilized instead. They can truly assume risk on behalf of the noncombatant, something that soldiers are schooled in, but which some have difficulty achieving in practice.

– The eventual development and use of a broad range of robotic sensors better equipped for battlefield observations than humans currently possess. This includes ongoing technological advances in electro-optics, synthetic aperture or wall-penetrating radars, acoustics, and seismic sensing, to name but a few. There is reason to believe that in the future robotic systems will be able to pierce the fog of war more effectively than humans ever could.

– Unmanned robotic systems can be designed without emotions that cloud their judgment or result in anger and frustration with ongoing battlefield events. In addition, 'Fear and hysteria are always latent in combat, often real, and they press us toward fearful measures and criminal behavior'13. Autonomous agents need not suffer similarly.

– Avoidance of the human psychological problem of 'scenario fulfilment' is possible. This phenomenon leads to distortion or neglect of contradictory information in stressful situations, where humans use new incoming information in ways that only fit their pre-existing belief patterns. Robots need not be vulnerable to such patterns of premature cognitive closure. Such failings are believed to have led to the downing of an Iranian airliner by the USS Vincennes in 1988.14

– Intelligent electronic systems can integrate more information from more sources far faster before responding with lethal force than a human possibly could in real-time. These data can arise from multiple remote sensors and intelligence (including human) sources, as part of the Army's network-centric warfare concept and the concurrent development of the Global Information Grid. 'Military systems (including weapons) now on the horizon will be too fast, too small, too numerous and will create an environment too complex for humans to direct'15.

– When working in a team of combined human soldiers and autonomous systems as an organic asset, they have the potential capability of independently and objectively monitoring ethical behavior in the battlefield by all parties, providing evidence and reporting infractions that might be observed. This presence alone might possibly lead to a reduction in human ethical infractions.
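To make the flavor of the first and fifth points concrete, the following minimal Python sketch fuses confidence estimates from several sensors before any decision is taken and then applies a conservative engagement gate. It is offered purely as an illustration of 'first-do-no-harm' decision logic, not as the GT-MRL architecture or any real weapon-control software; every name, field, and threshold in it is hypothetical.

```python
# Purely illustrative sketch; NOT the GT-MRL ethical governor or any
# fielded system. All names, fields, and thresholds are hypothetical.
from dataclasses import dataclass


def fuse_confidences(sensor_confidences: list[float]) -> float:
    """Combine several sensors' confidence that the candidate target is a
    lawful combatant. Assumes sensor errors are independent, purely for
    illustration: P(all sensors wrong) = product of (1 - c_i)."""
    p_all_wrong = 1.0
    for c in sensor_confidences:
        p_all_wrong *= 1.0 - c
    return 1.0 - p_all_wrong


@dataclass
class Assessment:
    """Fused evidence about one candidate engagement (hypothetical fields)."""
    target_confidence: float    # 0..1, output of fuse_confidences()
    noncombatants_nearby: bool  # any indication of protected persons
    roe_authorized: bool        # engagement authorized under the ROE
    proportionality_ok: bool    # expected harm judged proportionate


def may_engage(a: Assessment, min_confidence: float = 0.99) -> bool:
    """Conservative 'first-do-no-harm' gate: withhold lethal force unless
    EVERY obligation is satisfied. Each check below is a veto, so the
    default outcome is inaction rather than engagement."""
    if a.target_confidence < min_confidence:
        return False  # act conservatively under uncertainty
    if a.noncombatants_nearby:
        return False  # discrimination constraint
    if not a.roe_authorized:
        return False  # rules-of-engagement constraint
    if not a.proportionality_ok:
        return False  # proportionality constraint
    return True


if __name__ == "__main__":
    fused = fuse_confidences([0.90, 0.85, 0.70])  # three notional sensors
    a = Assessment(fused, noncombatants_nearby=False,
                   roe_authorized=True, proportionality_ok=True)
    print(f"fused confidence = {fused:.4f}, engage = {may_engage(a)}")
```

The point of the sketch is its failure mode: every unmet condition vetoes engagement, so where a frightened or fatigued human may shoot first, the machine's default is inaction, which it can accept because it has no self-preservation drive pressing against it.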
Addressing some of the counter-arguments

But there are many counterarguments as well. These include the challenge of establishing responsibility for war crimes involving autonomous weaponry, the potential lowering of the threshold for entry into war, the military's possible reluctance to give robots the right to refuse an order, proliferation, effects on squad cohesion, the winning of hearts and minds, cybersecurity, and mission creep.

There are good answers to these concerns, I believe, and they are discussed elsewhere in my writings16. If the baseline criterion becomes outperforming humans in the battlefield with respect to adherence to IHL (without mission performance erosion), I consider this to be ultimately attainable, especially under situational conditions where bounded morality [narrow, highly situation-specific conditions] applies17, but not soon and not easily. The full moral faculties of humans need not be reproduced to attain this standard. There are profound technological challenges to be resolved, such as effective in situ target discrimination and recognition of the status of those otherwise hors de combat, among many others. But if a warfighting robot can eventually exceed human performance with respect to IHL adherence, that then equates to a saving of noncombatant lives, and thus is a humanitarian effort. Indeed, if this is achievable, there may even exist a moral imperative for its use, due to a resulting reduction in collateral damage, similar to the moral imperative Human Rights Watch has stated with respect to precision guided munitions when used in urban settings18. This seems contradictory to their call for an outright ban on lethal autonomous robots19 before determining via research if indeed better protection for non-combatants could be afforded.

Let us not stifle research in the area or accede to the fears that Hollywood and science fiction in general foist upon us. Merely stating that these systems cannot be created to perform properly and ethically does not make it true. If that were so, we would not have supersonic aircraft, space stations, submarines, self-driving cars and the like. I see no fundamental scientific barriers to the creation of intelligent robotic systems that can outperform humans with respect to moral behavior. The use and deployment of ethical autonomous robotic systems is not a short-term goal for use in current conflict, typically counterinsurgency operations, but rather will take considerable time and effort to realize in the context of interstate warfare and situational contexts involving bounded morality.

A plea for the non-combatant

How can we meaningfully reduce human atrocities on the modern battlefield? Why is there persistent failure and perennial commission of war crimes despite efforts to eliminate them through legislation and advances in training? Can technology help solve this problem? I believe that simply being human is the weakest point in the kill chain, i.e., our biology works against us in complying with IHL. Also, the oft-repeated statement that "war is an inherently human endeavor" misses the point: atrocities are also an inherently human endeavor, and to eliminate them we perhaps need to look to other forms of intelligent autonomous decision-making in the conduct of war. Battlefield tempo is now outpacing the warfighter's ability to make sound rational decisions in the heat of combat. Nonetheless, I must make the obvious statement that peace is unequivocally preferable to warfare in all cases, so this argument only applies when human restraint fails once again, leading us back to the battlefield.

While we must not let fear and ignorance rule our decisions regarding policy towards these new weapons systems, we nonetheless must proceed cautiously and judiciously. It is true that this emerging technology can lead us into many different futures, some dystopian. It is crucially important that we not rush headlong into the design, development, and deployment of these systems without thoroughly examining their consequences on all parties: friendly forces, enemy combatants, civilians, and society in general. This can only be done through reasoned discussion of the issues associated with this new technology. Toward that end, I support the call for a moratorium to ensure that such technology meets international standards before being considered for deployment, as exemplified by the recent report from the United Nations Special Rapporteur on Extrajudicial, Summary, or Arbitrary Executions.20 In addition, the United States Department of Defense has recently issued a directive21 restricting the development and deployment of certain classes of lethal robots, which appears tantamount to a quasi-moratorium.

Is it not our responsibility as scientists and citizens to look for effective ways to reduce man's inhumanity to man through technology? Where is this more evident than in the battlefield? Research in ethical military robotics can and should be applied toward achieving this end. The advent of these systems, if done properly, could possibly yield a greater adherence to the laws of war by robotic systems than from using soldiers of flesh and blood alone. While I am not averse to the outright banning of lethal autonomous systems in the battlefield, if these systems were properly inculcated with a moral ability to adhere to the laws of war and rules of engagement, while ensuring that they are used in narrow, bounded military situations as adjuncts to human warfighters, I believe they could outperform human soldiers with respect to conformance to IHL. The end product then could be, despite the fact that these systems could not ever be expected to be perfectly ethical, a saving of noncombatant lives and property when compared to human warfighters' behaviour.
This is obviously a controversial assertion, and I have often stated that the discussion my research engenders on this subject is as important as the research itself. We must continue to examine the development and deployment of lethal autonomous systems in forums such as the United Nations and the International Committee of the Red Cross, to ensure that the internationally agreed upon standards regarding the way in which war is waged are adhered to as this technology proceeds forward. If we ignore this, we do so at our own peril.

The Way Forward?

It clearly appears that the use of lethality by autonomous systems is inevitable, perhaps unless outlawed by international law – but even then enforcement seems challenging. As stated earlier, these systems already exist: the Patriot missile system, the Phalanx system on Aegis class cruisers, anti-tank mines, and fire-and-forget loitering munitions all serve as examples. A call for a ban on these autonomous systems may have as much success as trying to ban artillery, cruise missiles, or aircraft bombing and other forms of stand-off weaponry (even the crossbow was banned by Pope Innocent II in 113922). A better strategy perhaps is to try to control their uses and deployments, which existing IHL appears at least at first glance to adequately cover, rather than to call for an outright ban, which seems unenforceable even if enacted. The horse is out of the barn. Under current IHL, these systems cannot be developed or used until they can demonstrate the capability of adequate distinction and proportionality, can be shown not to produce unnecessary suffering, and are used only given military necessity. Outside those bounds, any individuals responsible should be held accountable for violations of International Humanitarian Law, whether they are scientists, industrialists, policymakers, commanders, or soldiers. As these systems do not possess moral agency, the question of responsibility becomes equated to that for other classes of weapon systems, and a human must always ultimately bear responsibility for their use23. Only if it can be shown that existing IHL is inadequate to cover this RMA should action be taken to restructure or expand the law. This may be the case, but unfounded pathos-driven arguments based on horror and Hollywood, in the face of potential reductions of civilian casualties, seem at best counterproductive. These systems counterintuitively could make warfare safer in the long run for the innocents in the battlespace, if coupled with the use of bounded morality, narrow situational use, and careful graded introduction.

Let it be restated that I am not opposed to the removal of lethal autonomous systems from the battlefield, if international society so deems it fit, but I think that this technology can actually foster humanitarian treatment of noncombatants if done correctly. I have argued to those that call for a ban that they would be better served by a call for a moratorium, but even that is hard to envision occurring, unless these systems can be shown to be in clear violation of the LOW. It's not clear how one can bring the necessary people to the table for discussion when starting from a position for a ban derived from pure fear and pathos.

For those familiar with the Martens clause24 in IHL, a case could be made that these robotic systems potentially "violate the dictates of the public conscience". But until IHL lawyers agree on what that means, this seems a difficult course. I do believe, however, that we can aid the plight of non-combatants through the judicious deployment of these robotic systems, if done carefully and thoughtfully, particularly in those combat situations where warfighters have a greater tendency or opportunity to stray outside International Humanitarian Law. But what must be stated is that a careful examination of the use of these systems must be undertaken now to guide their development and deployment, which many of us believe is inevitable given the ever increasing tempo of the battlefield as a result of ongoing technological advances. It is unacceptable to be "one war behind" in the formulation of law and policy regarding this revolution in military affairs that is already well underway. The status quo with respect to human battlefield atrocities is unacceptable, and emerging technology in its manifold forms must be used to ameliorate the plight of the noncombatant.

Footnotes

[1] We do not use autonomy in the sense that a philosopher does, i.e., possessing free will and moral agency. Rather we use in this context a roboticist's definition: the ability to designate and engage a target without additional human intervention after having been tasked to do so.

[2] U.S. Navy, "Phalanx Close-in Weapons Systems", United States Navy Factfile, http://www.navy.mil/navydata/fact_display.asp?cid=2100&tid=487&ct=2, accessed 7/23/2013.

[3] R.C. Arkin, Governing Lethal Behavior in Autonomous Robots, Chapman-Hall, 2009.

[4] Notably Human Rights Watch, the International Committee on Robot Arms Control (ICRAC) and Article 36.

[5] Bill, B. (Ed.), Law of War Workshop Deskbook, International and Operational Law Department, Judge Advocate General's School, June 2000; Danyluk, S., "Preventing Atrocities", Marine Corps Gazette, Vol. 8, No. 4, pp. 36-38, Jun 2000; Parks, W.H., "Crimes in Hostilities. Part I", Marine Corps Gazette, August 1976; Parks, W.H., "Crimes in Hostilities. Conclusion", Marine Corps Gazette, September 1976; Slim, H., Killing Civilians: Method, Madness, and Morality in War, Columbia University Press, New York, 2008.

[6] Surgeon General's Office, Mental Health Advisory Team (MHAT) IV, Operation Iraqi Freedom 05-07, Final Report, Nov. 17, 2006.

[7] For a more detailed description of these abhorrent tendencies of humanity discussed in this context, see Arkin, R.C., "The Case for Ethical Autonomy in Unmanned Systems", Journal of Military Ethics, 9:4, pp. 332-341, 2010.

[8] Slim, H., Killing Civilians: Method, Madness, and Morality in War, Columbia University Press, New York, 2008, p. 3.

[9] Grossman, D., On Killing: The Psychological Cost of Learning to Kill in War and Society, Little, Brown and Company, Boston, 1995, p. 229.

[10] Adams, J., "US defends unmanned drone attacks after harsh UN Report", Christian Science Monitor, June 5, 2010; Filkins, D., "Operators of Drones are Faulted in Afghan Deaths", New York Times, May 29, 2010; Sullivan, R., "Drone Crew Blamed in Afghan Civilian Deaths", Associated Press, May 5, 2010.

[11] For more information see Arkin, R.C., Governing Lethal Behavior in Autonomous Robots, Chapman-Hall, 2009.

[12] http://en.wikipedia.org/wiki/Deep_Blue_(chess_computer), http://en.wikipedia.org/wiki/Watson_(computer)

[13] Walzer, M., Just and Unjust Wars, 4th ed., Basic Books, 1977.

[14] Sagan, S., "Rules of Engagement", in Avoiding War: Problems of Crisis Management (Ed. A. George), Westview Press, 1991.

[15] Adams, T., "Future Warfare and the Decline of Human Decisionmaking", Parameters, U.S. Army War College Quarterly, Winter 2001-02, pp. 57-71.

[16] E.g., Arkin, R.C., op. cit., 2009.

[17] Wallach, W. and Allen, C., Moral Machines: Teaching Robots Right from Wrong, Oxford University Press, 2010.

[18] Human Rights Watch, "International Humanitarian Law Issues in the Possible U.S. Invasion of Iraq", Lancet, Feb. 20, 2003.

[19] Human Rights Watch, "Losing Humanity: The Case Against Killer Robots", Nov. 19, 2012.

[20] Christof Heyns, Report of the Special Rapporteur on Extrajudicial, Summary, or Arbitrary Executions, United Nations Human Rights Council, 23rd Session, April 9, 2013.

[21] United States Department of Defense Directive Number 3000.09, Subject: Autonomy in Weapons Systems, November 21, 2012.

[22] Royal United Services Institute for Defence and Security Studies, "The Ethics & Legal Implications of Unmanned Vehicles for Defence and Security Purposes", Workshop webpage, held Feb. 27, 2008, http://www.rusi.org/events/ref:E47385996DA7D3, accessed 5/12/2013.

[23] Cf. Arkin, R.C., "The Robot Didn't Do It", Position Paper for the Workshop on Anticipatory Ethics, Responsibility, and Artificial Agents, Charlottesville, VA, January 2013.

[24] The clause reads "Until a more complete code of the laws of war is issued, the High Contracting Parties think it right to declare that in cases not included in the Regulations adopted by them, populations and belligerents remain under the protection and empire of the principles of international law, as they result from the usages established between civilized nations, from the laws of humanity and the requirements of the public conscience." (Available at the ICRC website, http://www.icrc.org/eng/resources/documents/misc/57jnhy.htm, last visited 30 April 2013.)

Ronald Arkin is Regents' Professor, Director of the Mobile Robot Laboratory, and Associate Dean for Research in the College of Computing at the Georgia Institute of Technology. This work was supported in part by the U.S. Army Research Office under Contract #W911NF-06-1-0252. Small portions of this essay appeared earlier in a Viewpoint article by the author in the Journal of Industrial Robots 38:5, 2011, and in a more comprehensive treatment of the subject in the author's book Governing Lethal Behavior in Autonomous Robots, Taylor-Francis, 2009, and are included with permission.
