The New Weapons of Mass Destruction? Building a Lethal Autonomous Weapon Is Easier Than Building a Self-Driving Car
The Security Times, Security Briefs, February 2018

A new treaty is necessary

BY STUART RUSSELL

Beginning in 2014, the High Contracting Parties of the Convention on Certain Conventional Weapons (CCW) have held meetings at the United Nations in Geneva to discuss possible limitations on the development and deployment of lethal autonomous weapons systems (AWS). In November 2017, the CCW convened a more formal Group of Governmental Experts (GGE), chaired by India’s Ambassador to the UN Amandeep Singh Gill, with a mandate to “assess questions related to emerging technologies in the area of lethal autonomous weapons systems.” This article reflects views shared by a great many in the artificial intelligence community. These views were expressed in an open letter on July 28, 2015, signed by over 3,700 AI researchers, and in a letter to the Obama administration written on April 4, 2016, by 41 leading American AI researchers, including almost all of the living presidents of AAAI, the main professional society for artificial intelligence. The British AI community sent a similar letter to then Prime Minister David Cameron.

The UN defines autonomous weapons as having the capacity to “locate, select and eliminate human targets without human intervention.” Some have proposed alternative definitions – for example, the UK Ministry of Defence says that autonomous weapons systems must “understand higher-level intent and direction” and “are not yet in existence and are not likely to be for many years, if at all.”

Much of the discussion at the UN has been stymied by claims that autonomy is a mysterious, indefinable property. In the view of the AI community, the notion of autonomy is […]

All of the component technologies – flight control, swarming, navigation, indoor and outdoor exploration and mapping, obstacle avoidance, detecting and tracking humans, tactical planning, coordinated attack – have been demonstrated. Building a lethal autonomous weapon, perhaps in the form of a multi-rotor micro-unmanned aerial vehicle, is easier than building a self-driving car, since the latter is held to a far higher performance standard and must operate without error in a vast range of complex situations. This is not “science fiction.” Autonomous weapons do not have to be humanoid, conscious and evil. And the capabilities are not “decades away,” as claimed by some countries.

UN Special Rapporteur Christof Heyns, Human Rights Watch, the International Committee of the Red Cross and other experts have expressed concerns about the ability of autonomous weapons to comply with provisions of international humanitarian law regarding military necessity, proportionality and discrimination between combatants and civilians. Discrimination is probably feasible in most situations, even if not perfectly accurate. However, determining proportionality and necessity is most likely not feasible for current AI systems and would have to be established in advance with reasonable certainty by a human operator for all attacks the weapons may undertake during a mission. This requirement would therefore limit the scope of missions that could legally be initiated.

Another important component of international humanitarian law is the Martens Clause, according to which “the human person remains under the protection of the principles of humanity and the dictates of public conscience.” In this regard, Germany has stated that it “will not accept that the decision over life and death is […]”

Compliance with international humanitarian law, even if achievable, is not sufficient to justify proceeding with an arms race involving lethal autonomous weapons. President Obama: “I recognize that the potential development of lethal autonomous weapons raises questions that compliance with existing legal norms – if that can be achieved – may not by itself resolve, and that we will need to grapple with more fundamental moral questions about whether and to what extent computer algorithms should be able to take a human life.”

One of the “fundamental moral questions” is the effect of autonomous weapons systems on the security of member states and their peoples. On this matter, the message of the AI community, as expressed in the letters mentioned above, has been clear: Because they do not require individual human supervision, autonomous weapons are potentially scalable weapons of mass destruction; an essentially unlimited number of such weapons can be launched by a small number of people. This is an inescapable logical consequence of autonomy. As a result, we expect that autonomous weapons will reduce human security at the individual, local, national and international levels.

It is estimated, for example, that roughly one million lethal weapons can be carried in a single container truck or cargo aircraft, perhaps with only 2 or 3 human operators rather than 2 or 3 million. Such weapons would be able to hunt for and eliminate humans in towns and cities, even inside buildings. They would be cheap […]

The considerations of the preceding paragraph apply principally to weapons designed for ground warfare and anti-personnel operations, and are less relevant for naval and aerial combat. It is still the case, however, that “to entrust a significant portion of a nation’s defense capability in any sphere to autonomous systems is to court instability and risk strategic […]”

STUART RUSSELL is a computer science professor and the Smith-Zadeh Professor in Engineering at the University of California, Berkeley.

The moral obligation of using AI to reduce atrocities

BY RONALD ARKIN

Let me unequivocally state: The status quo with respect to innocent civilian casualties is utterly and wholly unacceptable. I am not in favor of Lethal Autonomous Weapon Systems (LAWS) nor of lethal weapons of any sort. I would hope that LAWS would never need to be used, as I am against killing in all its manifold forms. But if humanity persists in entering into warfare, which is an unfortunate underlying assumption, we must protect the innocent noncombatants in the battlespace far better than we currently do. Technology can and should be used toward that end. Is it not our responsibility as scientists to look for effective ways to reduce man’s inhumanity to its fellow man through technology? Research in ethical military robotics can and should be applied toward achieving this goal.

I have studied ethology – animal behavior in their natural environment – as a basis for robotics for my entire career, ranging from frogs, insects, dogs, birds, wolves and human companions. Nowhere has it been more depressing than to study human behavior in the battlefield. The commonplace […]

[…] pathos and hype while focusing on the real technical, legal, ethical and moral implications. Numerous factors point to autonomous robots soon being able to outperform humans on the battlefield from an ethical perspective:
• They are able to act conservatively, as they do not need to protect themselves in cases of low certainty of target identification.
• The eventual development and use of a broad range of sensors will render robots better equipped than humans for battlefield observations.
• They can be designed without emotions that would otherwise cloud their judgment or result in anger and frustration with ongoing battlefield events.
• They avoid the human psychological problem of “scenario fulfillment,” which contributed to the downing of an Iranian airliner by the USS Vincennes in 1988.
• They can integrate more information […]

Smart autonomous weapons systems may enhance the survival of noncombatants. Human Rights Watch considers the use of precision-guided munitions in urban settings to be a moral imperative. In effect, there may be mobile precision-guided munitions that result in a similar moral imperative for their use. Such weapons have the possibility of deciding when to fire and – more importantly – when not to fire. They should be designed with overrides to ensure meaningful human control. Moreover, they can employ fundamentally different tactics while assuming far more risk than human warfighters in terms of protecting noncombatants and assessing hostility and hostile intent. In essence, these systems can more effectively operate on a philosophy of “First do no harm” rather than “Shoot first and ask questions later.”

Building such systems is not a short-term goal, but rather part of a medium- to long-term agenda addressing many challenging research questions. However, exploiting bounded morality within a narrow mission context helps to achieve better performance with respect to preserving noncombatant life, and thus warrants robust research on humanitarian grounds.