Automation and Accountability in Decision Support System Interface Design

Mary L. Cummings

Abstract

When the human element is introduced into decision support system design, entirely new layers of social and ethical issues emerge but are not always recognized as such. This paper discusses those ethical and social impact issues specific to decision support systems and highlights areas that interface designers should consider during design, with an emphasis on military applications. Because of the inherent complexity of socio-technical systems, decision support systems are particularly vulnerable to certain potential ethical pitfalls that encompass automation and accountability issues. If computer systems diminish a user's sense of moral agency and responsibility, an erosion of accountability could result. In addition, these problems are exacerbated when an interface is perceived as a legitimate authority. I argue that when developing human computer interfaces for decision support systems that have the ability to harm people, the possibility exists that a moral buffer, a form of psychological distancing, is created which allows people to ethically distance themselves from their actions.

Introduction

Understanding the impact of ethical and social dimensions in design is a topic that is receiving increasing attention both in academia and in practice. Designers of decision support systems (DSS's) embedded in computer interfaces have a number of additional ethical responsibilities beyond those of designers who develop systems that interact only with the mechanical or physical world. When the human element is introduced into decision and control processes, entirely new layers of social and ethical issues (to include moral responsibility) emerge but are not always recognized as such. Ethical and social impact issues can arise during all phases of design, and identifying and addressing these issues as early as possible can help the designer both to analyze the domain more comprehensively and to highlight areas that interface designers should take into consideration. This paper discusses those accountability issues specific to DSS's and suggests specific design guidance.

If a DSS is faulty or fails to take into account a critical social impact factor, the results will not only be expensive in terms of later redesigns and lost productivity, but possibly also the loss of life. Unfortunately, history is replete with examples of how failures to adequately understand decision support problems inherent in complex sociotechnical domains can lead to catastrophe. For example, in 1988, the USS Vincennes, a U.S. Navy warship, accidentally shot down a commercial passenger Iranian airliner due to a poorly designed weapons control computer interface, killing all aboard. The accident investigation revealed that nothing was wrong with the system software or hardware, but that the accident was caused by the complex display of information to the controllers (van den Hoven, 1994). Specifically, one of the primary factors leading to the decision to shoot down the airliner was the controllers' perception that the airliner was descending towards the ship, when in fact it was climbing away from it. The display tracking the airliner was poorly designed and did not include the rate of target altitude change, which required controllers to "compare data taken at different times and make the calculation in their heads, on scratch pads, or on a calculator – and all this during combat" (Lerner, 1989).

This lack of understanding of the need for human-centered interface design was repeated by the military in the 2004 war in Iraq, when the U.S. Army's Patriot missile system engaged in fratricide, shooting down a British Tornado and an American F/A-18, killing three pilots. The displays were confusing and often incorrect, and the operators, who admittedly were lacking training in a highly complex management-by-exception system, were given only ten seconds to veto a computer solution (32nd Army Air and Missile Defense Command, 2003). In both the USS Vincennes and Patriot missile cases, interface designers could say that usability was the core problem, but the problem is much deeper and more complex. The usability issues in these cases were the manifestation of poor design decisions; underlying them are issues concerning responsibility, accountability, and social impact that deserve further analysis.

Automation in Decision Support Systems

Beyond simply examining usability issues, there are many facets of decision support system design that have significant social and ethical implications, although often these can be subtle. The interaction between cognitive limitations, system capabilities, and ethical and social impact cannot be easily quantified using formulas and mathematical models. Often what may seem to be a straightforward design decision can carry with it ethical implications that may go unnoticed. One such design consideration is the degree of automation used in a decision support system. While the introduction of automation may seemingly be a technical issue, it is indeed one that has tremendous social and ethical implications that may not be fully understood in the design process. It is critical that interface designers realize that the inclusion of degrees of automation is not merely a technical issue, but one that also contains social and ethical implications.

In general, automation does not replace the need for humans; rather it changes the nature of the work of humans (Parasuraman & Riley, 1997). One of the primary design dilemmas engineers and designers face is determining what level of automation should be introduced into a system that requires human intervention. For rigid tasks that require no flexibility in decision-making and with a low probability of system failure, full automation often provides the best solution (Endsley & Kaber, 1999). However, in systems like those that deal with decision-making in dynamic environments with many external and changing constraints, higher levels of automation are not advisable because of the risks and the inability of an automated decision aid to be perfectly reliable (Sarter & Schroeder, 2001).

Various levels of automation can be introduced in decision support systems, from fully automated, where the operator is completely left out of the decision process, to minimal levels of automation, where the automation only presents the relevant data. The application of automation for decision support systems is effective when decisions can be accurately and quickly reached based on a correct and comprehensive algorithm that considers all known constraints. However, the inability of automation models to account for all potential conditions or relevant factors results in brittle decision algorithms, which can make erroneous or misleading suggestions (Guerlain et al., 1996; Smith, McCoy, & Layton, 1997). The unpredictability of future situations and unanticipated responses from both systems and human operators, what Parasuraman et al. (2000) term the "noisiness" of the world, makes it impossible for any automation algorithm to always provide the correct response. In addition, as in the USS Vincennes and Patriot missile examples, automated solutions and recommendations can be confusing or misleading, causing operators to make suboptimal decisions, which in the case of a weapons control interface can be lethal.

In addition to the problems of automation brittleness, significant research has shown that there are many drawbacks to higher levels of automation that relegate the operator to a primarily monitoring role. Parasuraman (2000) contends that over-automation causes skill degradation, reduced situation awareness, unbalanced workload, and an over-reliance on automation.
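Before turning to cases where these design choices went wrong, the notion of a graded automation scale can be made concrete. The sketch below is mine, not the author's: it paraphrases a simplified subset of the ten-level automation scale described by Parasuraman, Sheridan, and Wickens (2000), and the dispatch function, names, and values are illustrative assumptions only. Note how level 6, management by exception, turns operator inaction into consent, the situation the Patriot operators faced with a ten-second veto window.

```python
from enum import IntEnum

class LOA(IntEnum):
    """Simplified levels of automation, paraphrased from the ten-point
    scale described by Parasuraman, Sheridan, and Wickens (2000)."""
    MANUAL = 1        # computer offers no assistance; the human does everything
    ADVISE = 4        # computer suggests one alternative (classic decision support)
    CONSENT = 5       # computer executes the suggestion only with explicit approval
    VETO_WINDOW = 6   # computer executes unless vetoed within a time limit
                      # (management by exception, as in the Patriot case)
    AUTONOMOUS = 10   # computer decides and acts, ignoring the human

def resolve(loa: LOA, human_approved: bool = False, human_vetoed: bool = False) -> str:
    """Toy dispatch showing who holds final authority at each level."""
    if loa <= LOA.ADVISE:
        return "display only; human decides and acts"
    if loa == LOA.CONSENT:
        return "execute" if human_approved else "hold"
    if loa == LOA.VETO_WINDOW:
        # Ethically loaded asymmetry: the operator's inaction becomes consent.
        return "hold" if human_vetoed else "execute"
    return "execute"  # autonomous: the human is out of the loop entirely

# At level 6, a missed veto window results in execution by default.
print(resolve(LOA.VETO_WINDOW))  # -> "execute"
```

The design question the paper raises is visible in the branching itself: above the consent level, the default outcome no longer requires a positive human act.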
There have been many incidents in other domains, such as nuclear power plants and medical device applications, where confusing automation representations have led to lethal consequences. For example, in perhaps one of the most well-known engineering accidents in the United States, the 1979 cooling malfunction of one of the Three Mile Island nuclear reactors, problems with information representation in the control room and human cognitive limitations were primary contributors to the accident. Automation of system components and their subsequent representation on the instrument panels were overly complex and overwhelmed the controllers with information that was difficult to synthesize, misleading, and confusing (NRC, 2004).

The medical domain is replete with examples of problematic interfaces and ethical dilemmas. For example, in the Therac-25 cases that occurred between 1985 and 1987, it was discovered too late for several patients that the human-computer interface for the Therac-25, a machine designed for cancer radiation therapy, was poorly designed. It was possible for a technician to enter erroneous data, correct it on the display so that the data appeared accurate, and then unknowingly begin radiation treatments with lethal levels of radiation. Other than an ambiguous "Malfunction 54" error code, there was no indication that the machine was delivering fatal doses of radiation (Leveson & Turner, 1995).

Automation and Accountability

While it is well established that the use of automation in human computer interfaces should be fully investigated from a design standpoint, there are also ethical considerations, especially for interfaces that impact human life such as weapon and medical interfaces. What might seem to be the most effective level of automation from a design viewpoint may not be the most ethical.

Many researchers assert that keeping the operator engaged in decisions supported by automation, otherwise known as the human-centered approach to the application of automation, will help to prevent confusion and erroneous decisions that could cause potentially fatal problems (Billings, 1997; Parasuraman, Masalonis, & Hancock, 2000; Parasuraman & Riley, 1997). Reducing automation levels can cause higher workloads for operators; however, the reduction can keep operators cognitively engaged and actively a part of the decision-making process, which promotes critical function as well as situation awareness and performance (Endsley, 1997). Higher workloads can be seen as a less-than-optimal design approach, but efficiency should not necessarily be the primary consideration when designing a DSS. Keen and Scott-Morton (1978) assert that using a computer aid to improve the effectiveness of decision making is more important than improving its efficiency. Automation can indeed make a system highly efficient but ineffective, especially if the knowledge needed for a correct decision is not available in a predetermined algorithm. Thus higher levels of automation are not always the best selection for an effective DSS.

While the focus on the impact of automation on the user's actions is a critical design consideration, another important point is how automation can impact a user's sense of responsibility and accountability. In one of the few references in the technical literature on humans and automation that considers the relationship between automation and moral responsibility, Sheridan (1996) is wary of individuals "blissfully trusting the technology and abandoning responsibility for one's own actions." Overly trusting automation in complex system operation is a well-recognized decision support problem. Known as automation bias, it is the human tendency to disregard or not search for contradictory information in light of a computer-generated solution that is accepted as correct (Mosier & Skitka, 1996; Parasuraman & Riley, 1997). Automation bias is particularly problematic when intelligent decision support is needed in large problem spaces under time pressure, as in command and control domains such as emergency path planning and resource allocation (Cummings, 2004). Moreover, automated decision aids designed to reduce human error can actually cause new errors in the operation of a system. In an experiment in which subjects were required to both monitor low fidelity gauges and participate in a tracking task, 39 out of 40 subjects committed errors of commission, i.e., these subjects almost always followed incorrect automated directives or recommendations, despite the fact that contraindications existed and verification was possible (Skitka et al., 1999). Automation bias is an important consideration from a design perspective, but as will be demonstrated in the next section, it is also one that has ethical implications as well.

Sheridan (1983) maintains that even in the information-processing role, "individuals using the system may feel that the machine is in complete control, disclaiming personal accountability for any error or performance degradation." Automated decision support tools are designed to improve decision effectiveness and reduce human error, but they can cause operators to relinquish a sense of responsibility, and subsequently accountability, because of a perception that the automation is in charge. While automation bias can be addressed through training intervention techniques (Ahlstrom & Longo, 2003; however, see Skitka et al., 1999 for conflicting evidence), the degradation of accountability and the abandonment of responsibility when using automated computer interfaces are much more difficult and ambiguous questions to address.

Some research on social accountability, defined as people having to explain and justify their social judgments about others, bears on this problem. Increasing social accountability reduces the primacy effect, i.e., the tendency to best remember the salient cues that are seen first (Tetlock, 1983), which is akin to automation bias. In theory, accountability motivates subjects to employ more self-critical and cognitively complex decision-making strategies (Tetlock & Boettger, 1989). However, previous studies on social accountability focused on human judgments about other humans and did not incorporate technology, specifically automation, so they are somewhat limited in their application to the discussion of computers and accountability.
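The finding discussed next, that requiring people to justify their strategies curbs automation bias, suggests one hypothetical way a DSS could build accountability in: refuse to act on an automated recommendation until an identified individual records a brief justification, logged against that person rather than an anonymous crew. The sketch below is illustrative only and not drawn from the paper; all names and fields are assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Decision:
    operator_id: str      # one named operator, not an anonymous crew,
                          # to counter diffusion of responsibility
    recommendation: str
    justification: str
    timestamp: str

def accept(operator_id: str, recommendation: str,
           justification: str, audit_log: list) -> Decision:
    """Act on an automated recommendation only after an identified
    operator records a short justification for accepting it."""
    if not justification.strip():
        raise ValueError("a justification is required before acceptance")
    decision = Decision(operator_id, recommendation, justification,
                        datetime.now(timezone.utc).isoformat())
    audit_log.append(decision)  # persistent, per-individual audit trail
    return decision

log: list = []
accept("op-042", "reroute ambulance via Route 9",
       "Route 3 closed per dispatch at 14:05", log)
```

Even a one-line justification forces the operator to engage with the recommendation rather than rubber-stamp it, which is the mechanism the accountability research points to.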

Skitka, Mosier, and Burdick (2000) attempted to bridge the gap in researching accountability from a purely social perspective to one that included technology in the form of automation. The specific intent of this study was to determine the effects of social accountability on automation bias. Instead of being held accountable for their judgments about other people, subjects were required to justify their strategies and outcomes in computerized flight simulation trials. The results showed that not only did increased social accountability lead to fewer instances of automation bias through decreased errors of omission and commission, but it also improved overall task performance (Skitka, Mosier, & Burdick, 2000).

If increased accountability can reduce the effects of automation bias, how then could decision support systems be designed to promote accountability? For complex socio-technical systems, accountability will most likely come from an established organizational structure and policies put in place by higher-level management. However, one tangible design consideration for accountability would be the number of people required to interact with a given decision support system. Research indicates that responsibility for tasks is diffused when people work in collective groups as opposed to working alone, a concept known as "social loafing" (see Karau & Williams, 1993 for a review). By designing systems that require the fewest individuals in a decision-making component, it is possible that erosion in accountability through social loafing could be diminished. However, while research indicates that people experience degraded task responsibility through collective action, the potential loss of a sense of moral responsibility and agency for operators interacting collectively through human-computer interfaces is not as clearly understood. It is likely that the computer interface becomes another entity in the collective group, so that responsibility, and hence accountability, can be cognitively offloaded not only to the group but also to the computer. This is one area in human-computer interaction and accountability research that deserves significantly more attention.

Designing a Moral Buffer

Because of the diminishment of accountability that can result from interactions with computers and automation, I argue that when developing a human computer interface for any system that has the ability to harm people, such as interfaces for weapons and medical systems, the possibility exists that a moral buffer, a form of distancing and compartmentalization, is created which allows people to morally and ethically distance themselves from their actions. The concept of moral buffering is related to, but not the same as, Bandura's (2002) idea of moral disengagement, in which people disengage moral self-censure in order to engage in reprehensible conduct. A moral buffer adds an additional layer of ambiguity and possible diminishment of accountability and responsibility through an artifact or process, such as a computer interface or automated recommendations. Moral buffers can be the conduits for moral disengagement, which is precisely the reason for the need to examine ethical issues in interface design.

A key element in the development of a moral buffer is the sense of distance and remoteness that computer interfaces create for their users. This sense of distance can best be illustrated through a military weapons interface example, although, as will be demonstrated, moral buffers can occur in other domains. The military is currently developing smart weapons such as cruise missiles and unmanned combat aerial vehicles (UCAVs), which, once launched, can be redirected in-flight to a target of opportunity in a matter of minutes. While these weapons will provide the military with unprecedented rapid battlefield response, they also have the potential to become moral buffers that allow humans to kill without adequately considering the consequences. In general, these types of weapons can be fired from remote distances; for example, the military recently used missiles in Iraq that can be fired from over 1,000 miles from their intended target with pinpoint accuracy. While this distance is effective in protecting our own forces, it is also likely that increasing the distance from the battlefield diminishes a sense of accountability.

The desire to kill the enemy from afar, termed "distant punishment," is deeply rooted in military culture, and even the term "distant punishment" is a euphemistic form of moral buffering. Military historian and psychologist Dave Grossman contends that military personnel have a deep-seated desire to avoid personal confrontation, and thus use distant punishment as a way to exert military will without having to face the consequences of combat (Grossman, 1998). Grossman depicts the level of resistance to firing a weapon as a function of proximity to the enemy in Figure 1. In addition, he reports that there have been virtually no instances of noncompliance in firing weapons from removed distances, while there are significant instances of refusal to fire for soldiers engaged in hand-to-hand combat (Grossman, 2000).

Figure 1. Resistance to Killing as a Function of Distance (Grossman, 1995). [The figure plots resistance to killing, high at close quarters and low at a distance, against physical distance from the target: sexual range, hand-to-hand combat range, knife range, bayonet range, hand-grenade range, close range (pistol/rifle), mid-range (rifle), long range (sniper, anti-armor missiles, etc.), and max range (bomber, artillery).]

In addition to the actual physical distance, Grossman (1995) contends that emotional distance is a significant contributor as well, the primary distancing element hypothesized to make killing another human more palatable. Grossman contends that emotional distance in the context of killing can be obtained through social factors that cause one to view a particular group or class of people as less than human, factors which include cultural elements such as racial and ethnic differences as well as a sense of moral superiority. Emotional distancing in many domains is necessary for job performance, such as in police work, the medical community, and the military in general. However, there is a distinct difference between developing emotional distance for self or team preservation and developing emotional distance through technology that makes it easier for people to kill. The form of emotional distancing hypothesized by Grossman that should be of concern to interface designers is that of mechanical distancing. In this form of emotional distancing, some technological device provides the remote distance that makes it easier to kill. These devices can be TV and video screens, thermal sights, or some other mechanical apparatus that provides a psychological buffer, an element that Grossman terms "Nintendo® warfare" (Grossman, 1995). With the recent advancements in smart weapons that are controlled through computer interfaces resembling popular video games, the physical and emotional distancing that occur with remotely launching and controlling modern weapons provide an even greater sense of detachment than ever seen previously in warfare.

The famous Milgram studies of the early 1960s help to illustrate how remoteness from the consequences of one's actions can drastically alter human behavior. In these studies, the focal point of the research was to determine how "obedient" subjects would be to requests from someone they considered to be a legitimate authority. Under the impression that the real purpose of the study was to examine learning and memory, subjects, as the "teachers," were told to administer increasing levels of electric shocks to another person, the "learner," who was actually a confederate participant, when this person made mistakes on a memory test.

While many different types of experimental conditions were examined, the one most pertinent to this discussion of moral buffers is the difference in subject behavior that depended on whether or not the teacher could see the learner. When the learner was in sight, 70% of the subjects refused to administer the shocks, as opposed to only 35% who resisted when the learner was located in a remote place, completely out of contact with the teacher (Milgram, 1975).

Milgram (1975) hypothesized that the increase in resistance to shocking another human when that human was in sight could be attributed to several factors. One important factor is the idea of empathetic cues. When people administer potentially painful stimuli to other humans in a remote location, they are only aware in a conceptual sense that suffering could result. Milgram had this to say about the lack of empathetic cues in military weapons delivery: "The bombardier can reasonably suppose that his weapons will inflict suffering and death, yet this knowledge is divested of affect and does not arouse in him an emotional response to the suffering he causes" (Milgram, 1975).
Milgram proposed that several other factors account for the distance/obedience effect, including a narrowing of the cognitive field for subjects, which is essentially the "out of sight, out of mind" phenomenon. All of these factors are clearly present in the use of a weapons delivery computer interface, especially one that controls weapons from over 1,000 miles away.

In addition to the physical and emotional distance, the sense of remoteness, and the detachment from negative consequences that interfaces can provide, it is also possible that, without consciously recognizing it, people assign moral agency to the computer, despite the fact that it is an inanimate object, which adds to the moral buffering effect. The human tendency to anthropomorphize computers has been well established (Reeves & Nass, 1996). Furthermore, it has been established that automated decision support systems with "low observability" can cause humans to view the automated system as an independent agent capable of willful action (Sarter & Woods, 1994). Low observability occurs in a complex system with high levels of automation authority (automation acts without human intervention) but little feedback for the human operator (Sarter & Woods, 1994). Examples of this can be found in commercial airline cockpits, where pilots will ask questions about flight management automation such as "What is it doing?" and "Why did it do that?" (Sarter & Woods, 1994). Viewing automation as an independent agent is also known as "perceived animacy" (Sarter & Woods, 1994).

In a research study designed to determine subject views about computer agency and moral responsibility, twenty-nine male computer science undergraduate students were interviewed concerning their views of computer agency and moral responsibility in the delegation of decision making to the computer. Results suggested that these educated individuals with significant computer experience do hold computers at least partially responsible for computer error (Friedman & Millet, 1997). It follows then that if computer systems can diminish users' senses of moral agency and responsibility, this would lead to an erosion of accountability (Friedman & Kahn, 1997). In automated supervisory systems, human users can be isolated in a compartmentalized subsystem and detached from the overall system mission. This disengagement can cause them to have little understanding of the larger purpose or meaning of their individual actions. Because of this diminished sense of agency, when errors occur, computers can be seen as the culprits. When this diminished sense of agency occurs, "individuals may consider themselves to be largely unaccountable for the consequences of their computer use" (Friedman & Kahn, 1997).

An example of how a computer decision support tool can become a moral buffer between the human and the computer is that of the Acute Physiology and Chronic Health Evaluation (APACHE) system. The APACHE system is a quantitative tool used in hospitals to determine the stage of an illness at which treatment would be futile. While it could be seen as a decision support tool to provide a recommendation as to when a person should be removed from life support systems, it is generally viewed as a highly predictive prognostic system for groups, not individuals (Helft, Siegler, & Lantos, 2000). The APACHE system could provide a moral buffer through allowing medical personnel to distance themselves from a very difficult decision ("I didn't make the decision to turn off the life support systems, the computer did"). By allowing the APACHE system the authority to make a life and death decision, the moral burden could be seen as shifting from the human to the computer.

The designers of this system recommend that APACHE only be used as a consultation tool to aid in the decision of removing life support and should not be a "closed loop" system (Friedman & Kahn, 1997). The ethical difficulty arises when technologies like APACHE become entrenched in the culture. Since the system has consistently made accurate recommendations, the propensity for automation bias and over-reliance could allow medical personnel, who are already overwhelmed in the workplace, to increasingly rely upon this technology to make tough decisions. When systems like the APACHE system are deemed to be a legitimate authority for these types of decisions, the system could in effect become a closed-loop system, which was not its original intent. Instead of guidance, the automated recommendations could become a heuristic, a rule-of-thumb, which becomes the default condition, and hence a moral buffer.
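The drift from consultation to closed loop can be made concrete in code. The sketch below is purely illustrative and hypothetical; it is not APACHE's actual model, interface, or thresholds. The "as designed" version treats the advisory as one input among many and always returns the human's decision, while the "drifted" version treats the recommendation as the default condition, so that silence executes it.

```python
from typing import Optional

def advise(score: float, threshold: float = 0.95) -> str:
    """Consultation-only advisory. The score and threshold are
    illustrative placeholders, not APACHE's actual model or values."""
    return "treatment likely futile" if score >= threshold else "continue treatment"

def open_loop(score: float, clinician_decision: str) -> str:
    """As the designers intended: the advisory is displayed as one
    input among many; the output is always the human's decision."""
    _ = advise(score)  # shown to the clinician, nothing more
    return clinician_decision

def drifted_closed_loop(score: float, objection: Optional[str] = None) -> str:
    """The drift warned about above: the recommendation has become
    the default condition, so silence (no objection) executes it."""
    advice = advise(score)
    return objection if objection else advice

# Identical inputs, very different moral structure:
print(open_loop(0.97, clinician_decision="continue treatment"))  # a human decided
print(drifted_closed_loop(0.97))  # nobody decided; the default acted
```

The two functions can return the same string for the same inputs; what differs is whether a positive human act stands between the recommendation and the outcome, which is exactly the property a moral buffer erodes.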
The same psychological phenomenon that creates possible moral buffers in the use of computer interfaces for medical advice may apply to decision support systems for weapons delivery, and indeed to any computerized system that can inflict harm upon people. Acting through a seemingly innocuous apparatus like a computer, making potentially fatal decisions such as directing weapons through the click of a mouse, can create a moral buffer and allow people to perceive themselves as not responsible for whatever consequences result. It could be argued that those people who actually control in-flight weapons are only following the orders of superiors, and thus the actual operators are not responsible for their actions. In older military systems, a commander would make a weapons-firing decision and then, for example, order an underling to push the button that actually launched a weapon. Unfortunately, command and control technology have outpaced both human reasoning capabilities and traditional command structures. In smart weapons control of the future, weapons will no longer be controlled by junior enlisted personnel with little training. Smart weapons control will require complex problem solving and critical analysis in real-time, which will be accomplished by highly educated and trained personnel who have the ability to both approve and disapprove of a weapons launch (such as pilots who have the authority to not drop a bomb if the situation warrants). It is precisely this group of decision makers who will be most affected by a moral buffer.

An example of how a particular design element could contribute to a moral buffer in the use of computer interfaces can be seen in Figure 2, a screenshot of an actual military missile planning computer program based on Microsoft's Excel® software package, which aids a military planner in planning an "optimal" mission (LoPresto, 2002). The user of this interface is likely to be a mid-career officer who is well educated and has the authority to choose between both resources and targets. The task of mission planning carries with it great responsibility; millions of dollars in weapons, immeasurable hours in personnel, and the scheduling of ships, planes, and troops are at the disposal of the planner. With users (the planners) bearing such serious responsibility, it is curious that the interface designers chose to represent the help feature using a happy, cute, and nonaggressive dog. A help feature is no doubt a useful tool for successful mission accomplishment, but adding such a cheerful, almost funny graphic could aid in the creation of a moral buffer by providing a greater sense of detachment in planning death through such an innocuous medium. It could be argued, in fact, that this kind of interface is desirable so as not to add to the already high stress of the mission planner; however, making the task seem more "fun" and less distasteful is not the way to reduce user stress.

Figure 2. Military Planning Tool. [Screenshot of a missile mission planning program built on Microsoft's Excel®; the help feature is represented by a cartoon dog.]

Interface designers should be cognizant of the buffering effect when designing interfaces that require a very quick human decision, and be careful when adding elements such as the happy dog in Figure 2 that make the interface more like a leisure video game than an interface that will be responsible for lost lives. If computers are seen as the moral agents (i.e., "I only followed the recommendations of the automation"), military commanders may be tempted to use remotely operated weapons in real-time retargeting scenarios without the careful deliberation that occurred with older versions of weapons, which required months of advance planning and which, once launched, could not be redirected. Likewise, the same elements apply for users of any computer interface that affects human life, such as medical devices and emergency response resources. A weapons control interface, even with the most elegant and thoughtful user design, may become a moral buffer, allowing users, who will be decision makers with authority and not subordinates "just following orders," to distance themselves from the lethality of their decisions.

Conclusion

Because of the inherent complexity of socio-technical systems, decision support interface systems that integrate higher levels of automation can possibly allow users to perceive the computer as a legitimate authority, diminish moral agency, and shift accountability to the computer, thus creating a moral buffering effect. This effect can be particularly exacerbated by the large physical and organizational distancing that occurs with remote operation of devices such as weapons. For interface designs that require significant human cognitive contribution, especially in decision support arenas that directly impact human life such as weapons and medical systems, it is paramount that designers understand their unique roles and responsibilities in the design process. The need for careful reflection on ethical issues should be a concern for the development of decision support systems for weapons; however, all domains in which computers have the potential to impact human life deserve the same level of ethical and social impact analysis.

Acknowledgments

I would like to thank Dr. Deborah Johnson, the Anne Shirley Carter Olsson Professor of Applied Ethics at the University of Virginia, for her insights, framing suggestions, and most of all patience.

Mary L. Cummings is an assistant professor in the Department of Aeronautics and Astronautics at the Massachusetts Institute of Technology.

References

32nd Army Air and Missile Defense Command. (2003). "Patriot Missile Defense Operations during Operation Iraqi Freedom." Washington, DC: U.S. Army.
Ahlstrom, V., & Longo, K. (2003). The Human Factors Design Standard. Washington, DC: FAA.
Bandura, A. (2002). "Selective Moral Disengagement in the Exercise of Moral Agency." Journal of Moral Education, 31(2), 101-119.
Billings, C. E. (1997). Aviation Automation: The Search for a Human-Centered Approach. Hillsdale, NJ: Lawrence Erlbaum Associates.
Cummings, M. L. (2004). "Automation Bias in Intelligent Time Critical Decision Support Systems." Paper presented at the AIAA Intelligent Systems Conference.
Endsley, M. (1997, June). "Situation Awareness, Automation, and Free Flight." Paper presented at the FAA/Eurocontrol Air Traffic Management R&D Seminar, Saclay, France.
Endsley, M. R., & Kaber, D. B. (1999). "Level of automation effects on performance, situation awareness and workload in a dynamic control task." Ergonomics, 42(3), 462-492.
Friedman, B., & Kahn, P. H. (1997). "Human Agency and Responsible Computing: Implications for Computer System Design." In B. Friedman (Ed.), Human Values and the Design of Computer Technology (pp. 221-235). Stanford, CA: CSLI Publications.
Friedman, B., & Millet, L. I. (1997). "Reasoning About Computers As Moral Agents: A Research Note." In B. Friedman (Ed.), Human Values and the Design of Computer Technology (p. 205). Stanford, CA: CSLI Publications.
Grossman, D. (1995). On Killing. Boston: Little, Brown & Co.
Grossman, D. (1998). "The Morality of Bombing: Psychological Responses to 'Distant Punishment.'" Paper presented at the Center for Strategic and International Studies, Dueling Doctrines and the New American Way of War Symposium, Washington, DC.
Grossman, D. (2000). "Evolution of Weaponry." In Encyclopedia of Violence, Peace, and Conflict. Academic Press.
Guerlain, S., Smith, P., Obradovich, J., Rudmann, S., Strohm, P., Smith, J., & Svirbely, J. (1996). "Dealing with brittleness in the design of expert systems for immunohematology." Immunohematology, 12, 101-107.
Helft, P. R., Siegler, M., & Lantos, J. (2000). "The Rise and Fall of the Futility Movement." New England Journal of Medicine, 343(4), 293-296.
Karau, S. J., & Williams, K. D. (1993). "Social loafing: A meta-analytic review and theoretical integration." Journal of Personality and Social Psychology, 65(4), 681-706.
Keen, P. G. W., & Scott-Morton, M. S. (1978). Decision Support Systems: An Organizational Perspective. Reading, MA: Addison-Wesley.
Lerner, E. J. (1989, April). "Lessons of Flight 655." Aerospace America, 18-26.
Leveson, N. G., & Turner, C. S. (1995). "An Investigation of the Therac-25 Accidents." In H. Nissenbaum (Ed.), Computers, Ethics & Social Values (pp. 474-514). Upper Saddle River, NJ: Prentice Hall.
LoPresto, L. M. (2002, February 14). "PC-based Mission Distribution System (PC-MDS)." Paper presented at the Cruise Missile Seminar, Norfolk, VA.
Milgram, S. (1975). Obedience to Authority. New York: Harper and Row.
Mosier, K. L., & Skitka, L. J. (1996). "Human Decision Makers and Automated Decision Aids: Made for Each Other?" In R. Parasuraman & M. Mouloua (Eds.), Automation and Human Performance: Theory and Applications (pp. 201-220). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
NRC Office of Public Affairs. (2004). "The Accident at Three Mile Island." Washington, DC: United States Nuclear Regulatory Commission.
Parasuraman, R. (2000). "Designing automation for human use: Empirical studies and quantitative models." Ergonomics, 43(7), 931-951.
Parasuraman, R., Masalonis, A. J., & Hancock, P. A. (2000). "Fuzzy signal detection theory: Basic postulates and formulas for analyzing human and machine performance." Human Factors, 42(4), 636-659.
Parasuraman, R., & Riley, V. (1997). "Humans and Automation: Use, Misuse, Disuse, Abuse." Human Factors, 39(2), 230-253.
Parasuraman, R., Sheridan, T. B., & Wickens, C. D. (2000). "A Model for Types and Levels of Human Interaction with Automation." IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans, 30(3), 286-297.
Reeves, B., & Nass, C. (1996). The Media Equation: How People Treat Computers, Television and New Media Like Real People and Places. Stanford, CA: CSLI Publications.
Sarter, N. B., & Schroeder, B. (2001). "Supporting decision making and action selection under time pressure and uncertainty: The case of in-flight icing." Human Factors, 43, 573-583.
Sarter, N. B., & Woods, D. D. (1994, April). "Decomposing Automation: Autonomy, Authority, Observability and Perceived Animacy." Paper presented at the First Automation Technology and Human Performance Conference.
Sheridan, T. B. (1996). "Speculations on Future Relations Between Humans and Automation." In M. Mouloua (Ed.), Automation and Human Performance (pp. 449-460). Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Sheridan, T. B., Vamos, T., & Aida, S. (1983). "Adapting Automation to Man, Culture and Society." Automatica, 19(6), 605-612.
Skitka, L. J., Mosier, K. L., & Burdick, M. D. (1999). "Does automation bias decision-making?" International Journal of Human-Computer Studies, 51(5), 991-1006.
Skitka, L. J., Mosier, K. L., & Burdick, M. D. (2000). "Accountability and automation bias." International Journal of Human-Computer Studies, 52, 701-717.
Smith, P., McCoy, E., & Layton, C. (1997). "Brittleness in the design of cooperative problem-solving systems: The effects on user performance." IEEE Transactions on Systems, Man, and Cybernetics-Part A: Systems and Humans, 27(3), 360-371.
Tetlock, P. E. (1983). "Accountability and the perseverance of first impressions." Social Psychology Quarterly, 46(4), 285-292.
Tetlock, P. E., & Boettger, R. (1989). "Accountability: A Social Magnifier of the Dilution Effect." Journal of Personality and Social Psychology, 57(3), 388-398.
van den Hoven, M. J. (1994). "Towards ethical principles for designing politico-administrative information systems." Informatization and the Public Sector, 3, 353-373.