
Automation and Accountability in Decision Support System Interface Design

Mary L. Cummings

Abstract

When the human element is introduced into decision support system design, entirely new layers of social and ethical issues emerge but are not always recognized as such. This paper discusses those ethical and social impact issues specific to decision support systems and highlights areas that interface designers should consider during design, with an emphasis on military applications. Because of the inherent complexity of sociotechnical systems, decision support systems are particularly vulnerable to certain potential ethical pitfalls that encompass automation and accountability issues. If computer systems diminish a user's sense of moral agency and responsibility, an erosion of accountability could result. In addition, these problems are exacerbated when an interface is perceived as a legitimate authority. I argue that when developing human-computer interfaces for decision support systems that have the ability to harm people, the possibility exists that a moral buffer, a form of psychological distancing, is created which allows people to ethically distance themselves from their actions.

Introduction

Understanding the impact of ethical and social dimensions in design is a topic that is receiving increasing attention both in academia and in practice. Designers of decision support systems (DSSs) embedded in computer interfaces have a number of additional ethical responsibilities beyond those of designers who only interact with the mechanical or physical world. When the human element is introduced into decision and control processes, entirely new layers of social and ethical issues (to include moral responsibility) emerge but are not always recognized as such. Ethical and social impact issues can arise during all phases of design, and identifying and addressing these issues as early as possible can help the designer both to analyze the domain more comprehensively and to suggest specific design guidance. This paper discusses those accountability issues specific to DSSs that result from introducing automation and highlights areas that interface designers should take into consideration.

If a DSS is faulty or fails to take into account a critical social impact factor, the results will be expensive not only in terms of later redesigns and lost productivity, but possibly also in the loss of life. Unfortunately, history is replete with examples of how failures to adequately understand decision support problems inherent in complex sociotechnical domains can lead to catastrophe. For example, in 1988 the USS Vincennes, a U.S. Navy warship, accidentally shot down an Iranian commercial passenger airliner due to a poorly designed weapons control computer interface, killing all aboard. The accident investigation revealed nothing wrong with the system software or hardware; rather, the accident was caused by an inadequate and overly complex display of information to the controllers (van den Hoven, 1994). Specifically, one of the primary factors leading to the decision to shoot down the airliner was the controllers' perception that the airliner was descending toward the ship, when in fact it was climbing away from it. The display tracking the airliner was poorly designed and did not include the rate of target altitude change, which required controllers to "compare data taken at different times and make the calculation in their heads, on scratch pads, or on a calculator – and all this during combat" (Lerner, 1989).
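The quantity the display omitted is a simple derived value. The sketch below is a minimal Python illustration with hypothetical field names, sample values, and a made-up dead-band threshold (it does not reflect the Aegis system's actual track data format); it shows the altitude-trend computation an interface could have performed and labeled for the operator instead of leaving it to mental arithmetic under combat stress:

```python
from dataclasses import dataclass

@dataclass
class TrackReport:
    """One timestamped radar return for a tracked aircraft (illustrative fields only)."""
    time_s: float       # seconds since track start
    altitude_ft: float  # reported altitude in feet

def altitude_rate_fpm(previous: TrackReport, current: TrackReport) -> float:
    """Rate of altitude change in feet per minute between two reports."""
    dt_min = (current.time_s - previous.time_s) / 60.0
    if dt_min <= 0:
        raise ValueError("reports must be time-ordered")
    return (current.altitude_ft - previous.altitude_ft) / dt_min

def trend_label(rate_fpm: float, threshold_fpm: float = 100.0) -> str:
    """Label the trend for display; the dead-band threshold is an assumed value."""
    if rate_fpm > threshold_fpm:
        return "CLIMBING"
    if rate_fpm < -threshold_fpm:
        return "DESCENDING"
    return "LEVEL"

# Example: two reports ten seconds apart show the aircraft gaining altitude,
# the very cue the Vincennes controllers had to infer by hand.
earlier, later = TrackReport(0.0, 7_500.0), TrackReport(10.0, 8_200.0)
rate = altitude_rate_fpm(earlier, later)
print(f"{rate:+.0f} ft/min -> {trend_label(rate)}")  # +4200 ft/min -> CLIMBING
```

Presenting the derived trend directly, rather than a series of raw altitude readouts, is exactly the kind of seemingly small design decision whose ethical weight the rest of this paper examines.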
This lack of understanding of the need for human-centered interface design was repeated by the military in the 2003 war with Iraq, when the U.S. Army's Patriot missile system engaged in fratricide, shooting down a British Tornado and an American F/A-18 and killing three pilots. The displays were confusing and often incorrect, and the operators, who were given only ten seconds to veto a computer solution, were admittedly lacking training in a highly complex management-by-exception system (32nd Army Air and Missile Defense Command, 2003). In both the USS Vincennes and Patriot missile cases, interface designers could say that usability was the core problem, but the problem is much deeper and more complex. While poor design decisions manifested as severe usability issues in these cases, there are underlying issues concerning responsibility, accountability, and social impact that deserve further analysis.

Beyond simply examining usability issues, there are many facets of decision support system design that have significant social and ethical implications, although these can often be subtle. The interaction between cognitive limitations, system capabilities, and ethical and social impact cannot be easily quantified using formulas and mathematical models. Often what may seem to be a straightforward design decision can carry with it ethical implications that go unnoticed. One such design consideration is the degree of automation used in a decision support system. While the introduction of automation may seem to be a purely technical issue, it has tremendous social and ethical implications that may not be fully understood in the design process, and it is critical that interface designers recognize that the inclusion of degrees of automation is not merely a technical choice.

Automation in decision support systems

In general, automation does not replace the need for humans; rather, it changes the nature of their work (Parasuraman & Riley, 1997). One of the primary design dilemmas engineers and designers face is determining what level of automation should be introduced into a system that requires human intervention. For rigid tasks that require no flexibility in decision-making and that have a low probability of system failure, full automation often provides the best solution (Endsley & Kaber, 1999). However, in systems that deal with decision-making in dynamic environments with many external and changing constraints, higher levels of automation are not advisable because of the risks and the inability of an automated decision aid to be perfectly reliable (Sarter & Schroeder, 2001).

Various levels of automation can be introduced in decision support systems, from fully automated, where the operator is completely left out of the decision process, to minimal levels of automation, where the automation only presents the relevant data. The application of automation in decision support systems is effective when decisions can be accurately and quickly reached based on a correct and comprehensive algorithm that considers all known constraints. However, the inability of automation models to account for all potential conditions or relevant factors results in brittle decision algorithms, which can make erroneous or misleading suggestions (Guerlain et al., 1996; Smith, McCoy, & Layton, 1997). The unpredictability of future situations and unanticipated responses from both systems and human operators, what Parasuraman et al. (2000) term the "noisiness" of the world, makes it impossible for any automation algorithm to always provide the correct response. In addition, as in the USS Vincennes and Patriot missile examples, automated solutions and recommendations can be confusing or misleading, causing operators to make suboptimal decisions, which in the case of a weapons control interface can be lethal.
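To make the notion of automation levels concrete, the sketch below outlines a simplified spectrum of decision-aid behaviors, from presenting data only to acting autonomously unless vetoed within a time window, as in the Patriot's management-by-exception mode. It is an illustrative abstraction loosely patterned on the level-of-automation taxonomies cited above (e.g., Endsley & Kaber, 1999; Parasuraman et al., 2000), not a description of any fielded system; the level names, function, and ten-second default are assumptions for illustration.

```python
from enum import IntEnum
from typing import Callable, Optional

class AutomationLevel(IntEnum):
    """A simplified spectrum of decision-aid autonomy (illustrative, not a standard)."""
    PRESENT_DATA = 1      # automation only organizes data; the human decides
    RECOMMEND = 2         # automation proposes an action; the human must approve it
    VETO_WINDOW = 3       # management by exception: acts unless vetoed in time
    FULLY_AUTONOMOUS = 4  # automation decides and acts; the human only monitors

def decision_cycle(level: AutomationLevel,
                   recommendation: str,
                   ask_operator: Callable[[str], Optional[str]],
                   veto_timeout_s: float = 10.0) -> str:
    """Return the action taken in one decision cycle at the given automation level."""
    if level == AutomationLevel.PRESENT_DATA:
        return "display data only; operator formulates the decision"
    if level == AutomationLevel.RECOMMEND:
        reply = ask_operator(f"Approve '{recommendation}'?")
        return recommendation if reply == "approve" else "no action"
    if level == AutomationLevel.VETO_WINDOW:
        # In a fielded system this prompt would expire after veto_timeout_s seconds;
        # here, any reply other than an explicit veto lets the action proceed.
        reply = ask_operator(f"Veto '{recommendation}' within {veto_timeout_s:.0f} s?")
        return "no action" if reply == "veto" else recommendation
    return recommendation  # FULLY_AUTONOMOUS: executed with no operator involvement

# An overloaded operator who never answers (returns None) implicitly authorizes
# the engagement at the veto-window level, which is where accountability blurs.
print(decision_cycle(AutomationLevel.VETO_WINDOW, "engage track 4471",
                     ask_operator=lambda prompt: None))
```

The point of the sketch is that the choice of level, and of how much time a veto window allows, is a design decision with direct moral weight rather than a neutral technical parameter.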
In addition to problems with automation brittleness, significant research has shown that there are many drawbacks to higher levels of automation that relegate the operator to a primarily monitoring role. Parasuraman (2000) contends that over-automation causes skill degradation, reduced situational awareness, unbalanced workload, and an over-reliance on automation.

There have been many incidents in other domains, such as nuclear power plants and medical device applications, where confusing automation representations have led to lethal consequences. For example, in perhaps one of the most well-known engineering accidents in the United States, the 1979 cooling malfunction in one of the Three Mile Island nuclear reactors, problems with information representation in the control room and human cognitive limitations were primary contributors to the accident. The automation of system components and their representation on the instrument panels were overly complex and overwhelmed the controllers with information that was difficult to synthesize, misleading, and confusing (NRC, 2004).

The medical domain is replete with examples of problematic interfaces and ethical dilemmas. For example, in the Therac-25 cases that occurred between 1985 and 1987, it was discovered too late for several patients that the human-computer interface for the Therac-25, a machine designed for cancer radiation therapy, was poorly designed. It was possible for a technician to enter erroneous data, correct it on the display so that the data appeared accurate, and then unknowingly begin radiation treatments with lethal levels of radiation. Other than an ambiguous "Malfunction 54" error code, there was no indication that the machine was delivering fatal doses of radiation (Leveson & Turner, 1995).
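The failure mode described here is a display that accepts a correction without guaranteeing the underlying machine state has changed with it. The toy Python sketch below illustrates that general hazard class only; it is deliberately simplified, the names and behavior are hypothetical, and it is not the Therac-25's actual software design.

```python
from dataclasses import dataclass

@dataclass
class TreatmentConsole:
    """Toy model of the display/machine-state desynchronization hazard."""
    displayed_mode: str = "x-ray"
    machine_mode: str = "x-ray"   # what the hardware is actually configured for

    def latch_setup(self) -> None:
        """Hardware commits to the currently displayed parameters."""
        self.machine_mode = self.displayed_mode

    def edit_on_display(self, new_mode: str) -> None:
        """Operator 'corrects' an entry; only the display changes.

        Hazard: after latching, the edit is never propagated back to the machine.
        """
        self.displayed_mode = new_mode

    def start_treatment(self) -> str:
        # The safeguard such an interface needs: refuse to run on stale machine state.
        if self.displayed_mode != self.machine_mode:
            return "INTERLOCK: displayed parameters do not match machine setup"
        return f"treating in {self.machine_mode} mode"

console = TreatmentConsole()
console.latch_setup()                 # machine configured from the initial entry
console.edit_on_display("electron")   # the screen now shows the 'corrected' value
print(console.start_treatment())      # safe only because the consistency check exists
```

The design point is that the consistency check belongs in the interface logic itself, not in the operator's vigilance.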
Many researchers assert that keeping the operator engaged in decisions supported by automation, otherwise known as the human-centered approach to the application of automation, will help to prevent confusion and erroneous decisions that could cause potentially fatal problems (Billings, 1997; Parasuraman, Masalonis, & Hancock, 2000; Parasuraman & Riley, 1997). However, human-centered designs are vulnerable to automation bias, the tendency of operators to disregard or not search for contradictory information once a computer-generated solution is accepted as correct (Mosier & Skitka, 1996; Parasuraman & Riley, 1997). Automation bias is particularly problematic when intelligent decision support is needed in large problem spaces under time pressure, as in command and control domains such as emergency path planning and resource allocation (Cummings, 2004). Moreover, automated decision aids designed to reduce human error can themselves introduce new kinds of error when their recommendations are trusted uncritically.