Complacency and Bias in Human Use of Automation: An Attentional Integration
Raja Parasuraman, George Mason University, Fairfax, Virginia, and Dietrich H. Manzey, Berlin Institute of Technology, Berlin, Germany

Objective: Our aim was to review empirical studies of complacency and bias in human interaction with automated and decision support systems and to provide an integrated theoretical model for their explanation.

Background: Automation-related complacency and automation bias have typically been considered separately and independently.

Methods: Studies on complacency and automation bias were analyzed with respect to the cognitive processes involved.

Results: Automation complacency occurs under conditions of multiple-task load, when manual tasks compete with the automated task for the operator's attention. Automation complacency is found in both naive and expert participants and cannot be overcome with simple practice. Automation bias results in making both omission and commission errors when decision aids are imperfect. Automation bias occurs in both naive and expert participants, cannot be prevented by training or instructions, and can affect decision making in individuals as well as in teams. Although automation bias has been conceived of as a special case of decision bias, our analysis suggests that it also depends on attentional processes similar to those involved in automation-related complacency.

Conclusion: Complacency and automation bias represent different manifestations of overlapping automation-induced phenomena, with attention playing a central role. An integrated model of complacency and automation bias shows that they result from the dynamic interaction of personal, situational, and automation-related characteristics.

Application: The integrated model and attentional synthesis provide a heuristic framework for further research on complacency and automation bias and for design options that mitigate such effects in automated and decision support systems.

Keywords: attention, automation-related complacency, automation bias, decision making, human-computer interaction, trust

Address correspondence to Raja Parasuraman, Arch Lab, MS 3F5, George Mason University, 4400 University Drive, Fairfax, VA 22030.

HUMAN FACTORS, Vol. 52, No. 3, June 2010, pp. 381-410. DOI: 10.1177/0018720810376055. Copyright © 2010, Human Factors and Ergonomics Society.

INTRODUCTION

Human interaction with automated and decision support systems constitutes an important area of inquiry in human factors and ergonomics (Bainbridge, 1983; Lee & Seppelt, 2009; Mosier, 2002; Parasuraman, 2000; R. Parasuraman, Sheridan, & Wickens, 2000; Rasmussen, 1986; Sheridan, 2002; Wiener & Curry, 1980; Woods, 1996). Research has shown that automation does not simply supplant human activity but rather changes it, often in ways unintended and unanticipated by the designers of automation; moreover, instances of misuse and disuse of automation are common (R. Parasuraman & Riley, 1997). Thus, the benefits anticipated by designers and policy makers when implementing automation (increased efficiency, improved safety, enhanced flexibility of operations, lower operator workload, and so on) may not always be realized and can be offset by human performance costs associated with maladaptive use of poorly designed or inadequately trained-for automation.

In this article, we review research on two such human performance costs: automation-related complacency and automation bias. These have typically (although not exclusively) been considered in the context of two different paradigms of human-automation interaction, supervisory control (Sheridan & Verplank, 1978) and decision support, respectively. We discuss the cognitive processes associated with automation complacency and automation bias and provide a synthesis that sees the two phenomena as representing different manifestations of overlapping automation-induced phenomena, with attention playing a central role.

AUTOMATION COMPLACENCY

Definitions

The term complacency originated in references in the aviation community to accidents or incidents in which pilots, air traffic controllers, or other operators purportedly did not conduct sufficient checks of system state and assumed "all was well" when in fact a dangerous condition was developing that led to the accident. The National Aeronautics and Space Administration Aviation Safety Reporting System (ASRS) includes complacency as a coding item for incident reports (Billings, Lauber, Funkhouser, Lyman, & Huff, 1976). The ASRS defines complacency as "self-satisfaction that may result in non-vigilance based on an unjustified assumption of satisfactory system state" (Billings et al., 1976, p. 23). Wiener (1981) discussed the ASRS definition as well as several others, including complacency defined as "a psychological state characterized by a low index of suspicion" (p. 119). Wiener proposed that empirical research was necessary to go beyond these somewhat vague definitions so as to gain an understanding of the mechanisms of complacency and to make the concept useful in enhancing aviation safety.

Currently, there is no consensus on the definition of complacency. However, there is a core set of features common to many of the definitions, in both accident analyses and empirical human performance studies, that could be used to derive a working definition. The first is that human operator monitoring of an automated system is involved. The second is that the frequency of such monitoring is lower than some standard or optimal value (see also Moray & Inagaki, 2000). The third is that, as a result of substandard monitoring, there is some directly observable effect on system performance. The performance consequence is usually that a system malfunction, anomalous condition, or outright failure is missed (R. Parasuraman, Molloy, & Singh, 1993). Technically, the performance consequence could also involve not an omission error but an extremely delayed reaction. However, in many contexts in which there is strong time pressure to respond quickly, as in an air traffic control (ATC) conflict detection situation (Metzger & Parasuraman, 2001), a delayed response would be equivalent to a miss.

Accident and Incident Reports

Operator complacency has long been implicated as a major contributing factor in aviation accidents (Hurst & Hurst, 1982). Wiener (1981) reported that in a survey of 100 highly experienced airline captains, more than half stated that complacency was a leading factor in accidents. Initially, complacency was used to refer to inadequate pilot monitoring of any aircraft subsystem. With the advent of automation, however, first in the aviation industry and later in many other domains, the possibility arose of automation-related complacency. Consistent with this trend, in a more recent analysis of aviation accidents involving automated aircraft, Funk et al. (1999) also reported that complacency was among the top five contributing factors.

Complacency has also been cited as a contributing factor in accidents in domains other than aviation. A widely cited example is the grounding of the cruise ship Royal Majesty off the coast of Nantucket, Massachusetts (Degani, 2001; R. Parasuraman & Riley, 1997). This ship was fitted with an automatic radar plotting aid (ARPA) for navigation that was based on GPS receiver output. The GPS receiver was connected to an antenna mounted in an area with heavy foot traffic from the ship's crew. As a result, the cable from the antenna frayed, leading to a loss of the GPS signal. At this point, the ARPA system reverted to "dead reckoning" mode and did not correct for the prevailing tides and winds. Consequently, the ship was gradually steered toward a sand bank in shallow waters. The National Transportation Safety Board (1997) report on the incident cited crew overreliance on the ARPA system and complacency associated with insufficient monitoring of other sources of navigational information, such as the Loran C, radar, and visual lookout.

Automation complacency has been similarly cited in several other analyses of accidents and incidents (Casey, 1998). The dangers of complacency have been described in many commentaries and editorial columns, including those in leading scientific journals such as Science (e.g., Koshland, 1989). Perhaps as a result of reading such anecdotal reports and opinion pieces (which have a tendency to overgeneralize and draw very broad conclusions), Dekker and colleagues (Dekker & Hollnagel, 2004; Dekker & Woods, 2002) proposed that terms such as complacency and situation awareness do not have scientific credibility but rather are simply "folk models." R. Parasuraman, Sheridan, and Wickens (2008) disputed this view, pointing to the growing scientific, empirical literature on the characteristics of both constructs. Accident and incident reports clearly cannot be held to high scientific standards, but they provide a useful starting point for scientific understanding of any phenomenon. Describing the characteristics ...

... conducting real ATC operations, did not have any competing tasks, only conflict detection. R. Parasuraman et al. (1993) provided the first empirical evidence for automation complacency and for the contributing role of high task load. They had participants perform three concurrent tasks from the Multiple Task Battery (MATB): a two-dimensional compensatory tracking