Cognitive Biases and Errors in Radiation Therapy


Systematic Review
Rosann Keller, MEd, R.T.(T); Amy Heath, MS, R.T.(T)
RADIATION THERAPIST, Fall 2020, Volume 29, Number 2

Purpose: To describe common cognitive biases that can lead to radiation therapy errors.

Methods: A review of the literature was performed using PubMed and the Cumulative Index to Nursing and Allied Health Literature. Textbooks, popular literature, and internet resources also were evaluated for their relevance to radiation therapy.

Results: Several common cognitive biases can lead to radiation therapy errors, including automaticity, inattentional blindness, illusion of attention, and illusion of knowledge.

Discussion: Cognitive biases can manifest in radiation therapy in many ways. Radiation therapists should be aware of these biases so that they can employ interventions to minimize the risk of errors.

Conclusion: Human error can result in treatment misadministration and near misses. One cause of human error is cognitive bias. Radiation therapists can take actions to reduce the risk of error caused by cognitive biases.

Keywords: cognitive bias, radiation therapy errors, medical error, human error

Errors in radiation therapy treatments are relatively rare—0.18% to 0.29% per fraction, according to 1 study.1 However, errors can happen at any time during the 269 steps in the external beam radiation therapy process, and the results can be devastating.2 In 2010, the New York Times published patient stories that involved errors.3 The article described how 1 patient died after he was treated with a radiation beam that was intended to be modulated, but the multileaf collimators failed to move into the correct positions. The article also described a patient who received 27 treatments to her chest wall without a wedge, which resulted in 3 times the intended dose.3 Cases of errors such as these have inspired change in the radiation oncology community, increasing the emphasis on patient safety and quality assurance. Many radiation oncology departments have adopted changes to workflow, additional quality assurance checks, and new quality assurance tools.

Despite the number of checklists and computer and machine safeguards put in place, near misses and radiation therapy errors occur, and human error often is to blame. Research shows that human error is the cause of 70% of plane wrecks, 90% of car accidents, and 90% of errors in the workplace.4 However, understanding how the human brain works and how the brain affects the way decisions are made can minimize errors.

Methods

A review of the literature was performed using PubMed and the Cumulative Index to Nursing and Allied Health Literature and the search terms cognitive bias, radiation therapy errors, medical error, and human error. The database searches yielded 2 articles relevant to radiation therapy and cognitive biases. These articles were used to focus the search to specific cognitive biases, which were identified as applicable to radiation therapy. The authors found 10 articles addressing cognitive biases and medical error. Textbooks, the American Society of Radiologic Technologists' journals, popular literature, and internet resources regarding cognitive bias and the decision-making process also were consulted. Sources were evaluated based on their influence in radiation therapy.

Error Defined

The study of error and its causes begins with defining the term. A survey of the literature found 17 different definitions of error, and another found 24 definitions and a range of opinions regarding what constitutes an error.5 The Merriam-Webster Dictionary defines error as "an act or condition of ignorant or imprudent deviation from a code of behavior" or "an act involving an unintentional deviation from truth or accuracy."6 James Reason, a leading researcher in error and safety management, defines error as a generic term that encompasses all those occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome and when these failures cannot be attributed to the intervention of some chance agency.7 The World Health Organization defines error as a "failure to carry out a planned action as intended or the application of an incorrect plan."8

A fundamental aspect of each definition of error is that the act of erring is unintentional.5 Deliberate deviation from a criterion, rule, requirement, or operating procedure is a violation, which can occur when people take shortcuts during standard operating procedures or purposefully disregard policies and protocols. Errors and violations increase the risk of a patient safety incident, an event or circumstance that could have resulted, or has resulted, in unnecessary harm to the patient.5

Medical Errors

Medical errors have been a popular topic in health professional literature since the Institute of Medicine's landmark report, published in 2000, estimated that more than 98 000 Americans die each year because of medical errors.9,10 The publication led to initiatives aimed at improving patient safety from the organizational to the federal levels.10 The increased scrutiny of medical errors also led to a more in-depth look at what constitutes a medical error and how it is reported. A 2013 study reviewing these processes found that the estimate from the Institute of Medicine (now called the Health and Medicine Division) was low; the study instead determined that the number of premature deaths from medical errors was more than 400 000 a year.11

Despite the increased awareness of medical errors and their effects on patients, they continue to occur. The radiation therapy process is complex and therefore prone to errors. Data from the Radiation Oncology Incident Learning System (RO-ILS) demonstrate that most errors (30%) occur in the treatment planning phase of the process and that most (29%) are discovered in the treatment delivery phase.12 This fact is of utmost importance to radiation therapists and their understanding of why errors occur.

Causes of Error

Humans make mistakes for many reasons, including13:
• anger issues
• distraction
• lack of attention
• personal or emotional problems
• prejudices
• timidity

Radiation therapists and other health care providers often are able to recognize a mistake but might not understand the cognitive processes that led to the error, which requires metacognition—the higher-order thinking that allows people to understand, analyze, and control their thought processes.14 Metacognition requires accepting that the brain's ability to think and reason comes with natural limitations in mental processes.15

A possible explanation for imperfections in cognitive function is the dual process theory of thought, which suggests that cognitive processes rely on 2 modes of thinking—system 1 and system 2.16 System 1 is an effortless, intuitive thinking process that recognizes patterns, makes assumptions and substitutions, and reaches conclusions and decisions quickly.17 It is automatic, requires little or no effort, and operates with no sense of voluntary control. Researchers have theorized that humans spend about 97% of their time in intuitive thinking mode.18 System 2 is an analytical thinking process that uses knowledge, weighs data, and considers evidence.17 It requires effortful mental attention to address complex problems and situations, making it a slower and more resource-intensive process.18 System 2 involves deliberate decision-making, problem-solving, and constructing thoughts in an orderly series of steps, whereas system 1 is instinctive and often based on impressions and emotions.

System 1 capabilities include the innate skills of being able to perceive the world, recognize objects, orient attention, avoid losses, and experience fear.16 Additional cognitive activities become automatic as skills are learned and practiced repeatedly. Knowledge is stored in memory and accessed without effort or intention. System 1 impressions, feelings, and knowledge often are used as the source for the measured choices of system 2. An important aspect of system 2 is that its operations require attention and are disrupted when attention is withdrawn.16

System 1 thinking is characterized by the use of heuristics, defined as a "simple procedure that helps find adequate, often imperfect, answers to difficult questions" that saves time and effort in daily decision-making but can lead to errors.16 In simple terms, heuristics refers to using past experiences to learn and improve. This […]

[…] limitations influence decision-making.24 Although people endeavor to make rational decisions, many factors limit this ability, including the24:
• characteristics of the environment
• distinct features of the cognitive system (system 1 or 2) making the decision
• type of decision or task

Purely rational decisions require a structured and systematic approach that weighs risks and benefits. Decision-making also is influenced by other factors, such as the decision maker's accuracy of perceptions, knowledge, expertise, and overall intelligence. Because people often must make decisions with limited time and information, they use heuristics, which can lead to cognitive biases.25

Human beings have cognitive biases they are unaware of and that usually are harmless. For example, a cognitive bias known as the IKEA effect exists around the Swedish home furnishings store that sells ready-to-assemble furniture.26 Research has demonstrated that […]
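The per-fraction error rates cited in the introduction (0.18% to 0.29%) can be put in perspective with a short calculation of the chance of at least one error somewhere in a multi-fraction course. The sketch below assumes errors in different fractions are independent and that a course has 30 fractions; both are illustrative assumptions for this example, not figures or claims from the article.

```python
# Illustrative only: cumulative chance of at least one error over a course,
# using the per-fraction error rates (0.18%-0.29%) quoted in the introduction.
# Assumes errors in separate fractions are independent (a simplification).

def prob_at_least_one_error(per_fraction_rate: float, fractions: int) -> float:
    """P(at least one error) = 1 - (1 - p)^n under independence."""
    return 1.0 - (1.0 - per_fraction_rate) ** fractions

for rate in (0.0018, 0.0029):
    risk = prob_at_least_one_error(rate, fractions=30)  # assumed 30-fraction course
    print(f"per-fraction rate {rate:.2%} -> per-course risk {risk:.2%}")
```

Even at these low per-fraction rates, the per-course risk lands in the range of a few percent, which is one way to read why the article stresses safeguards at every step.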
Recommended publications
  • A Task-Based Taxonomy of Cognitive Biases for Information Visualization
  • Application of Human Factors in Reducing Human Error in Existing Offshore Facilities
  • Ilidigital Master Anton 2.Indd
  • Cognitive Biases in Software Engineering: a Systematic Mapping Study
  • Infographic I.10
  • Human Factors
  • Human Error and Accident Causation Theories, Frameworks and Analytical Techniques: an Annotated Bibliography
  • The Field Guide to Human Error Investigations
  • Comparative Analysis of Nuclear Event Investigation Methods, Tools and Techniques, Presented in This Interim Report, Are Preliminary in Character
  • The Skill, Rule and Knowledge Based Classification
  • Understanding Human Errors to Improve Requirements Quality
  • Cause and Prevention of Human Error in Electric Utility Operations