A Technique for Human Error Analysis (ATHEANA)

Total Pages: 16

File Type: pdf, Size: 1020 KB

NUREG/CR-6350
BNL-NUREG-52467

A Technique for Human Error Analysis (ATHEANA): Technical Basis and Methodology Description

Prepared by S. E. Cooper, SAIC; A. M. Ramey-Smith, NRC; J. Wreathall, WWG; G. W. Parry, NUS; D. C. Bley, WWG; W. J. Luckas, J. H. Taylor, M. T. Barriere, BNL

Brookhaven National Laboratory; Science Applications International Corporation; Halliburton NUS Corporation; The WreathWood Group

Prepared for the U.S. Nuclear Regulatory Commission. Distribution of this document is unlimited.

AVAILABILITY NOTICE

Availability of Reference Materials Cited in NRC Publications

Most documents cited in NRC publications will be available from one of the following sources:

1. The NRC Public Document Room, 2120 L Street, NW., Lower Level, Washington, DC 20555-0001
2. The Superintendent of Documents, U.S. Government Printing Office, P.O. Box 37082, Washington, DC 20402-9328
3. The National Technical Information Service, Springfield, VA 22161-0002

Although the listing that follows represents the majority of documents cited in NRC publications, it is not intended to be exhaustive.

Referenced documents available for inspection and copying for a fee from the NRC Public Document Room include NRC correspondence and internal NRC memoranda; NRC bulletins, circulars, information notices, inspection and investigation notices; licensee event reports; vendor reports and correspondence; Commission papers; and applicant and licensee documents and correspondence.

The following documents in the NUREG series are available for purchase from the Government Printing Office: formal NRC staff and contractor reports, NRC-sponsored conference proceedings, international agreement reports, grantee reports, and NRC booklets and brochures. Also available are regulatory guides, NRC regulations in the Code of Federal Regulations, and Nuclear Regulatory Commission Issuances.

Documents available from the National Technical Information Service include NUREG-series reports and technical reports prepared by other Federal agencies and reports prepared by the Atomic Energy Commission, forerunner agency to the Nuclear Regulatory Commission.

Documents available from public and special technical libraries include all open literature items, such as books, journal articles, and transactions. Federal Register notices, Federal and State legislation, and congressional reports can usually be obtained from these libraries.

Documents such as theses, dissertations, foreign reports and translations, and non-NRC conference proceedings are available for purchase from the organization sponsoring the publication cited.

Single copies of NRC draft reports are available free, to the extent of supply, upon written request to the Office of Administration, Distribution and Mail Services Section, U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001.

Copies of industry codes and standards used in a substantive manner in the NRC regulatory process are maintained at the NRC Library, Two White Flint North, 11545 Rockville Pike, Rockville, MD 20852-2738, for use by the public. Codes and standards are usually copyrighted and may be purchased from the originating organization or, if they are American National Standards, from the American National Standards Institute, 1430 Broadway, New York, NY 10018-3308.

DISCLAIMER NOTICE

This report was prepared as an account of work sponsored by an agency of the United States Government.
Neither the United States Government nor any agency thereof, nor any of their employees, makes any warranty, expressed or implied, or assumes any legal liability or responsibility for any third party's use, or the results of such use, of any information, apparatus, product, or process disclosed in this report, or represents that its use by such third party would not infringe privately owned rights.

NUREG/CR-6350, BNL-NUREG-52467
A Technique for Human Error Analysis (ATHEANA): Technical Basis and Methodology Description
Manuscript Completed: April 1996
Date Published: May 1996

Prepared by S. E. Cooper, SAIC; A. M. Ramey-Smith, NRC; J. Wreathall, WWG; G. W. Parry, NUS; D. C. Bley, WWG; W. J. Luckas, J. H. Taylor, M. T. Barriere, BNL

Brookhaven National Laboratory, Upton, NY 11973

Subcontractors: Science Applications International Corporation, Reston, VA 22090; Halliburton NUS Corporation, Gaithersburg, MD 20878; The WreathWood Group, Dublin, OH 43016

A. M. Ramey-Smith, NRC Project Manager

Prepared for the Division of Systems Technology, Office of Nuclear Regulatory Research, U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001. NRC Job Code L2415.

ABSTRACT

Probabilistic risk assessment (PRA) has become an important tool in the nuclear power industry, both for the Nuclear Regulatory Commission (NRC) and the operating utilities. Human reliability analysis (HRA) is a critical element of PRA; however, limitations in the analysis of human actions in PRAs have long been recognized as a constraint when using PRA. A multidisciplinary HRA framework has been developed with the objective of providing a structured approach for analyzing operating experience and understanding nuclear plant safety, human error, and the underlying factors that affect them. The concepts of the framework have matured into a rudimentary working HRA method. A trial application of the method has demonstrated that it is possible to identify potentially significant human failure events from actual operating experience which are not generally included in current PRAs, as well as to identify associated performance shaping factors and plant conditions that have an observable impact on the frequency of core damage. A general process was developed, albeit in preliminary form, that addresses the iterative steps of defining human failure events and estimating their probabilities using search schemes. Additionally, a knowledge base was developed which describes the links between performance shaping factors and resulting unsafe actions.

CONTENTS

ABSTRACT iii
LIST OF FIGURES vii
LIST OF TABLES viii
EXECUTIVE SUMMARY ix
ACKNOWLEDGEMENTS xii
ACRONYMS xiii

1. INTRODUCTION 1-1
  1.1 Background 1-1
  1.2 Human-System Interactions in Real Accidents and Near Misses 1-2
  1.3 Deficiencies in Current HRA Methods 1-3
  1.4 Overview of Improved HRA Project 1-4
  1.5 Report Outline 1-5
2. PRINCIPLES UNDERLYING THE ATHEANA HRA METHOD 2-1
  2.1 Introduction 2-1
  2.2 Advances in Understanding Human Performance 2-1
    2.2.1 Human Errors Are Driven by Context 2-2
    2.2.2 Contexts Trigger Error Mechanisms 2-3
    2.2.3 Behavioral Models Connect Error Mechanisms and Post-Initiator Operator Actions 2-4
  2.3 Marriage of Behavioral Science with PRA, Engineering, and Human Factors: The Multidisciplinary HRA Framework 2-6
    2.3.1 Concepts Underlying the Framework 2-6
    2.3.2 Framework Description 2-6
    2.3.3 Definition of Framework Elements 2-7
      2.3.3.1 Error-Forcing Context 2-7
      2.3.3.2 Human Error 2-9
      2.3.3.3 PRA Model 2-10
  2.4 Confirmation of Principles from Analyses of Operational Experience 2-11
    2.4.1 Confirmation from Analyses Using the Framework 2-11
    2.4.2 Confirmation from Other Analyses 2-13
  2.5 The HRA/PRA Interface 2-14
    2.5.1 PRA Structure 2-14
    2.5.2 Definition of Post-Initiator Human Failure Events 2-16
    2.5.3 Specifications for HRA/PRA 2-17
3. GENERAL DESCRIPTION OF THE ATHEANA HRA METHOD 3-1
  3.1 Introduction 3-1
  3.2 Characteristics of the ATHEANA Method 3-1
  3.3 The ATHEANA Quantification Model 3-2
  3.4 The Tools for Applying ATHEANA 3-3
    3.4.1 The Application Process 3-3
    3.4.2 The Knowledge-Base 3-5
    3.4.3 Information Sources for Quantification 3-6
  3.5 Ongoing Development Work 3-6
4. TRIAL APPLICATIONS OF THE CONCEPTS 4-1
  4.1 Introduction 4-1
  4.2 Trial Demonstration Resources 4-1
  4.3 The Scenario: A PWR Small-Break LOCA 4-2
  4.4 Identification of an Error of Commission HFE 4-2
  4.5 Identification of an Unsafe Action 4-4
  4.6 Identification of an Error-Forcing Context 4-5
  4.7 HFE Quantification 4-6
  4.8 Summary of Trial Search Approach 4-8
  4.9 Findings 4-9
5. DEVELOPMENTS IN UNDERSTANDING THE CAUSES OF UNSAFE ACTIONS 5-1
  5.1 Introduction 5-1
  5.2 Understanding Derived from Behavioral Science 5-1
    5.2.1 Detection 5-2
    5.2.2 Situation Assessment 5-2
    5.2.3 Response Planning 5-4
    5.2.4 Response Implementation 5-5
  5.3 Understanding Derived from Analyses of Operational Events 5-6
    5.3.1 Unsafe Actions and Important Error-Forcing Context Elements 5-6
      5.3.1.1 Detection Problems 5-7
      5.3.1.2 Situation Assessment Problems 5-7
      5.3.1.3 Response Planning Problems 5-11
      5.3.1.4 Response Implementation Problems 5-13
      5.3.1.5 Generic Examples of PSFs 5-14
    5.3.2 Identifying Apparent Causes Not Normally Addressed in PRAs 5-17
  5.4 Implications of Unsafe Action Causes for HRA 5-18
6. DEVELOPMENTS IN THE PROCESS FOR APPLYING ATHEANA 6-1
  6.1 Introduction 6-1
  6.2 Defining and Identifying HFEs 6-2
  6.3 Identifying Unsafe Actions and Associated Error-Forcing Contexts 6-3
7. CONCLUSIONS 7-1
8. REFERENCES 8-1

APPENDICES:
Appendix A: Human Reliability Analysis Literature Review A-1
Appendix B: Multidisciplinary HRA Framework B-1

LIST OF FIGURES
2.1 Multidisciplinary HRA framework 2-8
3.1 ATHEANA application process flow diagram 3-4

LIST OF TABLES
4.1 Quantification Demonstration Results 4-7
5.3-1 Examples of Problems in Detection 5-8
5.3-2 Examples of Problems in Situation Assessment 5-9
5.3-3 Examples of Problems in Response Planning 5-12
5.3-4 Examples of Problems in Response Implementation 5-14
5.3-5 Examples of General Burdens on Cognitive and Physical Abilities 5-15
5.3-6 Factors Not Normally Considered in PRAs
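The contents above point to an ATHEANA quantification model (Section 3.3) built on error-forcing contexts (EFCs) and unsafe actions. As a rough sketch of that structure only — the class, field names, and numbers below are hypothetical illustrations, not taken from the report — the probability of a human failure event (HFE) can be approximated as a sum of context-weighted unsafe-action probabilities:

```python
from dataclasses import dataclass

@dataclass
class ErrorForcingContext:
    """One combination of plant conditions and PSFs (illustrative fields only)."""
    name: str
    p_context: float        # P(EFC_i): chance this context arises in the scenario
    p_unsafe_action: float  # P(UA | EFC_i): chance the unsafe action follows

def hfe_probability(contexts):
    """P(HFE) as a sum of context-weighted unsafe-action probabilities."""
    return sum(c.p_context * c.p_unsafe_action for c in contexts)

# Hypothetical numbers, not drawn from the report.
contexts = [
    ErrorForcingContext("misleading indication after instrument failure", 1e-3, 0.5),
    ErrorForcingContext("nominal conditions", 0.999, 1e-5),
]
print(f"P(HFE) = {hfe_probability(contexts):.2e}")  # ~5.1e-04
```

Under this framing, a rare but strongly error-forcing context can dominate P(HFE) even when nominal conditions are overwhelmingly more likely.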
Recommended publications
  • Application of Human Factors in Reducing Human Error in Existing Offshore Facilities
    University of Nebraska - Lincoln, DigitalCommons: United States Department of Transportation -- Publications & Papers, 2002.

    Citation: Thomas, Jeffrey; Baker, Clifford C.; Malone, Thomas B.; Malone, John T.; Hard, Christina L.; Rezende, Ivan C. L.; Caruana, Sally; and Witten, Mark, "Application of Human Factors in Reducing Human Error in Existing Offshore Facilities" (2002). United States Department of Transportation -- Publications & Papers. 34. https://digitalcommons.unl.edu/usdot/34

    Application of Human Factors Engineering in Reducing Human Error in Existing Offshore Systems. Working Group 3, 2nd International Workshop on Human Factors in Offshore Operations (HFW2002). Jeffrey Thomas, ExxonMobil, Houston, TX, USA; Clifford C. Baker, American Bureau of Shipping; Thomas B. Malone, Carlow International Incorporated; John T. Malone, Carlow Associates; Christina L. Hard, BP America Inc.; Ivan C. L. Rezende; Sally Caruana; Mark Witten.
  • Human Factors
    Chapter 14, Human Factors. Introduction: Why are human conditions, such as fatigue, complacency, and stress, so important in aviation maintenance? These conditions, along with many others, are called human factors. Human factors directly cause or contribute to many aviation accidents. It is universally agreed that 80 percent of maintenance errors involve human factors. If they are not detected, they can cause events, worker injuries, wasted time, and even accidents. [Figure 14-1] Common maintenance errors are parts installed incorrectly, missing parts, and necessary checks not being performed. In comparison to many other threats to aviation safety, the mistakes of an aviation maintenance technician (AMT) can be more difficult to detect. Often times, these mistakes are present but not visible and have the potential to remain latent, affecting the safe operation of aircraft for longer periods of time. AMTs are confronted with a set of human factors unique within aviation. Often times, they are working in the evening or early morning hours, in confined spaces, on platforms that are up high, and in a variety of adverse temperature/humidity conditions. The work can be physically strenuous, yet it also requires attention to detail. [Figure 14-2: human factors span mental state, emotional state, physical state, human capabilities, human limitations, environmental conditions, and the human-machine interface.] Because of the nature of the maintenance tasks, AMTs commonly spend more time preparing for a task than actually carrying it out. Proper documentation of all maintenance work is a key element, and AMTs typically spend as much time updating maintenance logs as they do performing the work.
  • Human Error and Accident Causation Theories, Frameworks and Analytical Techniques: an Annotated Bibliography
    Publications, 9-2000. Douglas A. Wiegmann and Aaron M. Rich, Aviation Research Lab, University of Illinois at Urbana-Champaign; Scott A. Shappell, Federal Aviation Administration, Civil Aeromedical Institute.

    Scholarly Commons Citation: Wiegmann, D. A., Rich, A. M., & Shappell, S. A. (2000). Human Error and Accident Causation Theories, Frameworks and Analytical Techniques: An Annotated Bibliography. Retrieved from https://commons.erau.edu/publication/736

    Aviation Research Lab, Institute of Aviation, University of Illinois at Urbana-Champaign, 1 Airport Road, Savoy, Illinois 61874. Technical Report ARL-00-12/FAA-00-7, September 2000. Prepared for the Federal Aviation Administration, Oklahoma City, OK. Contract DTFA 99-G-006.

    ABSTRACT: Over the last several decades, humans have played a progressively more important causal role in aviation accidents as aircraft have become more reliable. Consequently, a growing number of aviation organizations are tasking their safety personnel with developing safety programs to address the highly complex and often nebulous issue of human error.
  • The Field Guide to Human Error Investigations
    Contents: Preface vii. Part I, Human error as a cause of mishaps: 1. The Bad Apple Theory 3; 2. Reacting to Failure 15; 3. What is the Cause? 31; 4. Human Error by any Other Name 41; 5. Human Error—In the Head or in the World? 47; 6. Put Data into Context 51. Part II, Human error as symptom of trouble deeper inside the system: 7. Human Error—The New View 61; 8. Human Factors Data 67; 9. Reconstruct the Unfolding Mindset 77; 10. Patterns of Failure 101; 11. Writing Recommendations 133; 12. Learning from Failure 143; 13. Rules for in the Rubble 151. Acknowledgements 157; Subject Index 159.

    Preface: You are faced with an incident or accident that has a significant human contribution in it. What do you do? How do you make sense out of other people's controversial and puzzling assessments and actions? You basically have two options, and your choice determines the focus, questions, answers and ultimately the success of your probe, as well as the potential for progress on safety:

    • You can see human error as the cause of a mishap. In this case "human error", under whatever label—loss of situation awareness, procedural violation, regulatory shortfalls, managerial deficiencies—is the conclusion to your investigation.

    • You can see human error as the symptom of deeper trouble. In this case, human error is the starting point for your investigation. You will probe how human error is systematically connected to features of people's tools, tasks and operational/organizational environment.

    The first is called the old view of human error, while the second—itself already 50 years in the making—is the new view of human error.
  • Comparative Analysis of Nuclear Event Investigation Methods, Tools and Techniques
    COMPARATIVE ANALYSIS OF NUCLEAR EVENT INVESTIGATION METHODS, TOOLS AND TECHNIQUES. Interim Technical Report. Stanislovas Ziedelis, Marc Noel. EUR 24757 EN - 2011.

    The mission of the JRC-IE is to provide support to Community policies related to both nuclear and non-nuclear energy in order to ensure sustainable, secure and efficient energy production, distribution and use. European Commission, Joint Research Centre, Institute for Energy. Contact: Postbus 2, 1755 ZG Petten, the Netherlands; [email protected]; http://ie.jrc.ec.europa.eu/

    Legal Notice: Neither the European Commission nor any person acting on behalf of the Commission is responsible for the use which might be made of this publication. JRC 62929, EUR 24757 EN, ISBN 978-92-79-19712-3, ISSN 1018-5593, doi:10.2790/3097. Luxembourg: Publications Office of the European Union. © European Union, 2011. Reproduction is authorised provided the source is acknowledged.

    List of acronyms and definitions: AEB — Accident evolution and barrier function; ASSET — Assessment of Safety Significant Events Team
  • The Skill, Rule and Knowledge Based Classification
    Human Error: Understanding Human Behaviour and Error. David Embrey, Human Reliability Associates, 1 School House, Higher Lane, Dalton, Wigan, Lancashire, WN8 7RP.

    1. The Skill, Rule and Knowledge Based Classification. An influential classification of the different types of information processing involved in industrial tasks was developed by J. Rasmussen of the Risø Laboratory in Denmark. This scheme provides a useful framework for identifying the types of error likely to occur in different operational situations, or within different aspects of the same task where different types of information processing demands may be placed on the individual. The classification system, known as the Skill, Rule, Knowledge based (SRK) approach, is described in a number of publications, e.g. Rasmussen (1979, 1982, 1987), Reason (1990). An extensive discussion of Rasmussen's influential work in this area is contained in Goodstein et al (1988), which also contains a comprehensive bibliography. The terms skill, rule and knowledge based information processing refer to the degree of conscious control exercised by the individual over his or her activities. Figure 1 contrasts two extreme cases. In the knowledge based mode, the human carries out a task in an almost completely conscious manner. This would occur in a situation where a beginner was performing the task (e.g. a trainee process worker) or where an experienced individual was faced with a completely novel situation. In either of these cases, the worker would have to exert considerable mental effort to assess the situation, and his or her responses are likely to be slow. Also, after each control action, the worker would need to review its effect before taking further action, which would probably further slow down the responses to the situation.
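The SRK distinction above turns on how much conscious control a task demands. A minimal sketch of that idea, using invented predicates (whether the task is well practised, and whether a stored rule applies) rather than anything from Rasmussen's or Embrey's writing:

```python
from enum import Enum

class Mode(Enum):
    SKILL = "skill-based"          # highly practised, largely automatic
    RULE = "rule-based"            # familiar problem, stored if-then procedure
    KNOWLEDGE = "knowledge-based"  # novel situation, effortful conscious reasoning

def classify(practised: bool, known_procedure: bool) -> Mode:
    """Toy decision rule for which SRK mode a task likely engages."""
    if practised:
        return Mode.SKILL
    if known_procedure:
        return Mode.RULE
    return Mode.KNOWLEDGE

# A trainee facing a novel fault has neither practice nor a matching rule:
print(classify(practised=False, known_procedure=False))  # Mode.KNOWLEDGE
```

The point of the sketch is only that errors differ by mode: slow, effortful knowledge-based reasoning fails differently from fast, automatic skill-based performance.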
  • Understanding Human Errors to Improve Requirements Quality
    UNDERSTANDING HUMAN ERRORS TO IMPROVE REQUIREMENTS QUALITY. A Paper Submitted to the Graduate Faculty of the North Dakota State University of Agriculture and Applied Science by Kavitha Manjunath, in Fulfillment of the Requirements for the Degree of Master of Science. Major Program: Software Engineering. February 2018, Fargo, North Dakota.

    The Supervisory Committee certifies that this disquisition complies with North Dakota State University's regulations and meets the accepted standards for the degree of Master of Science. Supervisory Committee: Dr. Gursimran Walia (Chair), Dr. Kendall E. Nygard, Dr. Limin Zhang. Approved February 27, 2018, Dr. Kendall Nygard, Department Chair.

    ABSTRACT: Requirements engineering is the first and perhaps the most important phase of the software life cycle. Software faults committed during requirements development, if left undetected, can affect downstream activities. While previous research has developed fault-detection techniques, they are inadequate because they lack an understanding of the root causes of faults and so cannot help avoid future occurrences. Our research is aimed at helping software engineers understand the human errors (i.e., the root causes) behind faults inserted into a Software Requirements Specification (SRS). This can help software engineers become more aware and reduce the likelihood of committing human errors during requirements development. Using a retrospective error abstraction approach, this paper reports results from an industrial study in which software engineers were trained on human errors, applied human error abstraction to fifteen requirement faults supplied to them, and recommended prevention techniques for the abstracted errors.
  • Cause and Prevention of Human Error in Electric Utility Operations
    CAUSE AND PREVENTION OF HUMAN ERROR IN ELECTRIC UTILITY OPERATIONS. Terry Bilke, Midwest ISO.

    ABSTRACT: Trends in the electric utility industry have increased the chances and costs of errors by power system operators. Competition is dictating that more be done with available assets. There are fewer field personnel. Cost, rather than reliability and ease of operation, is taking priority in new power distribution equipment. Computers, which could be used to simplify work, are installed primarily to process more information and allow the operator to do more. Although this study touches on many aspects of electric operations, it focuses on errors committed by the people running the nerve center of the industry, the power system dispatchers or operators. By the nature of the job, these people are the most visible when something goes wrong. The effects of their mistakes are immediate and potentially costly. This is in contrast with engineering or management mistakes, which might never be detected even though things later go awry because of them. Even though errors are taken seriously by the industry, little has been done to formally track and reduce their occurrence. The two cases found relied on the principle that lack of concentration was the primary cause of mishaps; in other words, the dispatcher must be "fixed" if the situation is to improve. This theory is in contrast with the approaches other process and manufacturing industries take to prevent errors, defects and accidents. This study started with a survey of dispatchers from 18 utilities representing nearly 2000 years of combined operating experience. Operators identified underlying error causes and suggested improvements.
  • Rail Accident Report
    Rail Accident Report: Class investigation into accidents and near misses involving trains and track workers outside possessions. Report 07/2017, April 2017.

    This investigation was carried out in accordance with: the Railway Safety Directive 2004/49/EC; the Railways and Transport Safety Act 2003; and the Railways (Accident Investigation and Reporting) Regulations 2005.

    © Crown copyright 2017. You may re-use this document/publication (not including departmental or agency logos) free of charge in any format or medium. You must re-use it accurately and not in a misleading context. The material must be acknowledged as Crown copyright and you must give the title of the source publication. Where we have identified any third party copyright material you will need to obtain permission from the copyright holders concerned. This document/publication is also available at www.raib.gov.uk. Enquiries: RAIB, The Wharf, Stores Road, Derby, UK, DE21 4BA; [email protected]; telephone 01332 253300; fax 01332 253301; website www.gov.uk/raib. This report is published by the Rail Accident Investigation Branch, Department for Transport.

    Preface: The purpose of a Rail Accident Investigation Branch (RAIB) investigation is to improve railway safety by preventing future railway accidents or by mitigating their consequences. It is not the purpose of such an investigation to establish blame or liability. Accordingly, it is inappropriate that RAIB reports should be used to assign fault or blame, or determine liability, since neither the investigation nor the reporting process has been undertaken for that purpose.
  • Human Error, Decision Making and Fatigue
    Human Error, Decision Making and Fatigue. Prof. Chris Johnson, School of Computing Science, University of Glasgow. [email protected], http://www.dcs.gla.ac.uk/~johnson

    Introduction — individual human error: slips, lapses and mistakes; Rasmussen's skill, rules, knowledge; Reason's generic error modelling; risk homeostasis; fatigue.

    What is error? Deviation from optimal performance? (Very few achieve the optimal.) Failure to achieve a desired outcome? (The desired outcome can itself be unsafe.) Departure from an intended plan? (But the environment may change the plan.) Acknowledgement: J. Reason, Human Error, Cambridge University Press, 1990 (ISBN 0-521-31419-4).

    Types of errors (see the sketch after this entry):
    • Slips: correct plan but incorrect action; more readily observed.
    • Lapses: correct plan but an intended action is omitted; a failure of memory, so more covert.
    • Mistakes: incorrect plan; more complex, less understood.
    Human error models help to analyse these error types.

    Rasmussen: skill, rules and knowledge.
    • Skill based behaviour: sensory-motor performance; without conscious control, automated.
    • Rule based behaviour: stored procedures, induced by experience; taught problem solving/planning.
    • Knowledge based behaviour: in unfamiliar situations, explicit thinking; develop a plan, try it and see if it works.
    Acknowledgement: J. Rasmussen, Skills, Rules, and Knowledge: Signals, Signs, and Symbols, and Other Distinctions in Human Performance Models. IEEE Transactions on Systems, Man and Cybernetics, SMC-13(3):257-266, 1983.
    • Signals: sensory data from the environment; continuous variables, cf. Gibson's direct perception.
    • Signs: indicate the state of the environment; conventions for action; activate a stored pattern or action.
    • Symbols: can be formally processed; related by convention to state.
    • Skill-based errors: variability of human performance.
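The slip/lapse/mistake taxonomy above can be read as a small decision tree over whether the plan was adequate, whether execution matched it, and whether memory failed. A toy sketch with invented predicate names, not an established algorithm:

```python
def classify_error(plan_adequate: bool, executed_as_planned: bool,
                   memory_failure: bool) -> str:
    """Toy decision rule paraphrasing Reason's slip/lapse/mistake distinction."""
    if not plan_adequate:
        return "mistake"  # the plan itself was wrong, even if executed faithfully
    if memory_failure:
        return "lapse"    # right plan, but a step was forgotten (covert)
    if not executed_as_planned:
        return "slip"     # right plan, wrong execution (observable)
    return "no error"

print(classify_error(True, False, False))  # slip
```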
  • Human Factors and Medication Errors
    Human Factors and Medication Errors. Judy Smetzer, BSN, FISMP, Vice President, Institute for Safe Medication Practices ([email protected]).

    Objectives: identify the differences between the automatic mode of cognition and the problem-solving mode of cognition; list conditions that degrade human performance; review perceptual and cognitive biases and how to reduce their negative impact on patient safety; describe system design strategies that can prevent or detect human error or mitigate patient harm from errors.

    Medication error causes. Systems: system design issues (latent failures). People: human error, at-risk behavior, reckless behavior (active failures).

    Human error is a failure of a common sequence of psychological functions that are basic to human behavior: perceive (perception of all the senses), think (the way these stimuli are interpreted, the formulation of an action plan, and planning how the action should be carried out), and behave (execution of the planned actions).

    Human error probabilities:
    • Unfamiliar task performed at speed/no idea of consequences — 50%
    • Task involving high stress levels — 30%
    • Complex task requiring high comprehension/skill — 15%
    • Select ambiguously labeled control/package — 5%
    • Failure to perform a check correctly — 5%
    • Error in routine operation when care required
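Screening values like those in the table above are often combined across the steps of a task. A minimal sketch, assuming independence between steps (a strong assumption that real HRA methods relax) and using the slide's figures as a lookup table; the last row's value is truncated in the source and therefore omitted here:

```python
# Generic human error probabilities from the slide, as a lookup table.
GENERIC_HEP = {
    "unfamiliar task at speed, consequences unknown": 0.50,
    "high stress levels": 0.30,
    "complex task, high comprehension/skill": 0.15,
    "ambiguously labeled control/package": 0.05,
    "check performed incorrectly": 0.05,
}

def p_any_error(heps):
    """P(at least one error) over steps, assuming independence between steps."""
    p_all_ok = 1.0
    for p in heps:
        p_all_ok *= 1.0 - p
    return 1.0 - p_all_ok

steps = [GENERIC_HEP["high stress levels"],
         GENERIC_HEP["check performed incorrectly"]]
print(f"{p_any_error(steps):.2f}")  # ~0.34 under the independence assumption
```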
  • Cognitive Biases and Errors in Radiation Therapy
    Systematic Review: Cognitive Biases and Errors in Radiation Therapy. Rosann Keller, MEd, R.T.(T); Amy Heath, MS, R.T.(T).

    Purpose: To describe common cognitive biases that can lead to radiation therapy errors.

    Methods: A review of the literature was performed using PubMed and the Cumulative Index to Nursing and Allied Health Literature. Textbooks, popular literature, and internet resources also were evaluated for their relevance to radiation therapy.

    Results: Several common cognitive biases can lead to radiation therapy errors, including automaticity, inattentional blindness, illusion of attention, and illusion of knowledge.

    Discussion: Cognitive biases can manifest in radiation therapy in many ways. Radiation therapists should be aware of these biases so that they can employ interventions to minimize the risk of errors.

    Conclusion: Human error can result in treatment misadministration and near misses. One cause of human error is cognitive bias. Radiation therapists can take actions to reduce the risk of error caused by cognitive biases.

    Keywords: cognitive bias, radiation therapy errors, medical error, human error.

    Errors in radiation therapy treatments are relatively rare—0.18% to 0.29% per fraction, according to 1 study.1 However, errors can happen at any time during the 269 steps in the external beam radiation therapy process, and the results can be devastating.2 In 2010, the New York Times published … radiation therapy errors occur, and human error often is to blame. Research shows that human error is the cause of 70% of plane wrecks, 90% of car accidents, and 90% of errors in the workplace.4 However, understanding how the human brain works and how the brain affects the way decisions are made can minimize errors.