Mitigating Cognitive Biases in Risk Identification: Practitioner Checklist for the Aerospace Sector

16 pages, PDF, 1020 KB

Mitigating Cognitive Biases in Risk Identification: Practitioner Checklist for the Aerospace Sector

Debra L. Emmons, Thomas A. Mazzuchi, Shahram Sarkani, and Curtis E. Larsen

This research contributes an operational checklist for mitigating cognitive biases in the aerospace sector risk management process. The Risk Identification and Evaluation Bias Reduction Checklist includes steps for grounding the risk identification and evaluation activities in past project experiences through historical data, and emphasizes the importance of incorporating multiple methods and perspectives to guard against optimism and a singular, project instantiation-focused view. The authors developed a survey to elicit subject matter expert judgment on the value of the checklist to support its use in government and industry as a risk management tool. The survey also provided insights on bias mitigation strategies and lessons learned. This checklist addresses the deficiency in the literature in providing operational steps for the practitioner to recognize and implement strategies for bias reduction in risk management in the aerospace sector.

DOI: https://doi.org/10.22594/dau.16-770.25.01
Keywords: Risk Management, Optimism Bias, Planning Fallacy, Cognitive Bias Reduction
Mitigating Cognitive Biases in Risk Identification  http://www.dau.mil  January 2018

This article and its accompanying research contribute an operational Risk Identification and Evaluation Bias Reduction Checklist for cognitive bias mitigation in risk management for the aerospace sector. The checklist described herein offers a practical and implementable project management framework to help reduce biases in the aerospace sector and redress the cognitive limitations in the risk identification and analysis process.
Detailed discussion focuses on the development of strategies targeted at reducing four cognitive biases and their influence on the risk identification process, and on the development and validation of the practitioner checklist.

FIGURE 1. RESEARCH APPROACH (recovered labels): literature review covering cognitive biases and bias enabling conditions (key biases: optimism, planning fallacy, anchoring, ambiguity effect), risk management, and bias mitigation approaches; initial bias reduction checklist; expert judgment survey (Likert-scale and open-ended questions) on the bias reduction checklist and strategies; checklist validation; final bias reduction checklist.

The authors' research began with a review of the literature, which covered the areas of risk management, cognitive biases, and bias enabling conditions. The biases of optimism, planning fallacy, anchoring, and ambiguity effect were deemed particularly influential to the risk identification and evaluation processes. The authors reviewed and synthesized the bias mitigation literature and developed the initial bias reduction checklist. After the development of the initial checklist, they designed and administered the survey to seek feedback on, and validation of, the checklist as a risk management tool. A Likert-scale instrument was used for the survey, which is an appropriate instrument for measuring attitudes and beliefs. The answers to the open-ended questions of the survey provided insights and lessons learned, as well as other measures that are used by practitioners to reduce biases.

The checklist presented herein is grounded in the academic literature surrounding cognitive bias mitigation and, in particular, the Nobel-prize-winning efforts of Kahneman and Tversky (1977) in reference class forecasting. The review of the literature begins with a discussion of risk management in the aerospace sector and a description of the risk identification practices and challenges. Subsequently, the nature of cognitive biases is described. These biases are persistent across industries, individual
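Reference class forecasting, the Kahneman and Tversky technique the checklist draws on, can be sketched in a few lines: instead of accepting the team's inside-view estimate, place the project in a class of similar completed projects and take the estimate from the empirical distribution of their outcomes. A minimal sketch, assuming hypothetical overrun ratios and an illustrative P80 budgeting policy; none of these numbers come from the article:

```python
# Reference class forecasting: correct an inside-view estimate using the
# empirical distribution of outcomes from similar past projects.
# All numeric data below are illustrative assumptions, not data from the article.

def percentile(sorted_vals, p):
    """Linear-interpolated percentile of a sorted list (p in [0, 100])."""
    k = (len(sorted_vals) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(sorted_vals) - 1)
    return sorted_vals[lo] + (sorted_vals[hi] - sorted_vals[lo]) * (k - lo)

def reference_class_estimate(inside_view_cost, past_overrun_ratios, confidence=80):
    """Uplift an inside-view cost by the reference class's overrun ratio
    at the chosen confidence percentile (e.g., budget at P80)."""
    ratios = sorted(past_overrun_ratios)
    return inside_view_cost * percentile(ratios, confidence)

# Hypothetical reference class: actual/estimated cost ratios of ten past projects.
overruns = [1.02, 1.05, 1.10, 1.12, 1.18, 1.25, 1.30, 1.42, 1.55, 1.90]
print(reference_class_estimate(100.0, overruns, confidence=80))
```

The outside view enters only through the `overruns` list; the more closely the reference class matches the project at hand, the better the correction for optimism.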
experts, and teams, and affect humans' ability to impartially identify and assess risks. Bias enabling conditions in the project environment are also examined, and the characteristics common to both the transportation and aerospace sectors are highlighted. Although the research is tailored to the aerospace sector, important insights from the transportation sector are also considered. Finally, the review of the literature concludes with a discussion of the approaches to reduce the cognitive biases.

The survey design, data collection, and analysis followed the academic literature guidelines for garnering attitudes and feedback on the effectiveness of the checklist as a risk management tool. Nonetheless, the authors recognize that, like any of the measurement methods in the science disciplines, the social or attitude survey method is not error-free (Fowler, 2013). The authors incorporated the feedback from both the Likert survey and the open-ended questions into the final checklist. Finally, a discussion follows on the checklist implementation and potential challenges. The research approach is highlighted in Figure 1.

Defense ARJ, January 2018, Vol. 25 No. 1: 52–93

Risk Management

Risk management includes a documented process, and both formal and informal practices applied in government programs and commercial industries alike. The Department of Defense (DoD) and the National Aeronautics and Space Administration (NASA) have established risk management processes. For example, the Department of Defense Risk, Issue, and Opportunity Management Guide for Defense Acquisition Programs represents but one of numerous DoD policy and guidance documents that focus on risk management.
Figure 2 depicts an overview of the risk and issue management process, which is an organized and iterative decision-making technique designed to improve the probability of project success (DoD, 2017). This five-step management process may be used for risks, or for issues that are nonprobabilistic in nature. This process is intended to be a proactive and continuous approach that identifies discrete risks or issues; assesses the likelihood and consequence of these risks or the consequences of the issues; develops mitigation options for all the identified risks; monitors progress to confirm that cumulative project risk is truly declining; and communicates the risk status (DoD, 2017). The DoD risk-mitigation options include acceptance (and monitoring), avoidance, transfer, and control (DoD, 2017). Similar continuous risk management processes have also been represented in the NASA guidance (NASA, 2007) and A Guide to the Project Management Body of Knowledge (Project Management Institute, 2013).

FIGURE 2. RISK AND ISSUE MANAGEMENT PROCESS OVERVIEW (recovered labels): Process Planning (What are the program's risk and issue management processes?); Identification (What has, can, or will go wrong?); Analysis (What is the likelihood of the risk and the consequence of the risk or issue?); Monitoring (How has the risk or issue changed?); with Communication and Feedback at the center.

Maytorena, Winch, Freeman, and Kiely (2007) highlight the "importance of the risk identification and analysis phases of the risk management process" (p. 315), since they can have a great influence on the correctness of the risk assessment activity. Their research suggests that the role of experience in this process is much less meaningful than it is regularly presumed to be. Alternately, "information search approach, education, and training in risk management have a significant role in risk identification performance" (p. 315). As for the role of expertise in the risk identification and analysis process, Freudenburg (1988) described the challenges amongst specialists that may lead to cognitive miscalculations in the risk estimation methods, including the failure to anticipate all the elements. Such miscalculations may then lead to mistakes and bias in the estimating. The identification or risk discovery methods and tools typically include brainstorming, personal knowledge and experience, questionnaires, lessons learned, and risk management tools such as Failure Modes and Effects Analysis, Fault Tree Analysis, and Probability Risk Analysis.

A review of the literature also indicates that project risk management practices differ across projects and are affected by project characteristics such as scope, complexity, and category (Omidvar, Jaryani, Zafarghandi, Nasab, & Jamshidi, 2011).

Risk identification is a continuing process throughout the life cycle of the project; however, it is critically important in the early conceptual design and formulation phases to ensure the appropriate risk and programmatic posture is established. A study by the Jet Propulsion Laboratory noted "significant variability in risk identification and risk reporting in the early conceptual design" (Hihn, Chattopadhyay, Hanna, Port, & Eggleston, 2010, p. 14). Some of this variability is attributable to the inherent vagueness of new system design at this early phase in the life cycle. Further exacerbating factors include the hectic concurrent engineering design team environment and the absence of an organized risk identification and ranking process that would potentially increase the level of evenness across risk-recording activities. This team concluded, "Generating risk checklists that can be used for risk identification guidance during early concept studies would enable more consistent risk reporting" (Hihn et al., 2010, p. 14).

Cognitive Biases

The issue of bias based on human mental shortcuts (also called heuristics) in subjective assessment and decision making ...
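The five-step, continuous process and the DoD handling options lend themselves to a simple risk-register sketch. A minimal illustration in Python; the 5x5 likelihood-consequence scoring, the example risks, and the field names are assumptions for illustration, not DoD guidance:

```python
# A minimal risk register echoing the five-step cycle described in the text:
# identify -> analyze (likelihood x consequence) -> handle -> monitor -> communicate.
# The 5x5 scoring scale and example entries are simplified assumptions.
from dataclasses import dataclass

HANDLING_OPTIONS = {"accept", "avoid", "transfer", "control"}  # per the DoD options

@dataclass
class Risk:
    title: str
    likelihood: int   # 1 (remote) .. 5 (near certainty)
    consequence: int  # 1 (minimal) .. 5 (severe)
    handling: str = "accept"

    def score(self) -> int:
        # Simple likelihood x consequence ranking score.
        return self.likelihood * self.consequence

register = [
    Risk("Late delivery of flight software", 4, 3, "control"),
    Risk("Launch vehicle availability slip", 2, 5, "accept"),
    Risk("Key supplier part obsolescence", 3, 2, "transfer"),
]

# Monitoring/communication step: report risks ranked by score, highest first.
for r in sorted(register, key=Risk.score, reverse=True):
    assert r.handling in HANDLING_OPTIONS
    print(f"{r.score():>2}  {r.title} [{r.handling}]")
```

Re-running the ranking as likelihood and consequence estimates are updated is one way to check that cumulative project risk is actually declining, as the process intends.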
Recommended publications
  • Ambiguity Aversion in Qualitative Contexts: A Vignette Study
    Ambiguity aversion in qualitative contexts: A vignette study. Joshua P. White, Melbourne School of Psychological Sciences, University of Melbourne; Andrew Perfors, Melbourne School of Psychological Sciences, University of Melbourne.

    Abstract: Most studies of ambiguity aversion rely on experimental paradigms involving contrived monetary bets. Thus, the extent to which ambiguity aversion is evident outside of such contexts is largely unknown, particularly in those contexts which cannot easily be reduced to numerical terms. The present work seeks to understand whether ambiguity aversion occurs in a variety of different qualitative domains, such as work, family, love, friendship, exercise, study and health. We presented participants with 24 vignettes and measured the degree to which they preferred risk to ambiguity. In a separate study we asked participants for their prior probability estimates about the likely outcomes in the ambiguous events. Ambiguity aversion was observed in the vast majority of vignettes, but at different magnitudes. It was predicted by gain/loss direction but not by the prior probability estimates (with the interesting exception of the classic Ellsberg 'urn' scenario). Our results suggest that ambiguity aversion occurs in a wide variety of qualitative contexts, but to different degrees, and may not be generally driven by unfavourable prior probability estimates of ambiguous events. Corresponding Author: Joshua P. White ([email protected])

    Introduction: The world is replete with the unknown, yet people generally prefer some types of 'unknown' to others. Here, an important distinction exists between risk and uncertainty.
As defined by Knight (1921), risk is a measurable lack of certainty that can be represented by numerical probabilities (e.g., “there is a 50% chance that it will rain tomorrow”), while ambiguity is an unmeasurable lack of certainty (e.g., “there is an unknown probability that it will rain tomorrow”).
  • A Task-Based Taxonomy of Cognitive Biases for Information Visualization
    A Task-based Taxonomy of Cognitive Biases for Information Visualization. Evanthia Dimara, Steven Franconeri, Catherine Plaisant, Anastasia Bezerianos, and Pierre Dragicevic.

    (Slide deck.) Three kinds of limitations: the computer, the display, and the human. Human vision has limitations (perceptual biases, such as magnitude estimation and color perception), and human reasoning has limitations (cognitive biases). Cognitive biases are behaviors in which humans consistently behave irrationally. Pohl's criteria, distilled: they are predictable and consistent; people are unaware they're doing them; they are not misunderstandings. Examples: Ambiguity effect, Anchoring or focalism, Anthropocentric thinking, Anthropomorphism or personification, Attentional bias, Attribute substitution, Automation bias, Availability heuristic, Availability cascade, Backfire effect, Bandwagon effect, Base rate fallacy or Base rate neglect, Belief bias, Ben Franklin effect, Berkson's paradox, Bias blind spot, Choice-supportive bias, Clustering illusion, Compassion fade, Confirmation bias, Congruence bias, Conjunction fallacy, Conservatism (belief revision), Continued influence effect, Contrast effect, Courtesy bias, Curse of knowledge, Declinism, Decoy effect, Default effect, Denomination effect, Disposition effect, Distinction bias, Dread aversion, Dunning–Kruger effect, Duration neglect, Empathy gap, End-of-history illusion, Endowment effect, Exaggerated expectation, Experimenter's or expectation bias, ...
  • Optimism Bias and Illusion of Control in Finance Professionals
    Look ma(rket), No Hands! Optimism Bias and Illusion of Control in Finance Professionals. Francesco Marcatto, Giovanni Colangelo, Donatella Ferrante. University of Trieste, Department of Life Sciences, Psychology Unit Gaetano Kanizsa, Trieste, Italy.

    Abstract: The optimism bias is the tendency to judge one's own risk as less than the risk of others. In the present study we found that also finance professionals (N = 60) displayed an optimism bias when forecasting the return of an investment made by themselves or by a colleague of the same expertise. Using a multidimensional approach to the assessment of risk perception, we found that participants' forecasts were biased not because they judged negative consequences as less likely for themselves, but because they were overconfident in their ability to avoid and control them. Keywords: optimism bias; unrealistic comparative optimism; financial risk; investment risk; perceived control; illusion of control.

    Introduction (excerpt): ...whereas extreme optimists display financial habits and behavior that are generally not considered prudent (Puri & Robinson, 2007). Moreover, analysts have been shown to be systematically overoptimistic in their forecasts about the earnings potential of firms at the time of their initial public offerings, and this optimism increases as the length of the forecast period increases (McNichols & O'Brien, 1997; Rajan & Servaes, 1997). To the best of our knowledge, however, the presence of the optimism bias/unrealistic comparative optimism in financial risk (i.e., judging personal risk of losing money as lower than other investors' risk) has never been subject of empirical testing. The main aim of this study is to investigate whether the optimism bias can be observed also in experts in the field of financial risk.
  • Behavioral Biases on Investment Decision: A Case Study in Indonesia
    Kartini KARTINI, Katiya NAHDA / Journal of Asian Finance, Economics and Business Vol 8 No 3 (2021) 1231–1240. Print ISSN: 2288-4637 / Online ISSN 2288-4645. doi:10.13106/jafeb.2021.vol8.no3.1231

    Behavioral Biases on Investment Decision: A Case Study in Indonesia. Kartini KARTINI, Katiya NAHDA. Received: November 30, 2020; Revised: February 07, 2021; Accepted: February 16, 2021.

    Abstract: A shift in perspective from standard finance to behavioral finance has taken place in the past two decades that explains how cognition and emotions are associated with financial decision making. This study aims to investigate the influence of various psychological factors on investment decision-making. The psychological factors that are investigated are differentiated into two aspects, cognitive and emotional. From the cognitive aspect, we examine the influence of anchoring, representativeness, loss aversion, overconfidence, and optimism biases on investor decisions. Meanwhile, from the emotional aspect, the influence of herding behavior on investment decisions is analyzed. A quantitative approach is used based on a survey method and snowball sampling, resulting in 165 questionnaires from individual investors in Yogyakarta. Further, we use the One-Sample t-test in testing all hypotheses. The research findings show that all of the variables (anchoring bias, representativeness bias, loss aversion bias, overconfidence bias, optimism bias, and herding behavior) have a significant effect on investment decisions. This result emphasizes the influence of behavioral factors on investors' decisions. It contributes to the existing literature on the dynamics of investor behavior and enhances the ability of investors to make more informed decisions by reducing potential biases.
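The One-Sample t-test used in the study above is straightforward to reproduce for Likert-scale data: test whether the mean response differs from a reference value such as the scale midpoint. A minimal sketch using only the standard library; the responses and the midpoint of 3 on a 5-point scale are hypothetical, not data from the paper:

```python
# One-sample t statistic for Likert responses against a scale midpoint.
# The response data below are made up for illustration.
import math
from statistics import mean, stdev

def one_sample_t(sample, mu0):
    """t = (xbar - mu0) / (s / sqrt(n)), to be compared with the
    t critical value at n - 1 degrees of freedom."""
    n = len(sample)
    return (mean(sample) - mu0) / (stdev(sample) / math.sqrt(n))

# Hypothetical 5-point Likert responses to an attitude item.
responses = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]
t = one_sample_t(responses, mu0=3.0)
print(round(t, 3))  # a positive t suggests mean agreement above the midpoint
```

Note that `stdev` is the sample standard deviation (n - 1 denominator), which is what the t statistic requires; significance would then be read from a t table at n - 1 degrees of freedom.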
  • The Optimism Bias: A Cognitive Neuroscience Perspective
    Xjenza Online - Journal of The Malta Chamber of Scientists, www.xjenza.org. Doi: http://dx.medra.org/10.7423/XJENZA.2014.1.04. Review Article.

    The Optimism Bias: A cognitive neuroscience perspective. Claude J. Bajada, Institute of Cognitive Neuroscience, University College London, WC1N 3AR, UK.

    Abstract: The optimism bias is a well-established psychological phenomenon. Its study has implications that are far reaching in fields as diverse as mental health and economic theory. With the emerging field of cognitive neuroscience and the advent of advanced neuroimaging techniques, it has been possible to investigate the neural basis of the optimism bias and to understand in which neurological conditions this natural bias fails. This review first defines the optimism bias, discusses its implications and reviews the literature that investigates its neural basis. Finally, some potential pitfalls in experimental design are discussed. Keywords: optimism bias, cognitive neuroscience, psychology, neural basis.

    The Study of the Optimism Bias: One way for scientists to test the optimism bias in the laboratory is to ask an individual to predict his chances of experiencing an event and then following up to see whether the event transpired. The problem with this approach is that the outcome of an event does not always accurately represent the person's prior chances to attain that outcome; this is especially true when the outcome is a binary one. For example, we know that an individual has an infinitesimally small chance at winning the national lottery. If Peter predicts that he has a 75% chance of winning next week's lottery and then happens to win the lottery, this does not mean that Peter was actually pessimistic in his prediction.
  • Communication Science to the Public
    David M. Berube, North Carolina State University.

    HOW WE COMMUNICATE: In The Age of American Unreason, Jacoby posited that it trickled down from the top, fueled by faux-populist politicians striving to make themselves sound approachable rather than smart (Jacoby, 2008). For example: the average length of a sound bite by a presidential candidate in 1968 was 42.3 seconds. Two decades later, it was 9.8 seconds. Today, it's just a touch over seven seconds and well on its way to being supplanted by 140/280-character Twitter bursts.

    DATA FRAMING: When asked if they truly believe what scientists tell them, only 36 percent of respondents said yes. Just 12 percent expressed strong confidence in the press to accurately report scientific findings.

    ROLE OF THE PUBLIC: A study by two Princeton University researchers, Martin Gilens and Benjamin Page, released Fall 2014, tracked 1,800 U.S. policy changes between 1981 and 2002, and compared the outcomes with the expressed preferences of median-income Americans, the affluent, business interests, and powerful lobbies. They concluded that average citizens "have little or no independent influence" on policy in the U.S., while the rich and their hired mouthpieces routinely get their way. "The majority does not rule," they wrote.

    NEW ANTI-INTELLECTUALISM TRENDS: anti-intellectualism and suspicion; Trump world (outsiders/insiders); erasing/re-writing history (damnatio memoriae); false news; infoxication and infobesity; aggregators and managed reality; affirmation and confirmation bias; negotiating reality; new tribalism (mostly ideational, not political); the unspoken (guns, birth control, sexual harassment, race). "The amount of technical information is doubling every two years."
  • Cognitive Biases in Software Engineering: A Systematic Mapping Study
    Cognitive Biases in Software Engineering: A Systematic Mapping Study. Rahul Mohanani, Iflaah Salman, Burak Turhan, Member, IEEE, Pilar Rodriguez and Paul Ralph.

    Abstract—One source of software project challenges and failures is the systematic errors introduced by human cognitive biases. Although extensively explored in cognitive psychology, investigations concerning cognitive biases have only recently gained popularity in software engineering research. This paper therefore systematically maps, aggregates and synthesizes the literature on cognitive biases in software engineering to generate a comprehensive body of knowledge, understand state-of-the-art research and provide guidelines for future research and practice. Focusing on bias antecedents, effects and mitigation techniques, we identified 65 articles (published between 1990 and 2016), which investigate 37 cognitive biases. Despite strong and increasing interest, the results reveal a scarcity of research on mitigation techniques and poor theoretical foundations in understanding and interpreting cognitive biases. Although bias-related research has generated many new insights in the software engineering community, specific bias mitigation techniques are still needed for software professionals to overcome the deleterious effects of cognitive biases on their work.

    Index Terms—Antecedents of cognitive bias, cognitive bias, debiasing, effects of cognitive bias, software engineering, systematic mapping.

    1 INTRODUCTION: Cognitive biases are systematic deviations from optimal reasoning [1], [2]. In other words, they are recurring errors in thinking, or patterns of bad judgment observable in different people and contexts. A well-known example is confirmation bias—the tendency to pay more attention to information that confirms one's existing beliefs. [...] No analogous review of SE research exists. The purpose of this study is therefore as follows: Purpose: to review, summarize and synthesize the current state of software engineering research involving cognitive biases.
  • Fundamental Attribution Error, Placebo Effect, Reactance, Optimism Bias
    Anchoring: The first thing you judge influences your judgment of all that follows. Human minds are associative in nature, so the order in which we receive information helps determine the course of our judgments and perceptions. Be especially mindful of this bias during financial negotiations such as houses, cars, and salaries; the initial price offered is proven to have a significant effect.

    Sunk cost fallacy: You irrationally cling to things that have already cost you something. When we've invested our time, money, or emotion into something, it hurts us to let it go. This aversion to pain can distort our better judgment and cause us to make unwise investments. To regain objectivity, ask yourself: had I not already invested something, would I still do so now? What would I counsel a friend to do?

    Availability heuristic: Your judgments are influenced by what springs most easily to mind. How recent, emotionally powerful, or unusual your memories are can make them seem more relevant. This, in turn, can cause you to apply them too readily. Try to gain different perspectives and relevant statistical information rather than relying purely on first judgments.

    Curse of knowledge: Once you understand something you presume it to be obvious to everyone. Things make sense once they make sense, so it can be hard to remember why they didn't. We build complex networks of understanding and forget how intricate the path to our available knowledge really is. When teaching someone something new, go slow and explain like they're ten years old (without being patronizing). Repeat key points.
  • Mitigating Cognitive Biases in Risk Identification
    https://ntrs.nasa.gov/search.jsp?R=20190025941

    Mitigating Cognitive Biases in Risk Identification: Practitioner Checklist for the Aerospace Sector. Debra Emmons, Thomas A. Mazzuchi, Shahram Sarkani, and Curtis E. Larsen.

    This research contributes an operational checklist for mitigating cognitive biases in the aerospace sector risk management process. The Risk Identification and Evaluation Bias Reduction Checklist includes steps for grounding the risk identification and evaluation activities in past project experiences through historical data, and the importance of incorporating multiple methods and perspectives to guard against optimism and a singular, project instantiation-focused view. The authors developed a survey to elicit subject matter expert (SME) judgment on the value of the checklist to support its use in government and industry as a risk management tool. The survey also provided insights on bias mitigation strategies and lessons learned. This checklist addresses the deficiency in the literature in providing operational steps for the practitioner for bias reduction in risk management in the aerospace sector.

    Two-sentence summary: Cognitive biases such as optimism, planning fallacy, anchoring and ambiguity effect influence the risk identification and analysis processes used in the aerospace sector at the Department of Defense, other government organizations, and industry. This article incorporates practical experience through subject matter expert survey feedback into an academically grounded operational checklist and offers strategies for the project manager and risk management practitioner to reduce these pervasive biases.
    Keywords: risk management; optimism bias; planning fallacy; cognitive bias reduction

    The research began with the review of the literature, which covered the areas of risk management, cognitive biases, and bias enabling conditions.
  • 50 Cognitive and Affective Biases in Medicine (Alphabetically)
    50 Cognitive and Affective Biases in Medicine (alphabetically). Pat Croskerry MD, PhD, FRCP(Edin), Critical Thinking Program, Dalhousie University.

    Aggregate bias: when physicians believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to individual patients (especially their own), they are exhibiting the aggregate fallacy. The belief that their patients are atypical or somehow exceptional may lead to errors of commission, e.g., ordering X-rays or other tests when guidelines indicate none are required.

    Ambiguity effect: there is often an irreducible uncertainty in medicine, and ambiguity is associated with uncertainty. The ambiguity effect is due to decision makers avoiding options when the probability is unknown. In considering options on a differential diagnosis, for example, this would be illustrated by a tendency to select options for which the probability of a particular outcome is known over an option for which the probability is unknown. The probability may be unknown because of lack of knowledge or because the means to obtain the probability (a specific test, or imaging) is unavailable. The cognitive miser function (choosing an option that requires less cognitive effort) may also be at play here.

    Anchoring: the tendency to perceptually lock on to salient features in the patient's initial presentation too early in the diagnostic process, and failure to adjust this initial impression in the light of later information. This bias may be severely compounded by the confirmation bias.

    Ascertainment bias: when a physician's thinking is shaped by prior expectation; stereotyping and gender bias are both good examples.

    Attentional bias: the tendency to believe there is a relationship between two variables when instances are found of both being present.
  • Fixing Biases: Principles of Cognitive De-Biasing
    Fixing biases: Principles of cognitive de-biasing. Pat Croskerry MD, PhD. Clinical Reasoning in Medical Education, National Science Learning Centre, University of York. Workshop, November 26, 2019.

    Case: A 65-year-old female presents to the ED with a complaint of shoulder sprain. She said she was gardening this morning and injured her shoulder pushing her lawn mower. At triage she has normal vital signs and is in no distress. The triage nurse notes her complaint and triages her to the fast track area. She is seen by an emergency physician who notes her complaint and examines her shoulder. He orders an X-ray. The shoulder X-ray shows narrowing of the joint and signs of osteoarthritis. He discharges her with a sling and an Rx for Arthrotec. She is brought to the ED 4 hours later following an episode of syncope, sweating, and weakness. She is diagnosed with an inferior MI.

    Biases: A 65-year-old female presents to the ED with a complaint of 'shoulder sprain'. She said she was gardening this morning and sprained her shoulder pushing her lawn mower (Framing). At triage she has normal vital signs and is in no distress. The triage nurse notes her complaint and triages her to the fast track area (Triage cueing). She is seen by an emergency physician who notes her complaint and examines her shoulder. He orders an X-ray (Ascertainment bias). The shoulder X-ray shows narrowing of the joint and signs of osteoarthritis. He explains to the patient the cause of her pain (Confirmation bias).
  • EFCOG Final Report: Improve Planning and Forecasting, and Reduce Risk, Using Behavioral Science to Mitigate Bias
    EFCOG Final Report: Improve Planning and Forecasting, and Reduce Risk, Using Behavioral Science to Mitigate Bias. Project Delivery Working Group, Risk Management Task Team, September 2018.

    Executive Summary: The Energy Facility Contractors Group (EFCOG) is a self-directed group of contractors of U.S. Department of Energy facilities. The purpose of EFCOG is to promote excellence in all aspects of operation and management of DOE facilities in a safe, environmentally sound, secure, efficient, and cost-effective manner through the ongoing exchange of information and corresponding improvement initiatives. The EFCOG Project Management Working Subgroup (PMWSG) established a Risk Management Task Team to promote, coordinate, and facilitate the active exchange of successful risk management programs, practices, procedures, lessons learned, and other pertinent information of common interest that have been effectively utilized by DOE contractors and can be adapted to enhance operational excellence and cost effectiveness for continual performance improvement by other DOE contractors. As part of the EFCOG Risk Management Task Team activities, initiatives are identified, prioritized, and planned. The planned activities are established in advance of the fiscal year start as part of an EFCOG Project Delivery Working Group (PDWG) Annual Work Plan. One such initiative is the investigation into how to eliminate "bias" in schedule development and schedule uncertainty, with the goal of identifying and defining the different types of bias that influence practitioners when estimating uncertainty ranges and risk impacts, and providing recommendations (up to and including tools to avoid bias) to enable a more reasonable calibration of risk and uncertainty, resulting in a more accurate derivation of Management Reserve and Contingency.
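One concrete way to pursue the calibration of risk and uncertainty that the EFCOG report describes is Monte Carlo schedule simulation: replace single-point task durations with elicited ranges and size contingency from a percentile of the simulated total. A minimal sketch; the task names, duration ranges, serial-path assumption, and P80 policy are illustrative assumptions, not EFCOG recommendations:

```python
# Monte Carlo schedule contingency sketch: sample each task's duration from a
# triangular (low, most likely, high) range, sum along a serial path, and size
# contingency as the gap between a chosen percentile (here P80) and the
# deterministic point estimate. All task data are illustrative assumptions.
import random

tasks = {               # (low, most likely, high) durations in days
    "design":      (20, 30, 55),
    "fabrication": (40, 50, 90),
    "integration": (15, 20, 45),
    "test":        (10, 15, 40),
}

def simulate_total(trials=20_000, seed=42):
    """Return the sorted simulated totals for the serial path."""
    rng = random.Random(seed)
    totals = []
    for _ in range(trials):
        # random.triangular takes (low, high, mode).
        totals.append(sum(rng.triangular(lo, hi, mode)
                          for lo, mode, hi in tasks.values()))
    return sorted(totals)

totals = simulate_total()
point_estimate = sum(mode for _, mode, _ in tasks.values())  # 115 days
p80 = totals[int(0.8 * len(totals))]
print(f"point estimate {point_estimate}, P80 {p80:.1f}, "
      f"contingency {p80 - point_estimate:.1f} days")
```

Because most-likely estimates are prone to the optimism and anchoring biases discussed throughout this page, the gap between the point estimate and the simulated P80 is one quantitative handle on how much contingency those biases may be hiding.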