How Can Project Managers Mitigate Their Cognitive Biases? Cognitive Biases in Decision-Making


MPM – Master of Project Management
May 2020
Student: Ingunn Þorvarðardóttir (080984-2949)
Supervisor: Mike Hogan
9 ECTS for the degree of MPM (Master of Project Management)

Paper presented as part of the requirements for the degree of Master of Project Management (MPM), Reykjavik University, May 2020.

ABSTRACT

The decision-making process is an essential part of the project manager's profession. Project managers engage in this process every day, making their own decisions or advising on decisions made by other parties to the project. Every decision is susceptible to judgemental errors such as cognitive biases. This paper is a literature review that examines research findings on cognitive biases in decision-making and identifies the key factors that help project managers recognise their biases and mitigate them in their decision-making. The findings of this paper are that awareness, good communication and collaboration are the key factors in mitigating cognitive biases in decision-making. The literature review indicates a gap in research on project management and the awareness of cognitive biases, with an opportunity for further research in the field of cognitive biases and project management.

1. INTRODUCTION

The decision-making process is one of the foundations of project management. Both internal and external factors can affect the process, with variable outcomes. One of the factors that can affect a project manager's judgement and thinking is cognitive bias. The motivation and interest in this subject come from its importance: project managers need self-awareness and must be able to face and accept that biases and thinking errors can occur irrespective of experience and knowledge. The decision-making process is complex, with various factors to consider before a decision is made, and cognitive biases should be one of them. This paper addresses the decision-making process and the most common cognitive biases in decision-making, along with how project managers can effectively mitigate their own cognitive biases. Furthermore, this paper aims to answer the following research question: "How can a project manager raise awareness and mitigate cognitive biases in decision-making?"

2. LITERATURE REVIEW

This section briefly describes the decision-making process, cognitive biases, and ways to mitigate the effect cognitive biases can have on the decision-making process.

2.1 Decision-making

Herbert Simon is a well-known researcher on decision-making within companies. His studies showed that decision-making is at the core of all managerial processes, and that companies adopting modern decision-making methods gain a competitive advantage in a rapidly changing business environment. Ingason and Jónasson (2016) described Simon's three stages of decision-making as follows (see Figure 1):

1. Intelligence gathering
2. Design
3. Choice

Figure 1 - Simon's three stages of decision-making. Adapted from Wikipedia, https://en.wikipedia.org/wiki/Herbert_A._Simon. Retrieved May 5th, 2020.

Simon's research on decision-making revealed the following.
Firstly, decisions should be investigated empirically instead of assuming that decisions will follow formal models such as logical or statistical models. Secondly, in decision-making, three factors should be taken into account:

a. What type is the task?
b. What are the characteristics of the environment?
c. What are the apparent characteristics of the cognitive system that controls the decision-making?

Thirdly, human behaviour in decision-making should be compared to models built from the empirical data and their predictions, and this should only be done in conjunction with the collection of the empirical data (Campitelli & Gobet, 2010).

In every project, many decisions are made. Decisions can be made by the project manager, the steering committee, the project owner or other interested parties. Most of the time, the decision is influenced by a recommendation from the project manager. The quality of a decision depends on many factors: the assumptions underlying the decision, the decision-maker's knowledge and experience, the understanding of the project goal, and the ability to perceive the decision's impact on other parts of the project, among others (Mikkelsen & Riis, 2017).

In the Individual Competence Baseline for Project, Programme & Portfolio Management, 4th Version (ICB4) by the International Project Management Association (2015), making a decision is defined as "being able to select a course of action based on several possible alternative paths" (IPMA, 2015, p. 78). It also states that decisions are made by consciously choosing among alternatives and selecting the one that best fits the matter at hand. Decisions should be based on information and data analysis, made in collaboration with others and taking their opinions into account. In some cases, a decision is made with inadequate information, or even on the project manager's intuition, with unknown consequences. Because of that, decisions often need to be reviewed or even changed in light of new information. For some project managers, it might seem natural to analyse information in a short period of time, make a decision and stand by it. However, the ability to make decisions isn't necessarily innate; decision-making is a skill that can be developed like any other skill. It builds on the ability to coordinate different points of view, analyse and evaluate the information at hand, weigh the alternatives, and finally choose the most suitable option (Ingason & Jónasson, 2016).

2.2 Cognitive biases and decision-making

In the early 1970s, Kahneman and Tversky introduced the term cognitive bias and defined it as the human inability to reason in a rational way (Bendul, 2019). Cognitive biases are mental behaviours that can affect the quality of our decision-making; they are therefore often called judgement biases (Arnott, 2006). They can also influence how individuals form their beliefs and behave in diverse environments (Chatzipanos & Giotis, 2014). In the literature, more than 100 cognitive, decision-making and memory-related biases have been documented, and further research continues to identify and delineate new biases (Ehrlinger et al., 2016). In PMI's guide, Navigating Complexity: A Practice Guide (2014), three groups of causes of complexity in projects and/or programs are addressed.
These groups of causes are:

- Human Behaviour (individual behaviour; group behaviour [organisational/social/political]; communication and control; and organisational design and development)
- System Behaviour (connectedness, dependency, and system dynamics)
- Ambiguity (uncertainty and emergence)

Because of human nature, individual behaviour is part of every project's complexity, but human behaviour is neither always rational nor deliberate. One element of this irrationality is cognitive bias (Project Management Institute, 2014). This is in harmony with Kahneman's research showing that decision-making in a complex and uncertain environment goes wrong because of mental behaviours like cognitive biases (Kahneman, 2002).

Cognitive biases are a type of thinking error that occurs when the brain tries to simplify information processing. This can result in seeing things that are nonsensical or not there, or in simply being wrong by refusing to see the facts. As noted above, when cognitive biases affect individual thinking, complexity can arise (Chatzipanos & Giotis, 2014).

Research on decision-making and cognitive bias has produced differing results as to which cognitive biases are most common in that process; it varies with the type of decision and subject. Some biases are mentioned more often than others, but all affect the decision process and the lifecycle of the project in one way or another (Arnott, 2006; Bendul, 2019; Chatzipanos & Giotis, 2014; Project Management Institute, 2014; Taylor, 2013). Cognitive biases can overlap in definition, which makes their effects more extensive. Factors that arise from psychological pathology, religious belief or social pressure are typically excluded from consideration in research on cognitive bias (Arnott, 2006).

2.3 Getting to know your cognitive biases (types of cognitive biases)

In general, cognitive biases can be divided into two categories. Information biases affect information processing, making the decision-maker take fast and uncritical decisions without paying attention to the information that truly matters. This category also includes the use of heuristics, which are mental shortcuts in decision-making; they allow the decision-maker to decide quickly, with minimal mental effort, and sometimes to bypass judgement altogether. Secondly, there are ego biases: cognitive biases driven by emotional motivation and social influences such as the need for acceptance and peer pressure (Taylor, 2013).

The following are some of the cognitive biases that affect the decision-making process:

Optimism bias

Optimism bias is the tendency to be overly optimistic: to believe oneself less likely to fail or to suffer adverse outcomes, to expect more success than others, and to underestimate, or not estimate at all, outcomes that are not in the project's favour. It also presents as overconfidence