Subjective Moral Biases & Fallacies

Total Pages: 16

File Type: pdf, Size: 1020 KB

SUBJECTIVE MORAL BIASES & FALLACIES: DEVELOPING SCIENTIFICALLY & PRACTICALLY ADEQUATE MORAL ANALOGUES OF COGNITIVE HEURISTICS & BIASES

Mark Herman

A Dissertation Submitted to the Graduate College of Bowling Green State University in partial fulfillment of the requirements for the degree of DOCTOR OF PHILOSOPHY, May 2019

Committee: Sara Worley, Advisor; Richard Anderson, Graduate Faculty Representative; Theodore Bach; Michael Bradie; Michael Weber

ABSTRACT

In this dissertation, I construct scientifically and practically adequate moral analogues of cognitive heuristics and biases. Cognitive heuristics are reasoning "shortcuts" that are efficient but flawed. Such flaws yield systematic judgment errors, cognitive biases. For example, the availability heuristic infers an event's probability from how easy it is to recall similar events. Since dramatic events like airplane crashes are disproportionately easy to recall, this heuristic explains systematic overestimation of their probability (availability bias). The research program on cognitive heuristics and biases (e.g., Daniel Kahneman's work) has been scientifically successful and has yielded useful error-prevention techniques, cognitive debiasing. I attempt to apply this framework to moral reasoning to yield moral heuristics and biases. For instance, a moral bias of unjustified differences in animal-species treatment might be partially explained by a moral heuristic that dubiously infers animals' moral status from their aesthetic features. While the basis for identifying judgments as cognitive errors is often unassailable (e.g., per violating laws of logic), identifying moral errors seemingly requires appealing to moral truth, which, I argue, is problematic within science. Such appeals can be avoided by repackaging moral theories as mere "standards-of-interest" (à la non-normative metrics of purportedly right-making features/properties).
However, standards-of-interest do not provide authority, which is needed for effective debiasing. Nevertheless, since each person deems their own subjective morality authoritative, subjective morality (qua standard-of-interest, not moral subjectivism) satisfies both scientific and practical concerns. As such, (idealized) subjective morality grounds a moral analogue of cognitive biases, subjective moral biases (e.g., committed non-racists unconsciously discriminating). I also argue that cognitive heuristic is defined by its contrast with rationality. Consequently, heuristics explain biases, which are also so defined. However, this property is causally-irrelevant to cognition. This frustrates heuristic's presumed usefulness in causal explanation, wherein categories should be defined by causally-efficacious properties. In the moral case, I therefore jettison this role and tailor categories solely to contrastive explanations. Accordingly, "moral heuristic" is replaced with subjective moral fallacy, which is defined by its contrast with subjective morality and explains subjective moral biases. The resultant subjective moral biases and fallacies framework can undergird future empirical research.

To my grandmother, mother, and father for all their unwavering support.

ACKNOWLEDGMENTS

This dissertation would not have been possible without the support and guidance of many people. First, I would like to thank my committee: Michael Weber, Michael Bradie, Theodore Bach, Richard Anderson, and especially my advisor, Sara Worley, who has been a great teacher, mentor, and friend over these many years. My thanks to Margy Deluca for all her tireless help. I'd also like to thank two special friends and philosophical interlocutors, Peter Jaworski and Scott Simmons. Most of all, I would like to thank my grandmother, mother, and father for all their love and unwavering support.
TABLE OF CONTENTS

CHAPTER 1: INTRODUCTION .... 1
  1. Objective: Developing Moral Analogues of Cognitive Heuristics and Biases .... 1
  2. Potential Examples .... 7
  3. Plausibility .... 10
    3.1. Per Capacities & Processes .... 11
    3.2. Per Paradigm Applications .... 15
  4. Potential Benefits .... 17
CHAPTER 2: BACKGROUND .... 21
  1. Cognitive Psychology's (Default) Explanatory Paradigm .... 21
  2. Cognitive Heuristics & Biases .... 35
    2.1. Strong-DRCT .... 35
    2.2. Weak-DRCT .... 39
    2.3. Heuristics & Biases .... 41
    2.4. Representativeness Heuristic .... 44
    2.5. The Linda Problem .... 46
  3. Judgments/Decisions/Actions .... 51
CHAPTER 3: ADAPTING 'COGNITIVE HEURISTIC' .... 57
  1. Worry: Is 'Cognitive Heuristic' an Adequate Basis for a Moral Analogue? .... 57
  2. Does the Kind, Cognitive Heuristic, Make a Causal Explanatory Contribution? .... 59
    2.1. Is SCIRA a Causally-Efficacious Property? .... 62
    2.2. Is the Cognitive System Directly-Sensitive to SCIRA? .... 63
    2.3. Is the Cognitive System Indirectly-Sensitive to SCIRA? .... 65
  3. Is Cognitive Heuristic an Objective Kind? .... 71
    3.1. System-Intrinsic Objectivity .... 72
  4. Does the Kind, Cognitive Heuristic, Contribute to Why-Have Explaining? .... 75
    4.1. Selection per SCIRA via the But-For Condition? .... 76
    4.2. Selection per SCIRA via SCIRA-Score? .... 79
  5. Does the Kind, Cognitive Heuristic, Contribute to Explaining Cognitive Biases? .... 82
  6. Conjuncts of SCIRA .... 91
  7. 'Moral Fallacy' .... 97
CHAPTER 4: ADAPTING 'COGNITIVE BIAS' .... 99
  1. Worry: Scientifically-Admissible Moral Standard? .... 99
    1.1. Adapt Ideal-Theoretical-Rationality? .... 101
    1.2. Adapt A Posteriori Knowledge? .... 105
  2. Viable Solution: Ideal Instrumental Moral Rationality .... 106
    2.1. Independence from the True, Best, and/or Real Morality .... 110
  3. Subjective Moral Error .... 114
    3.1. Idealization .... 116
    3.2. Extensional Adequacy & Non-Alienation .... 118
    3.3. Counterfactual Endowments .... 120
    3.4. Two-Tiered Subjective Idealization .... 129
  4. Why Subjective Morality? .... 132
    4.1. Species of Instrumental Rationality .... 132
    4.2. The Best Standard for the MBF Program .... 134
    4.3. Subjective Morality and Moral Improvement .... 138
CHAPTER 5: CLOSING REMARKS .... 142
REFERENCES .... 147

LIST OF FIGURES

  1. Victor metal pedal rat trap M200 .... 25

LIST OF ACRONYMS

  CHB: Cognitive heuristics and biases
  MHB: Moral heuristics and biases
  MBF: Moral biases and fallacies
  SMBF: Subjective moral biases and fallacies
  SCIRA: Shortcut-vis-à-vis-the-ideally-rational-algorithm
  MEIRA ............
Recommended publications
  • A Task-Based Taxonomy of Cognitive Biases for Information Visualization
    A Task-based Taxonomy of Cognitive Biases for Information Visualization
    Evanthia Dimara, Steven Franconeri, Catherine Plaisant, Anastasia Bezerianos, and Pierre Dragicevic

    Three kinds of limitations: the computer, the display, and the human. Human vision has limitations; human reasoning has limitations.

    Perceptual bias: magnitude estimation; color perception.

    Cognitive bias: behaviors in which humans consistently behave irrationally. Pohl's criteria distilled:
    • Are predictable and consistent
    • People are unaware they're doing them
    • Are not misunderstandings

    Ambiguity effect, Anchoring or focalism, Anthropocentric thinking, Anthropomorphism or personification, Attentional bias, Attribute substitution, Automation bias, Availability heuristic, Availability cascade, Backfire effect, Bandwagon effect, Base rate fallacy or Base rate neglect, Belief bias, Ben Franklin effect, Berkson's paradox, Bias blind spot, Choice-supportive bias, Clustering illusion, Compassion fade, Confirmation bias, Congruence bias, Conjunction fallacy, Conservatism (belief revision), Continued influence effect, Contrast effect, Courtesy bias, Curse of knowledge, Declinism, Decoy effect, Default effect, Denomination effect, Disposition effect, Distinction bias, Dread aversion, Dunning–Kruger effect, Duration neglect, Empathy gap, End-of-history illusion, Endowment effect, Exaggerated expectation, Experimenter's or expectation bias,
  • The Availability Heuristic
    CHAPTER 11: THE AVAILABILITY HEURISTIC

    According to Amos Tversky and Daniel Kahneman (1974, p. 1127), the availability heuristic is a rule of thumb in which decision makers "assess the frequency of a class or the probability of an event by the ease with which instances or occurrences can be brought to mind." Usually this heuristic works quite well; all things being equal, common events are easier to remember or imagine than are uncommon events. By relying on availability to estimate frequency and probability, decision makers are able to simplify what might otherwise be very difficult judgments. As with any heuristic, however, there are cases in which the general rule of thumb breaks down and leads to systematic biases. Some events are more available than others not because they tend to occur frequently or with high probability, but because they are inherently easier to think about, because they have taken place recently, because they are highly emotional, and so forth. This chapter examines three general questions: (1) What are instances in which the availability heuristic leads to biased judgments? (2) Do decision makers perceive an event as more likely after they have imagined it happening? (3) How is vivid information different from other information?

    AVAILABILITY GOES AWRY

    Which is a more likely cause of death in the United States: being killed by falling airplane parts or by a shark? Most people rate shark attacks as more probable than death from falling airplane parts (see Item #7 of the Reader Survey for your answer). Shark attacks certainly receive more publicity than do deaths from falling airplane parts, and they are far easier to imagine (thanks in part to movies such as Jaws).
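The failure mode described in the excerpt (ease of recall standing in for frequency, with publicity inflating recall) can be sketched as a toy model. All numbers below are invented for illustration, not data from the chapter; `publicity` is a hypothetical recall multiplier:

```python
# Hypothetical true annual death counts (invented, illustrative only;
# chosen so falling airplane parts are far deadlier than sharks).
true_counts = {"shark attack": 5, "falling airplane parts": 150}

# Hypothetical publicity multiplier: how much easier each instance
# of a cause is to bring to mind (news coverage, movies, vividness).
publicity = {"shark attack": 50.0, "falling airplane parts": 1.0}

def true_share(cause):
    """Actual relative frequency of the cause."""
    return true_counts[cause] / sum(true_counts.values())

def availability_estimate(cause):
    """Judged relative frequency: recalled instances = count x publicity."""
    recalled = {c: true_counts[c] * publicity[c] for c in true_counts}
    return recalled[cause] / sum(recalled.values())

for cause in true_counts:
    print(f"{cause}: true share {true_share(cause):.2f}, "
          f"availability-based judgment {availability_estimate(cause):.2f}")
```

With these made-up numbers the judged share of shark-attack deaths vastly exceeds the true share: the heuristic "goes awry" exactly when ease of recall and frequency come apart.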
  • Nudge Theory
    Nudge Theory
    http://www.businessballs.com/nudge-theory.htm

    Nudge theory is a flexible and modern concept for:
    • understanding how people think, make decisions, and behave,
    • helping people improve their thinking and decisions,
    • managing change of all sorts, and
    • identifying and modifying existing unhelpful influences on people.

    Nudge theory was named and popularized by the 2008 book 'Nudge: Improving Decisions About Health, Wealth, and Happiness', written by American academics Richard H Thaler and Cass R Sunstein. The book is based strongly on the Nobel prize-winning work of the Israeli-American psychologists Daniel Kahneman and Amos Tversky. This article:
    • reviews and explains Thaler and Sunstein's 'Nudge' concept, especially 'heuristics' (tendencies for humans to think and decide instinctively and often mistakenly)
    • relates 'Nudge' methods to other theories and models, and to Kahneman and Tversky's work
    • defines and describes additional methods of 'nudging' people and groups
    • extends the appreciation and application of 'Nudge' methodology to broader change-management, motivation, leadership, coaching, counselling, parenting, etc.
    • offers 'Nudge' methods and related concepts as a 'Nudge' theory 'toolkit' so that the concept can be taught and applied in a wide range of situations involving relationships with people, enabling people to improve their thinking and decision-making
    • and offers a glossary of Nudge theory and related terms

    'Nudge' theory was proposed originally in US 'behavioral economics', but it can be adapted and applied much more widely for enabling and encouraging change in people, groups, or yourself. Nudge theory can also be used to explore, understand, and explain existing influences on how people behave, especially influences which are unhelpful, with a view to removing or altering them.
  • The Situational Character: A Critical Realist Perspective on the Human Animal, 93 Geo. L. J. 1 (2004)
    Santa Clara Law, Santa Clara Law Digital Commons, Faculty Publications, Faculty Scholarship, 11-2004

    The Situational Character: A Critical Realist Perspective on the Human Animal
    Jon Hanson, Santa Clara University School of Law
    David Yosifon, Santa Clara University School of Law

    Follow this and additional works at: https://digitalcommons.law.scu.edu/facpubs
    Part of the Law and Economics Commons, Law and Society Commons, and the Legal History Commons

    Automated Citation: Jon Hanson and David Yosifon, The Situational Character: A Critical Realist Perspective on the Human Animal, 93 Geo. L. J. 1 (2004), available at: https://digitalcommons.law.scu.edu/facpubs/59

    This Article is brought to you for free and open access by the Faculty Scholarship at Santa Clara Law Digital Commons. It has been accepted for inclusion in Faculty Publications by an authorized administrator of Santa Clara Law Digital Commons. For more information, please contact [email protected], [email protected].

    Articles
    The Situational Character: A Critical Realist Perspective on the Human Animal
    JON HANSON* & DAVID YOSIFON**

    This Article is dedicated to retiring the now-dominant "rational actor" model of human agency, together with its numerous "dispositionist" cohorts, and replacing them with a new conception of human agency that the authors call the "situational character." This is a key installment of a larger project recently introduced in an article titled The Situation: An Introduction to the Situational Character, Critical Realism, Power Economics, and Deep Capture. That introductory article adumbrated, often in broad stroke, the central premises and some basic conclusions of a new approach to legal theory and policy analysis.
  • Working Memory, Cognitive Miserliness and Logic As Predictors of Performance on the Cognitive Reflection Test
    Working Memory, Cognitive Miserliness and Logic as Predictors of Performance on the Cognitive Reflection Test

    Edward J. N. Stupple ([email protected]), Centre for Psychological Research, University of Derby, Kedleston Road, Derby, DE22 1GB
    Maggie Gale ([email protected]), Centre for Psychological Research, University of Derby, Kedleston Road, Derby, DE22 1GB
    Christopher R. Richmond ([email protected]), Centre for Psychological Research, University of Derby, Kedleston Road, Derby, DE22 1GB

    Abstract: The Cognitive Reflection Test (CRT) was devised to measure the inhibition of heuristic responses to favour analytic ones. Toplak, West and Stanovich (2011) demonstrated that the CRT was a powerful predictor of heuristics and biases task performance, proposing it as a metric of the cognitive miserliness central to dual process theories of thinking. This thesis was examined using reasoning response-times, normative responses from two reasoning tasks and working memory capacity (WMC) to predict individual differences in performance on the CRT. These data offered limited support for the view of miserliness as the primary factor in the CRT.

    Most participants respond that the answer is 10 cents; however, a slower and more analytic approach to the problem reveals the correct answer to be 5 cents. The CRT has been a spectacular success, attracting more than 100 citations in 2012 alone (Scopus). This may be in part due to the ease of administration; with only three items and no requirement for expensive equipment, the practical advantages are considerable. There have, moreover, been numerous correlates of the CRT demonstrated, from a wide range of tasks in the heuristics and biases literature (Toplak et al., 2011) to risk aversion and SAT scores (Frederick,
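The 10-cents/5-cents answers in the excerpt refer to the CRT's best-known item, the bat-and-ball problem: a bat and a ball cost $1.10 in total, the bat costs $1.00 more than the ball, how much does the ball cost? The arithmetic behind the two answers can be checked directly:

```python
# Bat-and-ball item: ball + bat = 110 cents, bat = ball + 100 cents.

def satisfies_both_constraints(ball_cents):
    bat_cents = ball_cents + 100          # bat costs $1.00 more than the ball
    return ball_cents + bat_cents == 110  # together they cost $1.10

print(satisfies_both_constraints(10))  # intuitive answer -> False (total would be $1.20)
print(satisfies_both_constraints(5))   # analytic answer  -> True

# Solving algebraically: ball + (ball + 100) = 110  =>  2 * ball = 10  =>  ball = 5 cents.
ball_cents = (110 - 100) // 2
print(ball_cents)  # -> 5
```

The intuitive "10 cents" satisfies only the total-cost constraint if one misreads "the bat costs $1.00" for "the bat costs $1.00 more"; checking both constraints is exactly the inhibition of the heuristic response that the CRT measures.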
  • Heuristics and Biases: The Psychology of Intuitive Judgment
    P1: FYX/FYX P2: FYX/UKS QC: FCH/UKS T1: FCH CB419-Gilovich CB419-Gilovich-FM May 30, 2002 12:3

    HEURISTICS AND BIASES: The Psychology of Intuitive Judgment
    Edited by THOMAS GILOVICH, Cornell University; DALE GRIFFIN, Stanford University; DANIEL KAHNEMAN, Princeton University

    Published by the Press Syndicate of the University of Cambridge, The Pitt Building, Trumpington Street, Cambridge, United Kingdom. Cambridge University Press: The Edinburgh Building, Cambridge CB2 2RU, UK; 40 West 20th Street, New York, NY 10011-4211, USA; 477 Williamstown, Port Melbourne, VIC 3207, Australia; Ruiz de Alarcón 13, 28014, Madrid, Spain; Dock House, The Waterfront, Cape Town 8001, South Africa. http://www.cambridge.org

    © Cambridge University Press 2002. This book is in copyright. Subject to statutory exception and to the provisions of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press. First published 2002. Printed in the United States of America. Typeface Palatino 9.75/12.5 pt. System LaTeX 2ε [TB]

    A catalog record for this book is available from the British Library. Library of Congress Cataloging in Publication data: Heuristics and biases : the psychology of intuitive judgment / edited by Thomas Gilovich, Dale Griffin, Daniel Kahneman. p. cm. Includes bibliographical references and index. ISBN 0-521-79260-6 hardback; ISBN 0-521-79679-2 paperback. 1. Judgment. 2. Reasoning (Psychology) 3. Critical thinking. I. Gilovich, Thomas. II. Griffin, Dale. III. Kahneman, Daniel, 1934– BF447 .H48 2002 153.4 – dc21 2001037860

    Contents: List of Contributors, page xi; Preface, xv; Introduction – Heuristics and Biases: Then and Now, 1, Thomas Gilovich and Dale Griffin. PART ONE.
  • Cognitive Biases in Software Engineering: a Systematic Mapping Study
    Cognitive Biases in Software Engineering: A Systematic Mapping Study
    Rahul Mohanani, Iflaah Salman, Burak Turhan, Member, IEEE, Pilar Rodriguez and Paul Ralph

    Abstract—One source of software project challenges and failures is the systematic errors introduced by human cognitive biases. Although extensively explored in cognitive psychology, investigations concerning cognitive biases have only recently gained popularity in software engineering research. This paper therefore systematically maps, aggregates and synthesizes the literature on cognitive biases in software engineering to generate a comprehensive body of knowledge, understand state of the art research and provide guidelines for future research and practice. Focusing on bias antecedents, effects and mitigation techniques, we identified 65 articles (published between 1990 and 2016), which investigate 37 cognitive biases. Despite strong and increasing interest, the results reveal a scarcity of research on mitigation techniques and poor theoretical foundations in understanding and interpreting cognitive biases. Although bias-related research has generated many new insights in the software engineering community, specific bias mitigation techniques are still needed for software professionals to overcome the deleterious effects of cognitive biases on their work.

    Index Terms—Antecedents of cognitive bias, cognitive bias, debiasing, effects of cognitive bias, software engineering, systematic mapping.

    1 INTRODUCTION. Cognitive biases are systematic deviations from optimal reasoning [1], [2]. In other words, they are recurring errors in thinking, or patterns of bad judgment observable in different people and contexts. A well-known example is confirmation bias—the tendency to pay more at-

    [...] knowledge. No analogous review of SE research exists. The purpose of this study is therefore as follows: Purpose: to review, summarize and synthesize the current state of software engineering research involving cognitive biases.
  • Mind Perception (Daniel R. Ames & Malia F. Mason, Columbia University)
    Mind Perception
    Daniel R. Ames, Malia F. Mason, Columbia University
    To appear in The Sage Handbook of Social Cognition, S. Fiske and N. Macrae (Eds.). Please do not cite or circulate without permission.
    Contact: Daniel Ames, Columbia Business School, 707 Uris Hall, 3022 Broadway, New York, NY 10027, [email protected]

    What will they think of next? The contemporary colloquial meaning of this phrase often stems from wonder over some new technological marvel, but we use it here in a wholly literal sense as our starting point. For millions of years, members of our evolving species have gazed at one another and wondered: what are they thinking right now … and what will they think of next? The interest people take in each other's minds is more than idle curiosity. Two of the defining features of our species are our behavioral flexibility—an enormously wide repertoire of actions with an exquisitely complicated and sometimes non-obvious connection to immediate contexts— and our tendency to live together. As a result, people spend a terrific amount of time in close company with conspecifics doing potentially surprising and bewildering things. Most of us resist giving up on human society and embracing the life of a hermit. Instead, most perceivers proceed quite happily to explain and predict others' actions by invoking invisible qualities such as beliefs, desires, intentions, and feelings and ascribing them without conclusive proof to others. People cannot read one another's minds. And yet somehow, many times each day, most people encounter other individuals and "go mental," as it were, adopting what is sometimes called an intentional stance, treating the individuals around them as if they were guided by unseen and unseeable mental states (Dennett, 1987).
  • CHALK TALK Thinking About Thinking: Medical Decision Making Under the Microscope, Christiana Iyasere, MD, and Douglas Wright, MD, PhD
    SGIM FORUM 2011; 34(11)
    CHALK TALK
    Thinking about Thinking: Medical Decision Making Under the Microscope
    Christiana Iyasere, MD, and Douglas Wright, MD, PhD
    Drs. Iyasere and Wright are faculty in the Inpatient Clinician Educator Service of the Department of Medicine at Massachusetts General Hospital in Boston, MA.

    Case: A 36-year-old African-American woman, healthy except for treated hypothyroidism, visits you in clinic complaining of six months of fatigue and progressive shortness of breath with exertion. You thoroughly interview and examine the patient. Physical examination reveals conjunctival pallor and dullness to percussion one third of the way up both lung fields. Something tells you to ask her [...]

    [...] athletic—he graduated from college with a degree in physics and has completed several triathlons. Neil is a veteran of the US Navy, where he served as fleet naval aviator and landing signal officer. Is Neil more likely to be: a) a librarian or b) an astronaut? Question 2: Jot down a list of English words that begin with the letter "r" (e.g. rooster). Next, jot down a list of words that have an r in the [...]

    [...] than 10 seconds and says "20,160" before moving breezily along with her coffee. Having Katrina's input, are you tempted to change your answer to questions 3a and 3b? Go ahead, admit it. Aren't you now more confident that the correct answer is that the product is closest to 20,000? Question 5: You have known your medical school roommate Justice for four years.
  • NAWJ Psych Terms
    NAWJ Terms List 1
    Psychological terms useful in understanding mechanisms allowing unconscious bias

    Accentuation effect: Overestimation of similarities among people within a group and dissimilarities between people from different groups.
    Accentuation principle: States that categorization accentuates perceived similarities within and differences between groups on dimensions that people believe are correlated with the category. The effect is amplified where the categorization/dimension has subjective importance, relevance or value.
    Actor-observer effect: Tendency to attribute our own behaviors externally and others' behaviors internally.
    Agentic mode: State of mind thought by Milgram to characterize unquestioning obedience, in which people transfer personal responsibility to the person giving orders.
    Anchoring and adjustment: A cognitive short-cut in which inferences are tied to initial standards or schemas.
    Attitude: A relatively enduring organization of beliefs, feelings and behavioral tendencies towards socially significant objects, groups, events or symbols. Attitude change can occur by inducing someone to perform an act that runs counter to an existing attitude.
    Attribution: The process of assigning a cause to behaviors and events.
    Availability bias: A cognitive shortcut in which the frequency or likelihood of an event is based on how quickly instances or associations come to mind.
    Bias blind spot: Tendency to perceive cognitive and motivational biases much more in others than in oneself.
    Cognition: The knowledge, beliefs, thoughts, and ideas
  • 50 Cognitive and Affective Biases in Medicine (Alphabetically)
    50 Cognitive and Affective Biases in Medicine (alphabetically)
    Pat Croskerry MD, PhD, FRCP(Edin), Critical Thinking Program, Dalhousie University

    Aggregate bias: when physicians believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to individual patients (especially their own), they are exhibiting the aggregate fallacy. The belief that their patients are atypical or somehow exceptional may lead to errors of commission, e.g. ordering x-rays or other tests when guidelines indicate none are required.

    Ambiguity effect: there is often an irreducible uncertainty in medicine, and ambiguity is associated with uncertainty. The ambiguity effect is due to decision makers avoiding options when the probability is unknown. In considering options on a differential diagnosis, for example, this would be illustrated by a tendency to select options for which the probability of a particular outcome is known over an option for which the probability is unknown. The probability may be unknown because of lack of knowledge or because the means to obtain the probability (a specific test, or imaging) is unavailable. The cognitive miser function (choosing an option that requires less cognitive effort) may also be at play here.

    Anchoring: the tendency to perceptually lock on to salient features in the patient's initial presentation too early in the diagnostic process, and failure to adjust this initial impression in the light of later information. This bias may be severely compounded by the confirmation bias.

    Ascertainment bias: when a physician's thinking is shaped by prior expectation; stereotyping and gender bias are both good examples.

    Attentional bias: the tendency to believe there is a relationship between two variables when instances are found of both being present.
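Anchoring as defined in the list (locking on to an initial impression and failing to adjust it in light of later information) is often modeled as partial adjustment from the anchor toward the evidence. A minimal sketch; the `adjustment_rate` of 0.4 is an arbitrary illustrative choice, not a value from this list:

```python
def anchored_estimate(anchor, evidence, adjustment_rate=0.4):
    """Adjust from the initial anchor toward the evidence, but only partially."""
    return anchor + adjustment_rate * (evidence - anchor)

# Same evidence (value 50), different starting anchors:
print(anchored_estimate(anchor=10, evidence=50))  # -> 26.0 (dragged low)
print(anchored_estimate(anchor=90, evidence=50))  # -> 74.0 (dragged high)
```

The final judgments differ even though the evidence is identical, which is the signature of the bias; as the entry notes, confirmation bias can then compound the error by locking the initial impression in.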
  • Croskerry MD, PhD, FRCP(Edin)
    Clinical Decision Making + Strategies for Cognitive Debiasing
    Pat Croskerry MD, PhD, FRCP(Edin)
    International Association of Endodontists, Scottsdale, Arizona, June 2019
    Financial Disclosures or other Conflicts of Interest: None

    It is estimated that an American adult makes 35,000 decisions a day, i.e. about 2,200 each waking hour (Sollisch J: The cure for decision fatigue. Wall Street Journal, 2016).

    Decision making: 'The most important decision we need to make in Life is how we are going to make decisions' (Professor Gigerenzer). Is there a problem with the way we think and make decisions? Three domains of decision making: patients, healthcare leadership, healthcare providers.

    Patients. Leading Medical Causes of Death in the US and their Preventability in 2000 (Keeney, 2008):

    Cause                   Total     Preventability (%)
    Heart disease           710,760   46
    Malignant neoplasms     553,091   66
    Cerebrovascular         167,661   43
    Chronic respiratory     122,009   76
    Accidents                97,900   44
    Diabetes mellitus        69,301   33
    Acute respiratory        65,313   23
    Suicide                  29,350   100
    Chronic liver disease    26,552   75
    Hypertension/renal       12,228   68
    Assault (homicide)       16,765   100
    All other               391,904   14

    Healthcare leadership (Campbell et al, 2017). Healthcare providers: US deaths in 2013 were 611,105 from heart disease, 584,881 from cancer, and 251,454 from medical error. Medical error is the 3rd leading cause of death. Estimated number of preventable hospital deaths due to diagnostic failure annually in the US: 40,000-80,000 (Leape, Berwick and Bates, JAMA 2002). Diagnostic failure is the biggest problem in patient safety (Newman-Toker, 2017). Sources of Diagnostic Failure: The System 25%, The Individual 75% (Graber M, Gordon R, Franklin N.
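The preventability slide pairs each cause of death with a preventability percentage; multiplying the two gives the implied number of preventable deaths per cause. A sketch of that arithmetic, with the totals and percentages transcribed from the slide (the per-cause products and the grand total are my computation, not figures from the talk):

```python
# (cause, total US deaths in 2000, preventability %) — transcribed from the slide.
causes = [
    ("Heart disease", 710_760, 46),
    ("Malignant neoplasms", 553_091, 66),
    ("Cerebrovascular", 167_661, 43),
    ("Chronic respiratory", 122_009, 76),
    ("Accidents", 97_900, 44),
    ("Diabetes mellitus", 69_301, 33),
    ("Acute respiratory", 65_313, 23),
    ("Suicide", 29_350, 100),
    ("Chronic liver disease", 26_552, 75),
    ("Hypertension/renal", 12_228, 68),
    ("Assault (homicide)", 16_765, 100),
    ("All other", 391_904, 14),
]

for name, total, pct in causes:
    preventable = round(total * pct / 100)
    print(f"{name:22s} {preventable:>9,} of {total:>9,} preventable")

total_preventable = sum(round(t * p / 100) for _, t, p in causes)
print(f"Total estimated preventable deaths: {total_preventable:,}")
```

On the slide's own figures this puts the estimated preventable deaths above one million for the year 2000, which is the point the talk builds toward before turning to diagnostic failure specifically.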