
NO EVIDENCE THAT OMISSION AND CONFIRMATION BIASES AFFECT THE PERCEPTION AND RECALL OF VACCINE-RELATED INFORMATION

This is a preliminary draft

Ángel V. Jiménez 1,2,3,*, Alex Mesoudi 3 & Jamshid J. Tehrani 1,2

1 Durham University, UK, Centre for the Coevolution of Biology and Culture, Department of Anthropology.
2 Durham University, UK, Conspiracy Theories in Health Special Interest Group, Wolfson Research Institute for Health and Wellbeing.
3 Human Behaviour and Cultural Evolution Group, Department of Biosciences, University of Exeter, UK.

*Correspondence should be addressed to Ángel V. Jiménez: [email protected]

ABSTRACT

Despite the spectacular success of vaccines in preventing infectious diseases, fears about their safety and other anti-vaccination claims are widespread. To better understand how such fears and claims persist and spread, we must understand how they are perceived and recalled. One influence on the perception and recall of vaccination-related information might be universal cognitive biases acting against vaccination. An omission bias describes the tendency to perceive as worse, and recall better, bad outcomes resulting from commissions (e.g. vaccine side effects) compared to the same bad outcomes resulting from omissions (e.g. symptoms of vaccine-preventable diseases). Another factor influencing the perception and recall of vaccination-related information might be people’s attitudes towards vaccines. A confirmation bias would mean that pre-existing pro-vaccination attitudes positively predict perceptions of severity and recall of symptoms of vaccine-preventable diseases, and negatively predict perceptions of severity and recall of vaccine side effects. To test for these hypothesized biases, 202 female participants aged 18-60 (M = 38.15, SD = 10.37) completed an online experiment with a between-subjects design. Participants imagined that they had a 1-year-old child who suffered from either vaccine side effects (Commission Condition) or symptoms of a vaccine-preventable disease (Omission Condition). They then rated a list of symptoms/side effects for perceived severity on a 7-point Likert scale. Finally, they completed a surprise recall test in which they recalled the symptoms/side effects they had previously rated. An additional scale measured their attitudes towards vaccines. Contrary to the hypotheses, perceptions of severity and recall of symptoms/side effects were not associated with experimental condition, failing to support the omission bias, nor did they interact with attitudes towards vaccines, failing to support the confirmation bias. This raises the possibility that the spread of anti-vaccination claims might not be explained by these particular universal cognitive biases.

KEY WORDS

Omission Bias, Confirmation Bias, Recall Bias, Antivaxx, Cultural Evolution, Evolutionary Psychology.

1.- INTRODUCTION

Ever since vaccination was developed in the late 18th century, it has been accompanied by scepticism and opposition. Early resistance to the first vaccine, against smallpox, was motivated by fears towards a practice that conflicted with folk intuitions and religious beliefs (Arnold, 1993; Baldwin, 1999; Durbach, 2005).
Smallpox vaccination entailed the introduction of an alien substance into the human bloodstream, which was seen as a violation of bodily purity and as the cause of, rather than the cure for, disease (Durbach, 2005). Moreover, the mechanism by which vaccination against smallpox worked was counterintuitive, as it implied that intentional infection with one disease (cowpox) could prevent the future development of a related but different disease (smallpox) (Baldwin, 1999). Furthermore, smallpox vaccination involved mixing a substance from a cow with human blood, which was considered unnatural and anti-Christian in Europe (Baldwin, 1999) and was also disturbing to Hindus in colonial India (Arnold, 1993).

More recently, specific vaccines (e.g. the MMR vaccine) have been the target of opposition and have often been falsely linked to idiopathic medical conditions such as autism, neurological disorders and allergies (Burgess, Burgess, & Leask, 2006; Leask, Chapman, & Robbins, 2010; Offit, 2008). Anti-vaccination messages recurrently emerge and spread through the mass media (Larson, Wilson, Hanley, Parys, & Paterson, 2014; Leask & Chapman, 1998), social media (Salathé & Khandelwal, 2011) and anti-vaccination websites (Bean, 2011; Davies, Chapman, & Leask, 2002; Kata, 2010; Wolfe, Sharp, & Lipsky, 2002; Zimmerman et al., 2005). This spread of anti-vaccination messages affects people’s decisions about whether or not to vaccinate (Betsch, Renkewitz, Betsch, & Ulshöfer, 2010; Jolley & Douglas, 2014). As a consequence of the erosion of herd immunity and the formation of clusters of unvaccinated individuals (Salathé & Bonhoeffer, 2008), outbreaks of vaccine-preventable infectious diseases still occur in developed countries, sometimes with fatal consequences.

The recurrence and potency of social fears about the safety of vaccines, and the widespread diffusion of anti-vaccination information despite the spectacular contribution of vaccines to preventing disease, call for a scientific explanation. Miton and Mercier (2015) recently argued that two characteristics of human cognition - disgust towards potential contaminants and an omission bias - make vaccination counterintuitive. This counterintuitiveness makes anti-vaccination messages “cognitively attractive” and likely to spread (see Boudry, Blancke, & Pigliucci, 2014 for a general discussion of how scientific and pseudoscientific ideas spread). In the present article we empirically test the role of one of these cognitive factors, the omission bias, i.e. the tendency to consider bad outcomes resulting from a commission (e.g. side effects of a vaccine) as worse than the same bad outcomes resulting from an omission (e.g. symptoms of a vaccine-preventable disease). While there is some evidence for the existence of an omission bias in the context of vaccination (see next section), our study is novel in extending the scope of application of the omission bias to perceptions of the severity of vaccine-related symptoms and side effects and to their recall. Consequently, we predict that people should rate as more severe, and retain longer in memory, vaccine side effects over symptoms of a vaccine-preventable disease, where the symptoms and the side effects are otherwise identical.
These perceptual and memory effects relating to vaccine-related information (symptoms and side effects) would play a key role in the retention and transmission of anti-vaccination messages and opinions. Our study is one of the first experiments exploring the effects of cognitive biases on the recall of vaccine-related information (see also Altay & Mercier, 2018; Jiménez, Stubbersfield, & Tehrani, 2018).

1.1.- The Omission Bias in Vaccine Decisions

A substantial body of literature maintains that humans have a general tendency to consider the bad outcomes resulting from an action (commission) as worse than the same bad outcomes resulting from a lack of action (omission), even when the bad outcomes resulting from an omission affect a greater number of people or have a higher probability of occurrence (e.g. Spranca, Minsk, & Baron, 1991). The basic paradigm for studying the omission bias applied to vaccine decisions was developed by Ritov and Baron (1990). Participants are prompted to imagine a scenario in which an infectious disease kills 10 out of 10,000 non-vaccinated children, but a new vaccine confers immunity against this disease. However, the side effects of this vaccine can result in death with a certain probability. Participants have to decide whether or not to vaccinate their child when the probability of death from the vaccine side effects is 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 or 10 out of 10,000 vaccinated children. The omission bias is detected when participants prefer not to vaccinate even when the probability of dying from vaccine side effects is lower than the probability of dying from the vaccine-preventable infectious disease. Multiple experiments have found that many people prefer not to vaccinate themselves or their children even when the risk of suffering from a vaccine-preventable disease is higher than the risk of suffering from a vaccine reaction, where the severity of the disease and of the vaccine reaction are equal (Asch et al., 1994; Baron, 1992; Baron & Ritov, 1994, 2004; Connolly & Reb, 2003; DiBonaventura & Chapman, 2008; Meszaros et al., 1996; Ritov & Baron, 1990; Wroe, Turner, & Salkovskis, 2004). Importantly, the omission bias in hypothetical vaccine decisions positively predicts actual vaccination behaviour in both retrospective (Asch et al., 1994; Meszaros et al., 1996; Wroe et al., 2004) and prospective studies (DiBonaventura & Chapman, 2008).

One of the cognitive mechanisms that might explain the omission bias is anticipated regret. Some authors (e.g. Asch et al., 1994) have suggested that bad outcomes resulting from actions elicit considerably more regret than the same bad outcomes resulting from a lack of action. The higher level of regret after bad outcomes
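To make the detection rule of the Ritov and Baron (1990) paradigm concrete, here is a minimal sketch of how a participant’s responses could be classified. This is an illustration under stated assumptions, not the original authors’ materials or analysis code: the function name, variable names and example responses are hypothetical.

```python
# Minimal sketch of the detection rule in the Ritov & Baron (1990)
# vaccination paradigm. Scenario: the disease kills 10 of 10,000
# non-vaccinated children; the vaccine is tested at side-effect death
# rates of 0-10 per 10,000 vaccinated children. All names and example
# responses below are hypothetical, for illustration only.

DISEASE_DEATHS_PER_10K = 10  # deaths per 10,000 non-vaccinated children


def shows_omission_bias(min_refused_risk_per_10k: int) -> bool:
    """Classify a participant against the paradigm's criterion.

    min_refused_risk_per_10k is the lowest side-effect death rate
    (per 10,000 vaccinated children) at which the participant chooses
    NOT to vaccinate; a value above 10 means they accept the vaccine
    at every tested risk level. The omission bias is detected when the
    participant refuses the vaccine at a side-effect risk strictly
    lower than the disease risk.
    """
    return min_refused_risk_per_10k < DISEASE_DEATHS_PER_10K


# Hypothetical responses: A refuses once the vaccine risk reaches
# 5/10,000 (omission bias); B refuses only when the risks are equal at
# 10/10,000 (no bias); C vaccinates at every tested risk level (no bias).
responses = {"A": 5, "B": 10, "C": 11}
for participant, min_refused in responses.items():
    print(participant, shows_omission_bias(min_refused))
```

On this criterion, a decision-maker who simply minimised expected deaths would vaccinate at every side-effect risk below 10 per 10,000, so any refusal below that threshold indicates weighting harms from commission more heavily than identical harms from omission.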