Hawkish Biases∗

Daniel Kahneman
Eugene Higgins Professor of Psychology, Emeritus
Professor of Psychology and Public Affairs, Emeritus
Woodrow Wilson School, Princeton University
[email protected]

Jonathan Renshon
Weatherhead Center for International Affairs
Department of Government, Harvard University
[email protected]

Daniel Kahneman is a Senior Scholar at the Woodrow Wilson School of Public and International Affairs and a fellow of the Center for Rationality at the Hebrew University in Jerusalem. Daniel Kahneman won the Nobel Prize in Economic Sciences in 2002 for his work on prospect theory and decision-making under uncertainty. Jonathan Renshon is a PhD candidate in the Department of Government at Harvard University. Renshon studies national security decision-making, foreign policy analysis and political psychology. He is the author of Why Leaders Choose War: The Psychology of Prevention (Greenwood, 2006), and his work has appeared in the Journal of Strategic Studies, Political Psychology, Foreign Policy and the Journal of Conflict Resolution.

∗ This essay appears in a slightly shorter and modified form in Trevor Thrall and Jane Cramer (eds.), American Foreign Policy and the Politics of Fear: Threat Inflation Since 9/11 (New York: Routledge Press, 2009), 79-96.

In a short article in Foreign Policy in 2007, we put forward a hypothesis of “directionality” in cognitive biases that generated a fair amount of debate and some controversy (Kahneman and Renshon 2007). This chapter is an attempt to expand upon and clarify our original argument.
In the last few decades, cognitive and social psychologists have described many cognitive biases – predictable errors in the ways that individuals interpret information and make decisions.[1] An unexpected and significant pattern emerges when these biases are viewed as a set: we find, almost without exception, that the biases recently uncovered by psychological research favor hawkish decisions in conflict situations. We use the term “hawkish” to describe a propensity for suspicion, hostility and aggression in the conduct of conflict, and for less cooperation and trust when the resolution of a conflict is on the agenda. Actors who are susceptible to hawkish biases are not only more likely to see threats as more dire than an objective observer would perceive, but are also likely to act in a way that will produce unnecessary conflict. We do not contend, of course, that all decisions in the international political context will be hostile or aggressive as a result of biases of cognition and preference, only that such decisions will occur more often than they would in the absence of bias. Nor do we contend that all suspicions are produced by biased thinking and are therefore unjustified. Our point is only that the collective effect of the biases that psychology has identified is to increase the probability that agents will act more hawkishly than an objective observer would deem appropriate. For a more concrete image, suppose that a national leader is exposed to conflicting advice from a hawk and a dove. Our contention is that cognitive biases will tend to make the hawk’s arguments more persuasive than they deserve to be. The biases discussed in this chapter pose a difficult methodological problem for political scientists, since we cannot “prove” a bias to have been at fault in any given decision. Instead, we define each bias and invite readers to consider its consequences in conflict situations by invoking a hypothetical objective observer.
We can only hope that this perspective allows dispassionate analysts to take that role, and that retrospective judgments are not overly tainted by the familiar hindsight bias, which makes it all too easy to explain past disasters by finding flaws in the decisions of actors who cannot defend themselves. The test of our approach is whether it offers historians and political scientists a useful heuristic. The biases that we list are to be viewed as hypotheses, to be confirmed by examining evidence of what the decision makers believed, desired or feared at the critical time. For example, students of conflict should expect to find evidence of unreasonably negative interpretations of the opponent’s intentions, and of overly optimistic assessments of the situation by both sides, because these biases have been established by prior research. In our original analysis, we compiled a list of known biases and proceeded to trace the implications of those biases for international conflict. This chapter proceeds in much the same way. For each bias, we first present empirical evidence illustrating how and when it is likely to affect judgment and decision-making. Note that the evidence on which we rely was documented in experimental situations that did not involve conflict – the biases are considered to be general features of cognition and preference.

[1] More specifically, a bias exists when an error in estimating or assessing a value is more likely in one direction than another. For example, in estimating our own skills as drivers, a bias exists because we are far more likely to over-estimate our abilities relative to others than to under-estimate them. In the absence of an objective criterion, we take the opinions of a knowledgeable but uninvolved observer as the definition of an unbiased judgment. The perspective of history provides another “unbiased” view of the situation that decision makers faced.
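The footnote’s definition of a bias as a directional error can be illustrated with a small simulation (a hypothetical sketch, not part of the chapter; the quantities and function names are invented for illustration): an unbiased estimator’s errors cancel out on average, while a biased estimator, like the self-assessed driving skill in the example, errs systematically in one direction.

```python
import random

random.seed(0)
TRUE_SKILL = 50  # the quantity being estimated (arbitrary units)

def unbiased_estimate():
    # Errors are symmetric: over- and under-estimates are equally likely.
    return TRUE_SKILL + random.gauss(0, 10)

def biased_estimate():
    # Errors skew upward: over-estimates are far more likely,
    # as with self-assessed driving skill.
    return TRUE_SKILL + random.gauss(8, 10)

n = 10_000
unbiased_mean_error = sum(unbiased_estimate() - TRUE_SKILL for _ in range(n)) / n
biased_mean_error = sum(biased_estimate() - TRUE_SKILL for _ in range(n)) / n

# The unbiased mean error hovers near zero; the biased one near +8.
print(round(unbiased_mean_error, 1), round(biased_mean_error, 1))
```

The point of the sketch is that individual errors are unavoidable in both cases; a bias exists only when the errors have a predictable direction.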
We then proceed to examine the potential behavioral implications of each bias in situations of international conflict. The biases and their main effects are listed in Table 1.

Table 1. Biases examined in this chapter

POSITIVE ILLUSIONS: Biased overconfidence raises the probability of violent conflict occurring and of deadlock in negotiations (when the parties overestimate their bargaining position or ability).

FAE: Perceive hostile actions of adversaries as due to unchanging, dispositional factors and discount the role of contextual factors; neglect the effects of one’s own hostility on the behavior of adversaries.

ILLUSION OF TRANSPARENCY: Ignore how one’s actions are likely to be perceived by others, resulting in behavior that is likely to provoke aggression or hostility.

ENDOWMENT EFFECT / LOSS AVERSION: Induces an aversion to making concessions and a reluctance to accept objectively “fair” exchanges.

RISK SEEKING IN LOSSES: Reluctance to settle; prolongation of conflict.

PSEUDO-CERTAINTY: Lowers the probability of concessions if there is a potential that those concessions might advantage an opponent in a possible future conflict, and concurrently raises the probability of conflict occurring by adopting a worst-case scenario of the other’s intentions.

REACTIVE DEVALUATION: Unconscious devaluation of offers, concessions or plans suggested by rivals or adversaries makes it difficult to reach agreement.

Positive Illusions

One of the most robust findings in cognitive and social psychology is that individuals often fall victim to “positive illusions.”[2] Among these positive illusions are: unrealistically positive views of one’s abilities and character, the illusion of control, and unrealistic optimism (Taylor and Brown 1988: 195-6). Unrealistically positive views of the self have been documented in many domains.
Among other things, most people believe themselves to be better than average drivers, decision-makers and negotiators (Svenson 1981: 143; Bazerman 1998: 69). A survey of university professors found that 94 percent believed themselves to be better teachers than the average at their institution, which of course is a statistical impossibility (Cross 1977). Because individuals resist information that conflicts with positive self-assessments, these unrealistically positive views of oneself are generally robust over time (Crary 1966: 246; Marks 1984: 203).

The “illusion of control” is an exaggerated perception of the extent to which outcomes depend on one’s actions. When people were given a button and instructed to cause a particular color to appear on the screen, they erroneously believed that they had substantial control over events, even when the outcomes were actually determined by a computer (Martin, Abramson et al. 1984). Experiments have shown that people act as if they can control the outcome of rolling a die, and are more willing to bet when they do the rolling themselves (Silverman 1964: 114; Langer 1975: 312, 324; Campbell 1986: 290). It has also been demonstrated that stress (common in conflict or crisis decision-making) increases the preference for strategies that engender a feeling of control, even if it is illusory, and even if it leads to worse outcomes (Friedland, Keinan et al. 1992: 923). In a competitive situation, the illusion of control causes each side to believe that the outcome of the competition depends mostly on its own actions and abilities, even when it depends equally on the achievements of competitors.

The third positive illusion is “unrealistic optimism.” The evidence for “illusory,” or biased, optimism comes from comparisons of individuals’ judgments of themselves and of others. People generally believe that the probability of positive outcomes (such as having a gifted child, or enjoying their first job) is higher for themselves than for their peers, and judge the probability of negative events (such as being the victim of a crime or being in a car accident) as less likely for themselves than for others (Robertson 1977: 136; Weinstein 1980: 806; Perloff and Fetzer 1986: 502). In addition, experimental evidence suggests that people’s predictions of what will occur correspond closely to what they would like to happen, rather than to what is objectively likely to occur (Sherman 1980: 211).

[2] There is an important caveat to the findings described in this section: there may well be cultural variation in unrealistic optimism. Some studies have found that individuals from societies that place less emphasis on the individual are less likely to evince “self-enhancing” biases such as unrealistic optimism. For example, one study found that Japanese participants were much less likely than Canadians to demonstrate unrealistic optimism. See Heine and Lehman (1995).