The Impact of Delegating Decision Making to IT on the Sunk Cost Effect
Recommended publications
-
The Psychology of Sunk Cost
ORGANIZATIONAL BEHAVIOR AND HUMAN DECISION PROCESSES 35, 124-140 (1985). The Psychology of Sunk Cost. HAL R. ARKES AND CATHERINE BLUMER, Ohio University.

The sunk cost effect is manifested in a greater tendency to continue an endeavor once an investment in money, effort, or time has been made. Evidence that the psychological justification for this behavior is predicated on the desire not to appear wasteful is presented. In a field study, customers who had initially paid more for a season subscription to a theater series attended more plays during the next 6 months, presumably because of their higher sunk cost in the season tickets. Several questionnaire studies corroborated and extended this finding. It is found that those who had incurred a sunk cost inflated their estimate of how likely a project was to succeed compared to the estimates of the same project by those who had not incurred a sunk cost. The basic sunk cost finding that people will throw good money after bad appears to be well described by prospect theory (D. Kahneman & A. Tversky, 1979, Econometrica, 47, 263-291). Only moderate support for the contention that personal involvement increases the sunk cost effect is presented. The sunk cost effect was not lessened by having taken prior courses in economics. Finally, the sunk cost effect cannot be fully subsumed under any of several social psychological theories. © 1985 Academic Press, Inc.

"To terminate a project in which $1.1 billion has been invested represents an unconscionable mishandling of taxpayers' dollars." Senator Denton, November 4, 1981

"Completing Tennessee-Tombigbee [Waterway Project] is not a waste of taxpayer dollars."
-
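The prospect-theory account that Arkes and Blumer invoke can be illustrated with a small numerical sketch. This is not code from the paper: the value-function parameters (alpha = beta = 0.88, lambda = 2.25) are Tversky and Kahneman's 1992 estimates, and the $100 scenario is a made-up example.

```python
# Illustration (not from the paper): why sunk costs encourage "throwing
# good money after bad" under the prospect-theory value function.
# Probability weighting is omitted for simplicity.

def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value of outcome x relative to the reference point."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** beta

# Having sunk $100, a decision maker can stop (a sure loss of 100) or
# continue with a 50/50 gamble: lose 200 in total, or break even.
v_stop = value(-100)
v_continue = 0.5 * value(-200) + 0.5 * value(0)

print(f"stop:     {v_stop:.1f}")
print(f"continue: {v_continue:.1f}")
# Both options have the same expected monetary loss (-100), but the
# convexity of the loss branch makes the gamble hurt less, so the
# decision maker keeps investing.
assert v_continue > v_stop
```

Because the value function is convex over losses, a gamble that might erase the loss is preferred to a sure loss of the same expected size, which is exactly the "good money after bad" pattern the abstract describes.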
Ambiguity Aversion in Qualitative Contexts: a Vignette Study
Ambiguity aversion in qualitative contexts: A vignette study. Joshua P. White, Melbourne School of Psychological Sciences, University of Melbourne; Andrew Perfors, Melbourne School of Psychological Sciences, University of Melbourne.

Abstract: Most studies of ambiguity aversion rely on experimental paradigms involving contrived monetary bets. Thus, the extent to which ambiguity aversion is evident outside of such contexts is largely unknown, particularly in those contexts which cannot easily be reduced to numerical terms. The present work seeks to understand whether ambiguity aversion occurs in a variety of different qualitative domains, such as work, family, love, friendship, exercise, study and health. We presented participants with 24 vignettes and measured the degree to which they preferred risk to ambiguity. In a separate study we asked participants for their prior probability estimates about the likely outcomes in the ambiguous events. Ambiguity aversion was observed in the vast majority of vignettes, but at different magnitudes. It was predicted by gain/loss direction but not by the prior probability estimates (with the interesting exception of the classic Ellsberg ‘urn’ scenario). Our results suggest that ambiguity aversion occurs in a wide variety of qualitative contexts, but to different degrees, and may not be generally driven by unfavourable prior probability estimates of ambiguous events. Corresponding Author: Joshua P. White.

Introduction: The world is replete with the unknown, yet people generally prefer some types of ‘unknown’ to others. Here, an important distinction exists between risk and uncertainty. As defined by Knight (1921), risk is a measurable lack of certainty that can be represented by numerical probabilities (e.g., “there is a 50% chance that it will rain tomorrow”), while ambiguity is an unmeasurable lack of certainty (e.g., “there is an unknown probability that it will rain tomorrow”).
-
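The ‘urn’ exception mentioned above refers to Ellsberg's classic setup, which a short sketch makes concrete (the urn sizes and the prize are hypothetical, not from the study): under a uniform prior over the unknown urn's composition, the ambiguous bet has exactly the same expected value as the risky one, so a strict preference for the risky urn cannot be explained by prior probability estimates alone.

```python
# Illustrative sketch (assumed setup): two urns of 100 balls each.
# Urn A ("risky") holds exactly 50 red and 50 black balls.
# Urn B ("ambiguous") holds an unknown mix; assume a uniform prior
# over every possible composition 0..100 red.
# A hypothetical $100 prize is paid for drawing red.

PRIZE = 100

ev_risky = (50 / 100) * PRIZE

# Expected value of betting on the ambiguous urn: average the win
# probability over all 101 equally likely compositions.
ev_ambiguous = sum((reds / 100) * PRIZE for reds in range(101)) / 101

print(ev_risky, ev_ambiguous)
# The two bets are worth the same under the prior, yet most people
# strictly prefer urn A: that gap is ambiguity aversion.
assert ev_risky == ev_ambiguous == 50.0
```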
A Task-Based Taxonomy of Cognitive Biases for Information Visualization
Evanthia Dimara, Steven Franconeri, Catherine Plaisant, Anastasia Bezerianos, and Pierre Dragicevic.

Visualization faces three kinds of limitations: the computer, the display, and the human. Human limitations are of two kinds: human vision has limitations (perceptual biases, e.g. magnitude estimation and color perception) and human reasoning has limitations (cognitive biases). Cognitive biases are behaviors in which humans consistently behave irrationally. Pohl's criteria, distilled:
• Are predictable and consistent
• People are unaware they're doing them
• Are not misunderstandings

Examples: Ambiguity effect, Anchoring or focalism, Anthropocentric thinking, Anthropomorphism or personification, Attentional bias, Attribute substitution, Automation bias, Availability heuristic, Availability cascade, Backfire effect, Bandwagon effect, Base rate fallacy or Base rate neglect, Belief bias, Ben Franklin effect, Berkson's paradox, Bias blind spot, Choice-supportive bias, Clustering illusion, Compassion fade, Confirmation bias, Congruence bias, Conjunction fallacy, Conservatism (belief revision), Continued influence effect, Contrast effect, Courtesy bias, Curse of knowledge, Declinism, Decoy effect, Default effect, Denomination effect, Disposition effect, Distinction bias, Dread aversion, Dunning–Kruger effect, Duration neglect, Empathy gap, End-of-history illusion, Endowment effect, Exaggerated expectation, Experimenter's or expectation bias,
-
MITIGATING COGNITIVE BIASES in RISK IDENTIFICATION: Practitioner Checklist for the AEROSPACE SECTOR
Debra L. Emmons, Thomas A. Mazzuchi, Shahram Sarkani, and Curtis E. Larsen.

This research contributes an operational checklist for mitigating cognitive biases in the aerospace sector risk management process. The Risk Identification and Evaluation Bias Reduction Checklist includes steps for grounding the risk identification and evaluation activities in past project experiences through historical data, and emphasizes the importance of incorporating multiple methods and perspectives to guard against optimism and a singular project instantiation-focused view. The authors developed a survey to elicit subject matter expert judgment on the value of the checklist to support its use in government and industry as a risk management tool. The survey also provided insights on bias mitigation strategies and lessons learned. This checklist addresses the deficiency in the literature in providing operational steps for the practitioner to recognize and implement strategies for bias reduction in risk management in the aerospace sector.

DOI: https://doi.org/10.22594/dau.16-770.25.01
Keywords: Risk Management, Optimism Bias, Planning Fallacy, Cognitive Bias Reduction

http://www.dau.mil, January 2018. This article and its accompanying research contribute an operational Risk Identification and Evaluation Bias Reduction Checklist for cognitive bias mitigation in risk management for the aerospace sector. The checklist described herein offers a practical and implementable project management framework to help reduce biases in the aerospace sector and redress the cognitive limitations in the risk identification and analysis process.

[Figure 1. Research Approach: Cognitive Biases & Bias Enabling Conditions]
-
Communication Science to the Public
David M. Berube, North Carolina State University.

NEW ANTI-INTELLECTUALISM: TRENDS

▪ HOW WE COMMUNICATE. In The Age of American Unreason, Jacoby posited that it trickled down from the top, fueled by faux-populist politicians striving to make themselves sound approachable rather than smart (Jacoby, 2008). EX: The average length of a sound bite by a presidential candidate in 1968 was 42.3 seconds. Two decades later, it was 9.8 seconds. Today, it's just a touch over seven seconds and well on its way to being supplanted by 140/280-character Twitter bursts.
▪ DATA FRAMING. When asked if they truly believe what scientists tell them, only 36 percent of respondents said yes. Just 12 percent expressed strong confidence in the press to accurately report scientific findings.
▪ ROLE OF THE PUBLIC. A study by two Princeton University researchers, Martin Gilens and Benjamin Page, released Fall 2014, tracked 1,800 U.S. policy changes between 1981 and 2002, and compared the outcome with the expressed preferences of median-income Americans, the affluent, business interests and powerful lobbies. They concluded that average citizens “have little or no independent influence” on policy in the U.S., while the rich and their hired mouthpieces routinely get their way. “The majority does not rule,” they wrote.
▪ Anti-intellectualism and suspicion (trends).
▪ Trump world – outsiders/insiders.
▪ Erasing/re-writing history – damnatio memoriae.
▪ False news.
▪ Infoxication (CC) and infobesity.
▪ Aggregators and managed reality.
▪ Affirmation and confirmation bias.
▪ Negotiating reality.
▪ New tribalism is mostly ideational not political.
▪ Unspoken – guns, birth control, sexual harassment, race…

“The amount of technical information is doubling every two years.”
-
Cognitive Biases in Software Engineering: a Systematic Mapping Study
Rahul Mohanani, Iflaah Salman, Burak Turhan, Member, IEEE, Pilar Rodriguez and Paul Ralph.

Abstract—One source of software project challenges and failures is the systematic errors introduced by human cognitive biases. Although extensively explored in cognitive psychology, investigations concerning cognitive biases have only recently gained popularity in software engineering research. This paper therefore systematically maps, aggregates and synthesizes the literature on cognitive biases in software engineering to generate a comprehensive body of knowledge, understand the state of the art, and provide guidelines for future research and practice. Focusing on bias antecedents, effects and mitigation techniques, we identified 65 articles (published between 1990 and 2016), which investigate 37 cognitive biases. Despite strong and increasing interest, the results reveal a scarcity of research on mitigation techniques and poor theoretical foundations in understanding and interpreting cognitive biases. Although bias-related research has generated many new insights in the software engineering community, specific bias mitigation techniques are still needed for software professionals to overcome the deleterious effects of cognitive biases on their work.

Index Terms—Antecedents of cognitive bias, cognitive bias, debiasing, effects of cognitive bias, software engineering, systematic mapping.

1 INTRODUCTION
Cognitive biases are systematic deviations from optimal reasoning [1], [2]. In other words, they are recurring errors in thinking, or patterns of bad judgment observable in different people and contexts. A well-known example is confirmation bias—the tendency to pay more attention […]

[…] knowledge. No analogous review of SE research exists. The purpose of this study is therefore as follows: Purpose: to review, summarize and synthesize the current state of software engineering research involving cognitive biases.
-
Mitigating Cognitive Biases in Risk Identification
https://ntrs.nasa.gov/search.jsp?R=20190025941

Mitigating Cognitive Biases in Risk Identification: Practitioner Checklist for the Aerospace Sector. Debra Emmons, Thomas A. Mazzuchi, Shahram Sarkani, and Curtis E. Larsen.

This research contributes an operational checklist for mitigating cognitive biases in the aerospace sector risk management process. The Risk Identification and Evaluation Bias Reduction Checklist includes steps for grounding the risk identification and evaluation activities in past project experiences through historical data, and the importance of incorporating multiple methods and perspectives to guard against optimism and a singular project instantiation-focused view. The authors developed a survey to elicit subject matter expert (SME) judgment on the value of the checklist to support its use in government and industry as a risk management tool. The survey also provided insights on bias mitigation strategies and lessons learned. This checklist addresses the deficiency in the literature in providing operational steps for the practitioner for bias reduction in risk management in the aerospace sector.

Two-sentence summary: Cognitive biases such as optimism, planning fallacy, anchoring and ambiguity effect influence the risk identification and analysis processes used in the aerospace sector at the Department of Defense, other government organizations, and industry. This article incorporates practical experience through subject matter expert survey feedback into an academically grounded operational checklist and offers strategies for the project manager and risk management practitioner to reduce these pervasive biases.
Keywords: risk management; optimism bias; planning fallacy; cognitive bias reduction

The research began with the review of the literature, which covered the areas of risk management, cognitive biases and bias enabling conditions.
-
50 Cognitive and Affective Biases in Medicine (Alphabetically)
Pat Croskerry MD, PhD, FRCP(Edin), Critical Thinking Program, Dalhousie University.

Aggregate bias: when physicians believe that aggregated data, such as those used to develop clinical practice guidelines, do not apply to individual patients (especially their own), they are exhibiting the aggregate fallacy. The belief that their patients are atypical or somehow exceptional may lead to errors of commission, e.g. ordering x-rays or other tests when guidelines indicate none are required.

Ambiguity effect: there is often an irreducible uncertainty in medicine and ambiguity is associated with uncertainty. The ambiguity effect is due to decision makers avoiding options when the probability is unknown. In considering options on a differential diagnosis, for example, this would be illustrated by a tendency to select options for which the probability of a particular outcome is known over an option for which the probability is unknown. The probability may be unknown because of lack of knowledge or because the means to obtain the probability (a specific test, or imaging) is unavailable. The cognitive miser function (choosing an option that requires less cognitive effort) may also be at play here.

Anchoring: the tendency to perceptually lock on to salient features in the patient’s initial presentation too early in the diagnostic process, and failure to adjust this initial impression in the light of later information. This bias may be severely compounded by the confirmation bias.

Ascertainment bias: when a physician’s thinking is shaped by prior expectation; stereotyping and gender bias are both good examples.

Attentional bias: the tendency to believe there is a relationship between two variables when instances are found of both being present.
-
Decision Making Under Sunk Cost Gregory Kenneth Laing University of Wollongong
University of Wollongong Research Online, University of Wollongong Thesis Collection, 2002.

Recommended Citation: Laing, Gregory Kenneth, Decision making under sunk cost, Doctor of Philosophy thesis, School of Accounting and Finance, University of Wollongong, 2002. http://ro.uow.edu.au/theses/1909

Research Online is the open access institutional repository for the University of Wollongong. For further information contact Manager Repository Services: [email protected].

DECISION MAKING UNDER SUNK COST. A Thesis submitted in fulfilment of the requirements for the award of the degree DOCTOR OF PHILOSOPHY from UNIVERSITY OF WOLLONGONG by GREGORY KENNETH LAING, BBus(Acc), MCom(Hons), Grad Cert Ed(Higher Ed). VOLUME 1 OF 2, SCHOOL OF ACCOUNTING & FINANCE, 2002.

CERTIFICATION: I, Gregory Kenneth Laing, declare that this thesis, submitted in fulfilment of the requirements for the award of Doctor of Philosophy, in the School of Accounting and Finance, University of Wollongong, is wholly my own work unless otherwise referenced or acknowledged. The document has not been submitted for qualifications at any other academic institution. Gregory Kenneth Laing, 19th October 2002.

ABSTRACT: This dissertation investigates whether trained professionals can be adversely influenced by sunk cost information when making financial decisions. The rational decision making model posits that sunk costs are irrelevant to choices between alternative courses of action. However, a review of the literature reveals that, contrary to the assumptions of normative theory, people do not always act in a rational manner when making financial decisions.
-
Embrace Your Cognitive Bias
http://blog.beaufortes.com/2007/06/embrace-your-co.html

Cognitive biases are distortions in the way humans see things in comparison to the purely logical way that mathematics, economics, and yes, even project management would have us look at things. The problem is not that we have them: most of them are wired deep into our brains following millions of years of evolution. The problem is that we don’t know about them, and consequently don’t take them into account when we have to make important decisions. (This area is so important that Daniel Kahneman won a Nobel Prize in 2002 for work tying non-rational decision making, and cognitive bias, to mainstream economics.) People don’t behave rationally: they have emotions, they can be inspired, they have cognitive bias! Tying that into how we run projects (project leadership as a complement to project management) can produce results you wouldn’t believe. You have to know about them to guard against them, or use them (but that’s another article)... So let’s get more specific. After the jump, let me show you a great list of cognitive biases. I’ll bet that there are at least a few that you haven’t heard of before!

Decision making and behavioral biases
Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink, herd behaviour, and manias.
Bias blind spot — the tendency not to compensate for one’s own cognitive biases.
Choice-supportive bias — the tendency to remember one’s choices as better than they actually were.
-
The Sunk-Cost Fallacy in Penny Auctions
Ned Augenblick, July 2015.

Abstract: This paper theoretically and empirically analyzes behavior in penny auctions, a relatively new auction mechanism. As in the dollar auction or war-of-attrition, players in penny auctions commit higher non-refundable costs as the auction continues and only win if all other players stop bidding. I first show that, in any equilibrium that does not end immediately, players bid probabilistically such that the expected profit from every bid is zero. Then, using two large datasets covering 166,000 auctions, I calculate that average profit margins actually exceed 50%. To explain this deviation, I incorporate a sunk cost fallacy into the theoretical model to generate a set of predictions about hazard rates and player behavior, which I confirm empirically. While players do (slowly) learn to correct this bias and there are few obvious barriers to competition, activity in the market is rising and concentration remains relatively high.

Keywords: Internet Auctions, Market Design, Sunk Costs. JEL Classification Numbers: D44, D03, D22. Address: Haas School of Business, 545 Student Services #1900, Berkeley, CA 94720-1900, [email protected]. This paper was previously circulated with the title "Consumer and Producer Behavior in the Market for Penny Auctions: A Theoretical and Empirical Analysis". The author is grateful to Doug Bernheim, Jon Levin, and Muriel Niederle for advice and suggestions, and Oren Ahoobim, Aaron Bodoh-Creed, Tomas Buser, Jesse Cunha, Ernesto Dal Bo, Stefano DellaVigna, Jakub Kastl, Fuhito Kojima, Carlos Lever, David Levine, Scott Nicholson, Monika Piazzesi, Matthew Rabin, and Annika Todd for helpful comments.
-
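The zero-expected-profit bidding logic in the abstract can be sketched numerically. The parameters below (item value, bid cost, price increment) are illustrative assumptions, not the paper's data: a risk-neutral bidder is willing to place a bid only when the probability that the auction ends times the net prize exactly covers the non-refundable bid cost.

```python
# Numerical sketch (assumed parameters) of the zero-expected-profit
# condition in a penny auction. Each bid costs BID_COST and raises the
# standing price by INCREMENT; the last bidder wins an item worth VALUE
# at the standing price.

VALUE = 50.0      # item value (assumption)
BID_COST = 0.75   # non-refundable cost of each bid (assumption)
INCREMENT = 0.01  # price increase per bid (assumption)

def end_probability(num_bids: int) -> float:
    """Probability that the auction ends after bid `num_bids`, chosen so
    the marginal bidder's expected profit is exactly zero:
    p_end * (VALUE - price) - BID_COST = 0."""
    price = num_bids * INCREMENT
    return BID_COST / (VALUE - price)

# By construction, the bidder placing bid k expects zero profit.
for k in (1, 100, 1000):
    price = k * INCREMENT
    profit = end_probability(k) * (VALUE - price) - BID_COST
    assert abs(profit) < 1e-12

# The implied ending hazard rises as the price approaches the item value.
print(end_probability(1), end_probability(4000))
```

Under this zero-profit condition every bid is a fair gamble, so the paper's observed profit margins above 50% require something beyond the baseline model, which is where the sunk cost fallacy enters.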
A Parametric Analysis of the Sunk Cost Effect
Western Michigan University, ScholarWorks at WMU, Dissertations, Graduate College, 4-2020.

Amanda F. Devoto, Western Michigan University, [email protected]

Follow this and additional works at: https://scholarworks.wmich.edu/dissertations. Part of the Applied Behavior Analysis Commons.

Recommended Citation: Devoto, Amanda F., "A Parametric Analysis of the Sunk Cost Effect" (2020). Dissertations. 3618. https://scholarworks.wmich.edu/dissertations/3618

This Dissertation-Open Access is brought to you for free and open access by the Graduate College at ScholarWorks at WMU. It has been accepted for inclusion in Dissertations by an authorized administrator of ScholarWorks at WMU. For more information, please contact [email protected].

A PARAMETRIC ANALYSIS OF THE SUNK COST EFFECT by Amanda F. Devoto. A dissertation submitted to the Graduate College in partial fulfillment of the requirements for the degree of Doctor of Philosophy, Psychology, Western Michigan University, April 2020. Doctoral Committee: Anthony DeFulio, Ph.D., Chair; Bradley Huitema, Ph.D.; Cynthia Pietras, Ph.D.; Wayne Fuqua, Ph.D.; Tracy Zinn, Ph.D. Copyright by Amanda F. Devoto 2020.

ABSTRACT: Sunk costs are previous investments of time, effort, or money toward a goal that cannot be recovered. People often honor sunk costs by continuing to pursue a goal, despite the availability of an alternative path that would pay off faster, a phenomenon called the sunk cost effect. Prior research has identified variables that influence the sunk cost effect. One variable found in hypothetical scenario-based research and in behavior-based research (Pattison et al., 2011) has been percent of goal completed.