Tversky & Kahneman, 1983


Psychological Review, Volume 90, Number 4, October 1983

Extensional Versus Intuitive Reasoning: The Conjunction Fallacy in Probability Judgment

Amos Tversky (Stanford University) and Daniel Kahneman (University of British Columbia, Vancouver, British Columbia, Canada)

Perhaps the simplest and the most basic qualitative law of probability is the conjunction rule: The probability of a conjunction, P(A&B), cannot exceed the probabilities of its constituents, P(A) and P(B), because the extension (or the possibility set) of the conjunction is included in the extension of its constituents. Judgments under uncertainty, however, are often mediated by intuitive heuristics that are not bound by the conjunction rule. A conjunction can be more representative than one of its constituents, and instances of a specific category can be easier to imagine or to retrieve than instances of a more inclusive category. The representativeness and availability heuristics therefore can make a conjunction appear more probable than one of its constituents. This phenomenon is demonstrated in a variety of contexts including estimation of word frequency, personality judgment, medical prognosis, decision under risk, suspicion of criminal acts, and political forecasting. Systematic violations of the conjunction rule are observed in judgments of lay people and of experts in both between-subjects and within-subjects comparisons. Alternative interpretations of the conjunction fallacy are discussed and attempts to combat it are explored.

This research was supported by Grant NR197-058 from the U.S. Office of Naval Research. We are grateful to friends and colleagues, too numerous to list by name, for their useful comments and suggestions on an earlier draft. Requests for reprints should be sent to Amos Tversky, Department of Psychology, Jordan Hall, Building 420, Stanford University, Stanford, California 94305.

Uncertainty is an unavoidable aspect of the human condition. Many significant choices must be based on beliefs about the likelihood of such uncertain events as the guilt of a defendant, the result of an election, the future value of the dollar, the outcome of a medical operation, or the response of a friend. Because we normally do not have adequate formal models for computing the probabilities of such events, intuitive judgment is often the only practical method for assessing uncertainty.

The question of how lay people and experts evaluate the probabilities of uncertain events has attracted considerable research interest in the last decade (see, e.g., Einhorn & Hogarth, 1981; Kahneman, Slovic, & Tversky, 1982; Nisbett & Ross, 1980). Much of this research has compared intuitive inferences and probability judgments to the rules of statistics and the laws of probability. The student of judgment uses the probability calculus as a standard of comparison much as a student of perception might compare the perceived sizes of objects to their physical sizes. Unlike the correct size of objects, however, the "correct" probability of events is not easily defined. Because individuals who have different knowledge or who hold different beliefs must be allowed to assign different probabilities to the same event, no single value can be correct for all people. Furthermore, a correct probability cannot always be determined even for a single person. Outside the domain of random sampling, probability theory does not determine the probabilities of uncertain events—it merely imposes constraints on the relations among them. For example, if A is more probable than B, then the complement of A must be less probable than the complement of B.

The laws of probability derive from extensional considerations. A probability measure is defined on a family of events and each event is construed as a set of possibilities, such as the three ways of getting a 10 on a throw of a pair of dice. The probability of an event equals the sum of the probabilities of its disjoint outcomes. Probability theory has traditionally been used to analyze repetitive chance processes, but the theory has also been applied to essentially unique events where probability is not reducible to the relative frequency of "favorable" outcomes. The probability that the man who sits next to you on the plane is unmarried equals the probability that he is a bachelor plus the probability that he is either divorced or widowed. Additivity applies even when probability does not have a frequentistic interpretation and when the elementary events are not equiprobable.

The simplest and most fundamental qualitative law of probability is the extension rule: If the extension of A includes the extension of B (i.e., A ⊃ B), then P(A) ≥ P(B). Because the set of possibilities associated with a conjunction A&B is included in the set of possibilities associated with B, the same principle can also be expressed by the conjunction rule P(A&B) ≤ P(B): A conjunction cannot be more probable than one of its constituents. This rule holds regardless of whether A and B are independent and is valid for any probability assignment on the same sample space. Furthermore, it applies not only to the standard probability calculus but also to nonstandard models such as upper and lower probability (Dempster, 1967; Suppes, 1975), belief function (Shafer, 1976), Baconian probability (Cohen, 1977), rational belief (Kyburg, in press), and possibility theory (Zadeh, 1978).

In contrast to formal theories of belief, intuitive judgments of probability are generally not extensional. People do not normally analyze daily events into exhaustive lists of possibilities or evaluate compound probabilities by aggregating elementary ones. Instead, they commonly use a limited number of heuristics, such as representativeness and availability (Kahneman et al., 1982). Our conception of judgmental heuristics is based on natural assessments that are routinely carried out as part of the perception of events and the comprehension of messages. Such natural assessments include computations of similarity and representativeness, attributions of causality, and evaluations of the availability of associations and exemplars. These assessments, we propose, are performed even in the absence of a specific task set, although their results are used to meet task demands as they arise. For example, the mere mention of "horror movies" activates instances of horror movies and evokes an assessment of their availability. Similarly, the statement that Woody Allen's aunt had hoped that he would be a dentist elicits a comparison of the character to the stereotype and an assessment of representativeness. It is presumably the mismatch between Woody Allen's personality and our stereotype of a dentist that makes the thought mildly amusing. Although these assessments are not tied to the estimation of frequency or probability, they are likely to play a dominant role when such judgments are required. The availability of horror movies may be used to answer the question, "What proportion of the movies produced last year were horror movies?", and representativeness may control the judgment that a particular boy is more likely to be an actor than a dentist.

The term judgmental heuristic refers to a strategy—whether deliberate or not—that relies on a natural assessment to produce an estimation or a prediction. One of the manifestations of a heuristic is the relative neglect of other considerations. For example, the resemblance of a child to various professional stereotypes may be given too much weight in predicting future vocational choice, at the expense of other pertinent data such as the base-rate frequencies of occupations. Hence, the use of judgmental heuristics gives rise to predictable biases. Natural assessments can affect judgments in other ways, for which the term heuristic is less apt. First, people sometimes misinterpret their task and fail to distinguish the required judgment from the natural assessment that the problem evokes. Second, the natural assessment may act as an anchor to which the required judgment is assimilated, even when the judge does not intend to use the one to estimate the other.

Previous discussions of errors of judgment have focused on deliberate strategies and on misinterpretations of tasks. The present treatment calls special attention to the processes of anchoring and assimilation, which are often neither deliberate nor conscious. An example from perception may be instructive: If two objects in a picture of a three-dimensional scene have the same picture size, the one that appears more distant is not only seen as "really" larger but also as larger in the picture. The natural computation of real size evidently influences the (less natural) judgment of picture size, although observers are unlikely to confuse the two values or to use the former to estimate the latter.

The natural assessments of representativeness and availability do not conform to the extensional logic of probability theory. In particular, a conjunction can be more representative than one of its constituents, and instances of a specific category can be easier to imagine or to retrieve than instances of a more inclusive category. […] a comparison of words of the form _ _ _ _ l y with words of the form _ _ _ _ _ l _; the median estimates were 8.8 and 4.4, respectively. This example illustrates the structure of the studies reported in this article. We constructed problems in which a reduction of extension was associated with an increase in availability or representativeness, and we tested the conjunction rule in judgments of frequency or probability. In the next section we discuss the representativeness heuristic and contrast it with the conjunction rule in the context of person perception. The third section describes conjunction fallacies in medical prognoses, sports forecasting, and choice among bets. In the fourth section we investigate probability judgments for conjunctions of causes and effects and describe conjunction errors in scenarios of future events.
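The extension and conjunction rules above are easy to verify mechanically on any finite sample space, and contrasting them with a similarity-based score shows how a representativeness-style judgment can escape them. The Python sketch below is an editorial illustration, not material from the article: the population counts, the attribute names, and the stereotype weights are invented, and the scoring function only mimics the idea of matching a description to a stereotype.

```python
from fractions import Fraction

# Toy population of 100 people described by two binary attributes.
# The counts are invented for illustration; they are not data from the article.
population = (
    [{"bank_teller": True,  "feminist": True}]  * 5
    + [{"bank_teller": True,  "feminist": False}] * 15
    + [{"bank_teller": False, "feminist": True}]  * 40
    + [{"bank_teller": False, "feminist": False}] * 40
)

def prob(event):
    """Extensional probability: the fraction of the population inside the event's extension."""
    hits = sum(1 for person in population if event(person))
    return Fraction(hits, len(population))

p_teller = prob(lambda p: p["bank_teller"])
p_teller_and_feminist = prob(lambda p: p["bank_teller"] and p["feminist"])

# The conjunction rule P(A&B) <= P(B) holds for every such assignment, because
# the conjunction's extension is a subset of each constituent's extension.
assert p_teller_and_feminist <= p_teller
print("P(teller)            =", p_teller)               # 1/5
print("P(teller & feminist) =", p_teller_and_feminist)  # 1/20

# A representativeness-style score, by contrast, compares attributes to a
# stereotype instead of counting possibilities, so nothing forces it to
# respect the conjunction rule.  The weights below are arbitrary.
stereotype_fit = {"bank_teller": 0.1, "feminist": 0.9}

def representativeness(attributes):
    return sum(stereotype_fit[a] for a in attributes) / len(attributes)

print(representativeness(["bank_teller"]))              # 0.1
print(representativeness(["bank_teller", "feminist"]))  # 0.5, ranked above 0.1
```

Running the sketch prints P(teller & feminist) = 1/20 against P(teller) = 1/5, so the extensional ranking respects the conjunction rule, while the similarity score ranks the conjunction above its constituent, which is the pattern the article calls the conjunction fallacy.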
Recommended publications
  • The 'Conjunction Fallacy' Revisited: How Intelligent Inferences Look Like Reasoning Errors
    Journal of Behavioral Decision Making, 12: 275–305 (1999). The 'Conjunction Fallacy' Revisited: How Intelligent Inferences Look Like Reasoning Errors. RALPH HERTWIG* and GERD GIGERENZER, Max Planck Institute for Human Development, Berlin, Germany. ABSTRACT Findings in recent research on the 'conjunction fallacy' have been taken as evidence that our minds are not designed to work by the rules of probability. This conclusion springs from the idea that norms should be content-blind – in the present case, the assumption that sound reasoning requires following the conjunction rule of probability theory. But content-blind norms overlook some of the intelligent ways in which humans deal with uncertainty, for instance, when drawing semantic and pragmatic inferences. In a series of studies, we first show that people infer nonmathematical meanings of the polysemous term 'probability' in the classic Linda conjunction problem. We then demonstrate that one can design contexts in which people infer mathematical meanings of the term and are therefore more likely to conform to the conjunction rule. Finally, we report evidence that the term 'frequency' narrows the spectrum of possible interpretations of 'probability' down to its mathematical meanings, and that this fact – rather than the presence or absence of 'extensional cues' – accounts for the low proportion of violations of the conjunction rule when people are asked for frequency judgments. We conclude that a failure to recognize the human capacity for semantic and pragmatic inference can lead rational responses to be misclassified as fallacies. Copyright © 1999 John Wiley & Sons, Ltd. KEY WORDS: conjunction fallacy; probabilistic thinking; frequentistic thinking; probability. People's apparent failures to reason probabilistically in experimental contexts have raised serious concerns about our ability to reason rationally in real-world environments.
  • A Task-Based Taxonomy of Cognitive Biases for Information Visualization
    A Task-based Taxonomy of Cognitive Biases for Information Visualization. Evanthia Dimara, Steven Franconeri, Catherine Plaisant, Anastasia Bezerianos, and Pierre Dragicevic. Slide excerpt: three kinds of limitations are considered: the computer, the display, and the human. Human vision has limitations (perceptual bias, e.g., magnitude estimation and color perception), and human reasoning has limitations (cognitive bias: behaviors in which humans consistently behave irrationally). Pohl's criteria distilled: cognitive biases are predictable and consistent, people are unaware they are doing them, and they are not misunderstandings. Examples include: Ambiguity effect, Anchoring or focalism, Anthropocentric thinking, Anthropomorphism or personification, Attentional bias, Attribute substitution, Automation bias, Availability heuristic, Availability cascade, Backfire effect, Bandwagon effect, Base rate fallacy or Base rate neglect, Belief bias, Ben Franklin effect, Berkson's paradox, Bias blind spot, Choice-supportive bias, Clustering illusion, Compassion fade, Confirmation bias, Congruence bias, Conjunction fallacy, Conservatism (belief revision), Continued influence effect, Contrast effect, Courtesy bias, Curse of knowledge, Declinism, Decoy effect, Default effect, Denomination effect, Disposition effect, Distinction bias, Dread aversion, Dunning–Kruger effect, Duration neglect, Empathy gap, End-of-history illusion, Endowment effect, Exaggerated expectation, Experimenter's or expectation bias, …
  • Avoiding the Conjunction Fallacy: Who Can Take a Hint?
    Avoiding the conjunction fallacy: Who can take a hint? Simon Klein Spring 2017 Master’s thesis, 30 ECTS Master’s programme in Cognitive Science, 120 ECTS Supervisor: Linnea Karlsson Wirebring Acknowledgments: The author would like to thank the participants for enduring the test session with challenging questions and thereby making the study possible, his supervisor Linnea Karlsson Wirebring for invaluable guidance and good questions during the thesis work, and his fiancée Amanda Arnö for much needed mental support during the entire process. 2 AVOIDING THE CONJUNCTION FALLACY: WHO CAN TAKE A HINT? Simon Klein Humans repeatedly commit the so called “conjunction fallacy”, erroneously judging the probability of two events occurring together as higher than the probability of one of the events. Certain hints have been shown to mitigate this tendency. The present thesis investigated the relations between three psychological factors and performance on conjunction tasks after reading such a hint. The factors represent the understanding of probability and statistics (statistical numeracy), the ability to resist intuitive but incorrect conclusions (cognitive reflection), and the willingness to engage in, and enjoyment of, analytical thinking (need-for-cognition). Participants (n = 50) answered 30 short conjunction tasks and three psychological scales. A bimodal response distribution motivated dichotomization of performance scores. Need-for-cognition was significantly, positively correlated with performance, while numeracy and cognitive reflection were not. The results suggest that the willingness to engage in, and enjoyment of, analytical thinking plays an important role for the capacity to avoid the conjunction fallacy after taking a hint. The hint further seems to neutralize differences in performance otherwise predicted by statistical numeracy and cognitive reflection.
  • The Availability Heuristic
    CHAPTER 11 THE AVAILABILITY HEURISTIC According to Amos Tversky and Daniel Kahneman (1974, p. 1127), the availability heuristic is a rule of thumb in which decision makers "assess the frequency of a class or the probability of an event by the ease with which instances or occurrences can be brought to mind." Usually this heuristic works quite well; all things being equal, common events are easier to remember or imagine than are uncommon events. By relying on availability to estimate frequency and probability, decision makers are able to simplify what might otherwise be very difficult judgments. As with any heuristic, however, there are cases in which the general rule of thumb breaks down and leads to systematic biases. Some events are more available than others not because they tend to occur frequently or with high probability, but because they are inherently easier to think about, because they have taken place recently, because they are highly emotional, and so forth. This chapter examines three general questions: (1) What are instances in which the availability heuristic leads to biased judgments? (2) Do decision makers perceive an event as more likely after they have imagined it happening? (3) How is vivid information different from other information? AVAILABILITY GOES AWRY Which is a more likely cause of death in the United States: being killed by falling airplane parts or by a shark? Most people rate shark attacks as more probable than death from falling airplane parts (see Item #7 of the Reader Survey for your answer). Shark attacks certainly receive more publicity than do deaths from falling airplane parts, and they are far easier to imagine (thanks in part to movies such as Jaws).
  • Graphical Techniques in Debiasing: an Exploratory Study
    GRAPHICAL TECHNIQUES IN DEBIASING: AN EXPLORATORY STUDY by S. Bhasker, Information Systems Department, Leonard N. Stern School of Business, New York University, New York, New York 10006 and A. Kumaraswamy, Management Department, Leonard N. Stern School of Business, New York University, New York, NY 10006. October, 1990. Center for Research on Information Systems, Information Systems Department, Leonard N. Stern School of Business, New York University. Working Paper Series STERN IS-90-19. Forthcoming in the Proceedings of the 1991 Hawaii International Conference on System Sciences. Abstract: Base rate and conjunction fallacies are consistent biases that influence decision making involving probability judgments. We develop simple graphical techniques and test their efficacy in correcting for these biases. Preliminary results suggest that graphical techniques help to overcome these biases and improve decision making. We examine the implications of incorporating these simple techniques in Executive Information Systems. Introduction: Today, senior executives operate in highly uncertain environments. They have to collect, process and analyze a deluge of information - most of it ambiguous. But, their limited information acquiring and processing capabilities constrain them in this task [25]. Increasingly, executives rely on executive information/support systems for various purposes like strategic scanning of their business environments, internal monitoring of their businesses, analysis of data available from various internal and external sources, and communications [5,19,32]. However, executive information systems are, at best, support tools. Executives still rely on their mental or cognitive models of their businesses and environments and develop heuristics to simplify decision problems [10,16,25].
  • Nudge Theory
    Nudge Theory http://www.businessballs.com/nudge-theory.htm Nudge theory is a flexible and modern concept for: • understanding of how people think, make decisions, and behave, • helping people improve their thinking and decisions, • managing change of all sorts, and • identifying and modifying existing unhelpful influences on people. Nudge theory was named and popularized by the 2008 book, 'Nudge: Improving Decisions About Health, Wealth, and Happiness', written by American academics Richard H Thaler and Cass R Sunstein. The book is based strongly on the Nobel prize-winning work of the Israeli-American psychologists Daniel Kahneman and Amos Tversky. This article: • reviews and explains Thaler and Sunstein's 'Nudge' concept, especially 'heuristics' (tendencies for humans to think and decide instinctively and often mistakenly) • relates 'Nudge' methods to other theories and models, and to Kahneman and Tversky's work • defines and describes additional methods of 'nudging' people and groups • extends the appreciation and application of 'Nudge' methodology to broader change-management, motivation, leadership, coaching, counselling, parenting, etc • offers 'Nudge' methods and related concepts as a 'Nudge' theory 'toolkit' so that the concept can be taught and applied in a wide range of situations involving relationships with people, and enabling people to improve their thinking and decision-making • and offers a glossary of Nudge theory and related terms. 'Nudge' theory was proposed originally in US 'behavioral economics', but it can be adapted and applied much more widely for enabling and encouraging change in people, groups, or yourself. Nudge theory can also be used to explore, understand, and explain existing influences on how people behave, especially influences which are unhelpful, with a view to removing or altering them.
  • The Clinical and Ethical Implications of Cognitive Biases
    THE CLINICAL AND ETHICAL IMPLICATIONS OF COGNITIVE BIASES. Brendan Leier, PhD, Clinical Ethicist, University of Alberta Hospital/Stollery Children's Hospital & Mazankowski Heart Institute, and Clinical Assistant Professor, Faculty of Medicine and Dentistry & John Dossetor Health Ethics Centre. Slide excerpt, "Some games": Wason selection task: four cards show E, 4, 7, and K; choose two cards to turn over in order to test the following hypothesis: if the card has a vowel on one side, it must have an even number on the other side. Most subjects make the error of choosing E & 4; traditionally subjects fail to select the cards E & 7, the pair that can both correctly confirm and falsify the hypothesis. The Monty Hall problem: Monty asks you to choose between three boxes; one box contains a valuable prize, the other two boxes do not. After you choose Box A, Monty reveals Box C as empty, and then asks you if you would like to switch your choice. Of the remaining two, Box A and Box B, do you switch? The slides' answer: yes, you should switch from A to B. Last one: a bat and a ball cost $1.10 in total. The bat costs 1 dollar more than the ball. How much does the ball cost? Reasoning and rationality: a sub-field of epistemology that looks for normative guidance in acquiring/establishing claims to knowledge or systems of inquiry. (A short simulation of the Monty Hall switch and the bat-and-ball arithmetic follows below.)
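The deck states the Monty Hall answer (switch) without the underlying probabilities, and leaves the bat-and-ball item as a question. The following Monte Carlo check is offered as an illustrative sketch rather than anything taken from the slides; it shows that switching wins about two thirds of the time, staying about one third, and that the ball costs $0.05.

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """One round: prize behind a random box; player picks box 0; host opens an
    empty, unpicked box; player either stays or switches. Returns True on a win."""
    prize = random.randrange(3)
    pick = 0
    # Host opens a box that is neither the player's pick nor the prize.
    opened = next(b for b in range(3) if b != pick and b != prize)
    if switch:
        pick = next(b for b in range(3) if b != pick and b != opened)
    return pick == prize

trials = 100_000
stay_wins = sum(monty_hall_trial(switch=False) for _ in range(trials))
switch_wins = sum(monty_hall_trial(switch=True) for _ in range(trials))
print(f"stay   wins ~ {stay_wins / trials:.3f}")    # about 0.333
print(f"switch wins ~ {switch_wins / trials:.3f}")  # about 0.667

# Bat-and-ball: ball + (ball + 1.00) = 1.10, so the ball costs $0.05, not $0.10.
ball = (1.10 - 1.00) / 2
print(f"ball costs ${ball:.2f}")
```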
  • CHALK TALK Thinking About Thinking: Medical Decision Making Under the Microscope Christiana Iyasere, MD, and Douglas Wright, MD, PhD
    SGIM FORUM 2011; 34(11). CHALK TALK: Thinking about Thinking: Medical Decision Making Under the Microscope. Christiana Iyasere, MD, and Douglas Wright, MD, PhD. Drs. Iyasere and Wright are faculty in the Inpatient Clinician Educator Service of the Department of Medicine at Massachusetts General Hospital in Boston, MA. Case: A 36-year-old African-American woman, healthy except for treated hypothyroidism, visits you in clinic complaining of six months of fatigue and progressive shortness of breath with exertion. You thoroughly interview and examine the patient. Physical examination reveals conjunctival pallor and dullness to percussion one third of the way up both lung fields. Something tells you to ask her […] athletic—he graduated from college with a degree in physics and has completed several triathlons. Neil is a veteran of the US Navy, where he served as fleet naval aviator and landing signal officer. Is Neil more likely to be: a) a librarian or b) an astronaut? Question 2: Jot down a list of English words that begin with the letter "r" (e.g. rooster). Next, jot down a list of words that have an r in the […] than 10 seconds and says "20,160" before moving breezily along with her coffee. Having Katrina's input, are you tempted to change your answer to questions 3a and 3b? Go ahead, admit it. Aren't you now more confident that the correct answer is that the product is closest to 20,000? Question 5: You have known your medical school roommate Justice for four years.
  • A Statistical Guide for the Ethically Perplexed
    A Statistical Guide for the Ethically Perplexed. Lawrence Hubert and Howard Wainer. Preamble: The meaning of "ethical" adopted here is one of being in accordance with the accepted rules or standards for right conduct that govern the practice of some profession. The professions we have in mind are statistics and the behavioral sciences, and the standards for ethical practice are what we try to instill in our students through the methodology courses we offer, with particular emphasis on the graduate statistics sequence generally required for all the behavioral sciences. Our hope is that the principal general education payoff for competent statistics instruction is an increase in people's ability to be critical and ethical consumers and producers of the statistical reasoning and analyses faced in various applied contexts over the course of their careers. Introduction: Generations of graduate students in the behavioral and social sciences have completed mandatory year-long course sequences in statistics, sometimes with difficulty and possibly with less than positive regard for the content and how it was taught. Prior to the 1960's, such a sequence usually emphasized a cookbook approach where formulas were applied unthinkingly using mechanically operated calculators. The instructional method could be best characterized as "plug and chug," where there was no need to worry about the meaning of what one was doing, only that the numbers could be put in and an answer generated. This process would hopefully lead to numbers that could then be looked up in tables; in turn, p-values were sought that were less than the magical .05, giving some hope of getting an attendant paper published.
  • Probability, Confirmation, and the Conjunction Fallacy
    Probability, Confirmation, and the Conjunction Fallacy. Vincenzo Crupi*, Branden Fitelson†, Katya Tentori‡. April, 2007. * Department of Cognitive Sciences and Education and CIMeC (University of Trento) and Cognitive Psychology Laboratory, CNRS (University of Aix-Marseille I). Contact: [email protected]. † Department of Philosophy (University of California, Berkeley). Contact: [email protected]. ‡ Department of Cognitive Sciences and Education and CIMeC (University of Trento). Contact: [email protected]. Abstract. The conjunction fallacy has been a key topic in debates on the rationality of human reasoning and its limitations. Despite extensive inquiry, however, the attempt of providing a satisfactory account of the phenomenon has proven challenging. Here, we elaborate the suggestion (first discussed by Sides et al., 2001) that in standard conjunction problems the fallacious probability judgments experimentally observed are typically guided by sound assessments of confirmation relations, meant in terms of contemporary Bayesian confirmation theory. Our main formal result is a confirmation-theoretic account of the conjunction fallacy which is proven robust (i.e., not depending on various alternative ways of measuring degrees of confirmation). The proposed analysis is shown distinct from contentions that the conjunction effect is in fact not a fallacy and is compared with major competing explanations of the phenomenon, including earlier references to a confirmation-theoretic account. Acknowledgements. Research supported by PRIN 2005 grant Le dinamiche della conoscenza nella società dell'informazione and by a grant from the SMC/Fondazione Cassa di Risparmio di Trento e Rovereto. Versions of this work have been presented at the Cognitive Psychology Laboratory, CNRS – University of Aix-Marseille I, at the workshop Bayesianism, fundamentally, Center for Philosophy of Science, University of Pittsburgh and at the conference Confirmation, induction and science, London School of Economics. (A toy numerical illustration of the confirmation-theoretic reading follows below.)
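The abstract's central claim, that fallacious conjunction judgments can track sound assessments of confirmation, can be made concrete with a toy Bayesian model. The sketch below uses invented priors and likelihoods (the variable names and numbers are mine, not the authors'): given the description E, the conjunction stays less probable than "bank teller" alone, yet on the simple difference measure of confirmation, d(H, E) = P(H|E) - P(H), E supports the conjunction while undermining the lone constituent.

```python
from itertools import product

# Hypothetical joint model for a Linda-style problem.  The priors and
# likelihoods are invented for illustration and are not taken from the paper:
# B = "bank teller", F = "active feminist", E = the personality description.
p_B = 0.2                                   # prior probability of B
p_F_given_B, p_F_given_notB = 0.1, 0.3      # feminism is rarer among tellers here
p_E_given_F, p_E_given_notF = 0.8, 0.1      # the description fits feminists far better

def p_joint(b: bool, f: bool) -> float:
    """Prior probability of the cell (B=b, F=f)."""
    pb = p_B if b else 1 - p_B
    pf = p_F_given_B if b else p_F_given_notB
    return pb * (pf if f else 1 - pf)

def p_E_and(event) -> float:
    """P(E and event), marginalizing over the four (B, F) cells."""
    return sum(p_joint(b, f) * (p_E_given_F if f else p_E_given_notF)
               for b, f in product([True, False], repeat=2) if event(b, f))

def prior(event) -> float:
    return sum(p_joint(b, f) for b, f in product([True, False], repeat=2) if event(b, f))

p_E = p_E_and(lambda b, f: True)

def posterior(event) -> float:
    return p_E_and(event) / p_E

def bank_teller(b, f):
    return b

def teller_and_feminist(b, f):
    return b and f

# The posteriors still obey the conjunction rule ...
print(f"P(teller | E)            = {posterior(bank_teller):.3f}")
print(f"P(teller & feminist | E) = {posterior(teller_and_feminist):.3f}")

# ... but on the difference measure of confirmation, d(H, E) = P(H|E) - P(H),
# the evidence confirms the conjunction while disconfirming the constituent.
def d(event) -> float:
    return posterior(event) - prior(event)

print(f"d(teller, E)            = {d(bank_teller):+.3f}")            # negative
print(f"d(teller & feminist, E) = {d(teller_and_feminist):+.3f}")    # positive
```

With these particular numbers the same qualitative picture also holds for ratio-style confirmation measures, which echoes the robustness point emphasized in the abstract.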
  • Mitigating Cognitive Bias and Design Distortion
    RSD2 Relating Systems Thinking and Design 2013 Working Paper. www.systemic-design.net Interliminal Design: Mitigating Cognitive Bias and Design Distortion. Andrew McCollough, PhD, DeAunne Denmark, MD, PhD, and Don Harker, MS. "Design is an inquiry for action" - Harold Nelson, PhD. Liminal: from the Latin word līmen, a threshold. "Our values and biases are expressions of who we are. It isn't so much that our prior commitments disable our ability to reason; it is that we need to appreciate that reasoning takes place, for the most part, in settings in which we are not neutral bystanders." - Alva Noe. In this globally interconnected and information-rich society, complexity and uncertainty are escalating at an unprecedented pace. Our most pressing problems are now considered wicked, or even super wicked, in that they commonly have incomplete, continuously changing, intricately interdependent yet contradictory requirements, urgent or emergency status, no central authority, and are often caused by the same entities charged with solving them (Rittel & Webber, 1973). Perhaps similar to philosopher Karl Jaspers' Axial Age where 'man asked radical questions' and the 'unquestioned grasp on life was loosened,' communities of all kinds now face crises of "in-between"; necessary and large-scale dismantling of previous structures, institutions, and world-views has begun, but has yet to be replaced. Such intense periods of destruction and reconstruction, or liminality, eliminate boundary lines, reverse or dissolve social hierarchies, disrupt cultural continuity, and cast penetrating onto future outcomes. Originating in human ritual, liminal stages occur mid-transition, where initiates literally "stand at the threshold" between outmoded constructs of identity or community, and entirely new ways of being.
  • 1 Embrace Your Cognitive Bias
    1 Embrace Your Cognitive Bias http://blog.beaufortes.com/2007/06/embrace-your-co.html Cognitive Biases are distortions in the way humans see things in comparison to the purely logical way that mathematics, economics, and yes even project management would have us look at things. The problem is not that we have them… most of them are wired deep into our brains following millions of years of evolution. The problem is that we don't know about them, and consequently don't take them into account when we have to make important decisions. (This area is so important that Daniel Kahneman won a Nobel Prize in 2002 for work tying non-rational decision making, and cognitive bias, to mainstream economics) People don't behave rationally, they have emotions, they can be inspired, they have cognitive bias! Tying that into how we run projects (project leadership as a complement to project management) can produce results you wouldn't believe. You have to know about them to guard against them, or use them (but that's another article)... So let's get more specific. After the jump, let me show you a great list of cognitive biases. I'll bet that there are at least a few that you haven't heard of before! Decision making and behavioral biases Bandwagon effect — the tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink, herd behaviour, and manias. Bias blind spot — the tendency not to compensate for one's own cognitive biases. Choice-supportive bias — the tendency to remember one's choices as better than they actually were.