Berkson's Paradox

Berkson's Paradox
My bias is better than your bias
Why we should all be aware of our cognitive biases

SEMINAR SERIES: HOW TO RUIN YOUR CAREFULLY PLANNED STUDY? TIPS FOR IMPROVING DATA ANALYSIS – SESSION 3
BART JACOBS

We are all biased
Psychologists claim it is easier to recognize "negative" traits in others. Actual research on cognitive biases is very difficult, and subject to most of the biases it studies! We won't dive into the burden of proof of whether a bias is real, or highly prevalent, which could fill a seminar series by itself.
In this presentation, we will focus on:
- Recognising potential biases in ourselves and others.
- Avoiding the logical fallacies associated with them.
- Important sources of bias associated with data analysis.

Confirmation bias
Also known as "experimenter's bias". Key problem: researchers are convinced of a hypothesis, or hold more belief in it than is justified by the actual data. It may present itself in a variety of ways.
What are the negative consequences in science? Examples?

Confirmation bias in practice
- Selecting information that confirms a previously held belief.
- Explaining results in a way that matches a desirable hypothesis: arguments are chosen or evaluated based on how strongly they match or support the envisioned conclusion. ("belief bias")
- Ignoring or dismissing results that don't support the hypothesis; failing to update one's opinion when confronted with new and/or contradictory information. ("conservatism bias", "continued influence")
- Designing experiments to study the hypothesis while ignoring alternatives: the worst outcome of the experiment is "effect not found", so the hypothesis can never be disproven by the experiment. ("congruence bias")

Bandwagon effect
Acceptance of popular ideas:
- based on the number of people who support them;
- based on the clout of the people who believe them.
Problematic when scientific justification is lacking! What if they are pushed by established scientists without justification?
Specialists with confirmation bias may block opposing theories, and conflicting interests may play a role. A major source of literature bias! Always judge the content, not the messenger.

Prediction biases
How good are we at judging our own prediction skills?
Hindsight bias (the "knew-it-all-along" attitude): viewing events as more predictable than they were, after they occurred.
- May lead to false claims that "errors were preventable".
- May introduce wrong explanations or imagined causes.
- May lead to wrongly predicting similar events.
- Could lead people to think that "they could have done better".
Always remember the original context and setting! Knowledge gap: "could have known" ≠ "should have known".

Other prediction biases
Wishful thinking without sufficient justification (known as "exaggerated expectation", "optimism bias", "pro-innovation bias"):
- Overestimation of a desirable outcome.
- Expecting a large positive effect when previous data do not imply this.
- Unjustified optimism towards the applicability of a new methodology.
Failure to predict extreme events ("normalcy bias"): refusing to plan for a worst-case outcome that has never happened before.
Misjudging one's ability to distinguish true patterns from noise ("clustering illusion"): we are trained to see patterns, even ones that aren't there.

Which disease would you eliminate?

Disease      | Active cases* | Yearly deaths | Comments
Pneumonia    | 450 000 000   | 4 000 000     | 2017. Includes 808 694 child deaths, 15% of deaths under the age of 5.
Diabetes     | 422 000 000   | 1 500 000     | 2017. Does not include 3.7 million indirect deaths.
HIV/Aids     | 37 000 000    | 770 000       | 2018. Includes 1.8 million children under age 15 living with HIV.
Tuberculosis | 10 000 000    | 1 300 000     | 2017; 1.6 million deaths if PLHIV are included. Includes 230 000 child deaths.
Malaria      | 219 000 000   | 435 000       | 2017. Includes 266 000 deaths of children under the age of 5, 61% of all malaria deaths.
Ebola        | 3 224         | 2 152         | 2018–2019 outbreak, as of October 16; total cases in history ~35 000, total deaths ~15 000.

* Calculated either as new cases per year, or as people living with the disease; all numbers as officially reported by WHO or partners.

Extension neglect biases
Ignoring, or insufficiently accounting for, the size of a problem. Only a bias if the size is relevant!
Often linked to "scope neglect" and the "identifiable victim effect": identifiable patients are prioritized over the number of people affected.
- Why I put pneumonia before diabetes: "children" – social desirability.
- Ebola: sensational and striking images – the impact of media.
The opposite may also happen!
- Overly trusting promising results from studies with a small sample size.
- Base rate fallacy: overconfidence in a positive test result, even in a low-prevalence population.

Should we trust your answers to the questions?
Probably not!
Social desirability biases ("choice-supportive bias", "courtesy bias"):
- Viewing or reporting your own actions more favourably.
- Giving "the preferred answer" rather than an honest opinion.
- Matches social norms, avoids conflict: "choosing the path of least resistance".
- Problematic if patients do this; worse if the doctor does it.
- May be countered by the "negativity bias": it is easier to recall negative events.
Selective refusal: the reason not to answer is caused by the (missing) result, so the sample is no longer representative.

Data-driven cognitive biases
In the rest of the presentation, we will focus on some biases that can have a critical impact on data! Overlooking these sources of bias can result in major mistakes!
Beware: they are easy to illustrate with obvious examples, but often sneakily subtle in real data! Let's start with a "textbook" example.

Sampling effects
The million-dollar question in statistics: can we do inference on a population when only a (sub)sample was observed? The answer is never trivial!
Key question: is the sample representative of the population? Always ask yourself this question!
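The base rate fallacy mentioned above is easy to quantify with Bayes' rule. A minimal sketch (the prevalence, sensitivity and specificity below are illustrative numbers, not taken from the slides):

```python
def ppv(prevalence, sensitivity, specificity):
    """Probability of disease given a positive test (Bayes' rule)."""
    true_pos = prevalence * sensitivity
    false_pos = (1 - prevalence) * (1 - specificity)
    return true_pos / (true_pos + false_pos)

# A seemingly excellent test (99% sensitive, 95% specific)
# applied in a low-prevalence (1%) population:
print(round(ppv(0.01, 0.99, 0.95), 3))  # -> 0.167
```

Even with an excellent test, five out of six positives are false positives when the prevalence is only 1% – exactly the overconfidence the slide warns about.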
Note: this is NOT directly linked to sample size.

Surrogation
Many things cannot be observed directly; instead, a surrogate is measured. Surrogation: mistaking the surrogate for the original goal. This is a problem when the surrogate is "improved", but the original goal is not!
Example: improving vaccination rates in children.
- Surrogate: vaccination rates in children who visit child care providers.
- Intervention: campaigns to vaccinate children who visit child care.
- The observed effect is a surrogate effect, not representative of the original goal.

[Diagram: intervention → surrogate → true endpoint]

Berkson's paradox
- My friends are either very smart or really nice, but rarely both – so are very smart people typically not that nice?
- Students who take a second-chance exam in statistics often don't have to retake math, while those who retake math typically already got a passing grade in statistics – so are the two scores really negatively correlated?
- Hospitalized patients who wore a helmet had greater injury severity – so are helmets associated with greater injury severity?

[Three illustration slides: Berkson's paradox]

Simpson's paradox
Famous example: a study on kidney stones.

Overall      | Treatment A   | Treatment B   | Difference A – B
Success rate | 273/350 (78%) | 289/350 (83%) | [-10%; 1%]

Yet treatment A may be better! Why?

Stratified   | Treatment A   | Treatment B   | Difference A – B
Small stones | 81/87 (93%)   | 234/270 (87%) | [0%; 13%]
Large stones | 192/263 (73%) | 55/80 (69%)   | [-7%; 16%]

Blyth's paradox
If treatment A is more successful than treatment B, and treatment A is more successful than treatment C, is it the most successful among all three?

Choice offered | Adult men (41%) | Adult women (44%) | Children (15%) | A chosen | B chosen | C chosen
A or B         | A               | B                 | A              | 56%      | 44%      | –
A or C         | C               | A                 | A              | 59%      | –        | 41%
A, B or C      | C               | B                 | A              | 15%      | 44%      | 41%

Example: plane hits during World War II
[Illustration slides: bullet-hole patterns on returning planes]

Survivorship bias
Also known as "immortal time bias": subjects were "immortal" until inclusion in the study (otherwise they would never have been included).
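The "smart or nice friends" version of Berkson's paradox can be reproduced with a small simulation. This sketch (sample size and the friendship threshold are arbitrary choices) draws two independent traits and then conditions on the selection that makes someone a friend:

```python
import random

random.seed(1)

# Two independent traits: in the full population there is no relationship.
smart = [random.gauss(0, 1) for _ in range(100_000)]
nice = [random.gauss(0, 1) for _ in range(100_000)]

def corr(x, y):
    """Pearson correlation, computed from scratch."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    return cov / (vx * vy) ** 0.5

print(round(corr(smart, nice), 2))  # close to 0 in the full population

# Condition on being "a friend": smart + nice above some threshold.
friends = [(s, n) for s, n in zip(smart, nice) if s + n > 1.5]
fs, fn = zip(*friends)
print(round(corr(fs, fn), 2))  # clearly negative among friends
```

Selecting on the sum of two independent traits (a "collider") induces a negative correlation within the selected group – exactly the mechanism behind the helmet and exam examples above.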
In the example: the planes that returned.
Survivorship bias is all around us, at all times! We can only measure what still exists and is documented! It is not always a problem – the "survivors" may be representative for the purpose of the study – but typically it is a major issue that should be looked into.

Survivorship bias – practical issues
- A major source of publication bias: only positive results "survive".
- A cause of "hindsight bias": survivor properties become the norm. E.g. a focus on "success stories", while failures go unnoticed or are ignored. The resulting predictions are typically too optimistic.
- An unfair comparison when one group gets a "head start". E.g.: HIV-positive people live longer than HIV-negative people – but what about all the children dying from pneumonia who might have become HIV-positive at a later age? HIV+ people are "immortal until HIV infection".
- A strong link with informative missingness/censoring.

Conclusions
We are all biased – but that's OK. Computers are not necessarily better!
- AIs copy the human prejudice and bias present in their training data.
- Black-box algorithms cannot always distinguish between signal and noise.
My advice: be aware of these biases and try to recognize them – in yourself, in others, and in your field of research. Educate others without judging.

Questions / Comments?
The next seminar is on November 7: "Making noise with results, not with data – Sources of variation". Presenter: Bart Karl Jacobs.
Typically, we draw our conclusions from an estimate that we distil from the data. In most cases, it is equally important to understand how precise we can expect our result to be. In this session, we will discuss the importance of reporting the precision of an estimate and give an introduction to different sources of variation.
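The WWII plane example also lends itself to a quick simulation. The hit locations and downing probabilities below are invented for illustration; the point is only that the sample of returning planes is not representative of the planes that were hit:

```python
import random

random.seed(7)

# Hypothetical probabilities that a hit in each location downs the plane:
# engine hits are usually fatal, fuselage hits usually are not.
P_DOWN = {"engine": 0.8, "fuselage": 0.1}

hits_on_all = {"engine": 0, "fuselage": 0}
hits_on_survivors = {"engine": 0, "fuselage": 0}

for _ in range(100_000):
    spot = random.choice(["engine", "fuselage"])  # hits land uniformly
    hits_on_all[spot] += 1
    if random.random() > P_DOWN[spot]:  # plane survives and can be inspected
        hits_on_survivors[spot] += 1

print(hits_on_all)        # roughly even split in the air
print(hits_on_survivors)  # fuselage hits dominate among returning planes
```

Inspecting only the survivors suggests the fuselage is where planes get hit; in reality the hits were evenly spread, and the engine hits are simply missing from the sample – Wald's famous advice to armour the places where the survivors show no holes.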