Clinical Science
Society for a Science of Clinical Psychology
Section III of Division 12 (Society of Clinical Psychology) of the American Psychological Association
Developing clinical psychology as an experimental-behavioral science
Newsletter, Spring 2017: Volume 20, Issue 2

SSCP Executive Board

President: Scott O. Lilienfeld, Ph.D., Emory University
Past-President: Steven D. Hollon, Ph.D., Vanderbilt University
President-Elect: Dean McKay, Ph.D., Fordham University
Secretary/Treasurer: Kate McLaughlin, Ph.D., University of Washington
Division 12 Representative: David Smith, Ph.D., University of Notre Dame
Student Representatives: Jessica Hamilton, M.A., Temple University; Kelly Knowles, B.A., Vanderbilt University
At Large Members: Leonard Simms, Ph.D., University at Buffalo; Thomas Olino, Ph.D., Temple University
Clinical Science Editor: Autumn Kujawa, Ph.D., Penn State College of Medicine

Table of Contents

Presidential Column (S. Lilienfeld)
SSCP Events at APS
Diversity Corner (G. Hurtado)
Awards & Recognition
Updates from 2016 Varda Shoham Award Winners
Clinical Science Early Career Path (E. Driessen)
Student Perspective (J. Iacovino)
Clinician Perspective (J. Hoffman)
Updates from Student Representatives (J. Hamilton & K. Knowles)

Articles published in Clinical Science represent the views of the authors and not necessarily those of the Society for a Science of Clinical Psychology, the Society of Clinical Psychology, or the American Psychological Association. Submissions representing differing views, comments, and letters to the editor are welcome.

Presidential Column
Clinical Psychology and the Replication Crisis: Where Have We Been?
Scott O. Lilienfeld, Ph.D., Emory University

Unless you have been living in a cave for the past few years, or have just returned from an extended vacation on Mars, you probably know that the field of psychology has recently been embroiled in a crisis of sorts. Termed the "replication crisis," it is every bit as much a crisis of confidence as of data. Specifically, many of us have come to doubt the robustness of at least some of the core findings in psychology that we had long taken for granted (Lilienfeld & Waldman, 2017).

The replication crisis is a perfect storm of sorts, reflecting the confluence of several separable but converging trends. First, in 2005, medical epidemiologist John Ioannidis, now at Stanford University, wrote a bombshell article entitled "Why most published research findings are false" (cited over 4,500 times as of this writing), in which he conducted simulations suggesting that most published results in medicine were very likely to be erroneous or exaggerated (Ioannidis, 2005). Second, six years later, my former undergraduate advisor at Cornell University, Daryl Bem (2011), published an article in the marquee journal in social and personality psychology, the Journal of Personality and Social Psychology, that purported to find evidence of precognition (one of three ostensible forms of extrasensory perception). Many critics howled in derision, finding Bem's results to be both highly implausible and based on problematic methodology. Third, the field of psychology was shaken by several cases of egregious but long-undetected fraud by prominent researchers, perhaps most notably that of Dutch social psychologist Diederik Stapel (Carey, 2015). Fourth, in an ambitious effort to gauge the magnitude of the reproducibility problem in psychology, University of Virginia psychologist Brian Nosek and his collaborators at the Open Science Collaboration attempted to replicate 100 published studies in social and cognitive psychology. Depending on the metric used, only about 40 percent of the original studies replicated (Open Science Collaboration, 2015). Although this figure does not demonstrate, despite widespread media pronouncements, that the original findings were erroneous, it reminds us that we can no longer take the replicability of our findings for granted.
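An aside for the quantitatively inclined: the core of Ioannidis's (2005) claim rests on simple Bayesian arithmetic about the positive predictive value of a "significant" result. The sketch below is my simplification, not his actual model (which also incorporates bias and multiple competing teams), and the prior and power values are illustrative assumptions only.

def ppv(prior=0.10, power=0.80, alpha=0.05):
    # P(effect is real | p < alpha), by Bayes' rule:
    # true positives over all positives among tested hypotheses.
    true_pos = power * prior
    false_pos = alpha * (1 - prior)
    return true_pos / (true_pos + false_pos)

print(f"{ppv():.0%}")                        # ~64%: even an optimistic scenario leaves 1 in 3 findings false
print(f"{ppv(prior=0.05, power=0.35):.0%}")  # ~27%: with rare true effects and low power, most findings are false

The point of the exercise is that when tested hypotheses are rarely true and studies are underpowered, a p value below .05 is weak evidence, which is the engine of Ioannidis's conclusion.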
In response to these developments, there have been numerous calls for reforming our standard ways of doing research business in psychology (Lindsay, Simons, & Lilienfeld, 2016). Yet despite the enormous impact of the replication crisis on research practices in many domains of psychology, especially social and cognitive psychology, our own field of clinical psychology has remained largely insulated from these important debates. Articles on the replication crisis in the pages of clinical psychology journals have been few and far between, as have open discussions of the crisis at clinical psychology conferences. Even on the often far-ranging Society for a Science of Clinical Psychology (SSCP) listserv, there has been a surprising dearth of debate concerning the replication crisis and potential remedies for it.

In a recent article in press at the Association for Psychological Science (APS) journal Perspectives on Psychological Science (Tackett et al., in press), we briefly recounted the history of the replication crisis and examined potential reasons for clinical psychology's virtually wholesale absence from the table in ongoing replicability discussions. For example, we observed that because many of our clinical samples are difficult, expensive, and time-intensive to collect, there is often less of a "culture of replication" in our laboratories than in those of our colleagues in experimental psychology. In addition, because the bulk of replication efforts have thus far been directed at social and cognitive psychology, we may assume that replicability problems do not apply to us. This sanguine conclusion seems implausible. For example, a recent survey of 83 widely cited studies in our sister discipline of psychiatry found a rate of nonreplication comparable to the one Nosek's team reported for social and cognitive psychology (Tajika et al., 2015). Specifically, only 40 of those investigations had been subjected to replication attempts, and of these 40, only 16 (40%) were deemed to have been successfully replicated.

If anything, there are reasons to suspect that replicability problems may be even more pronounced in clinical psychology than in social and personality psychology. Our sample sizes are often modest; our samples are often highly heterogeneous; we often rely on psychiatric diagnoses that are themselves heterogeneous; we often test patients whose behavior is unstable across brief periods of time; we often rely on indicators, such as laboratory and functional brain imaging measures, that tend to display only modest levels of test-retest reliability; and so on (Lilienfeld & Treadway, 2016).

Fortunately, there are a host of partial solutions to the replicability challenges confronting our field (Wagenmakers & Dutilh, 2016). First, preregistration of hypotheses and analyses on publicly available websites, such as AsPredicted.org and the Open Science Framework, is a crucial step toward enhancing the robustness of our science (indeed, in our own lab at Emory University we are now beginning, albeit belatedly, to routinely preregister all of our hypotheses and analyses, as well as to make explicit which analyses are exploratory versus confirmatory). Preregistration is hardly a panacea, but it greatly minimizes the risk of p-hacking (a broad set of post hoc analytic decisions, such as cherry-picking dependent measures, tossing out outliers, transforming data distributions, and pooling or splitting samples, all designed to bring alpha levels below the hallowed threshold of statistical significance, usually .05) and HARKing (hypothesizing after results are known). These deeply problematic practices have been normative in many psychology labs for decades, and have been transmitted implicitly (and in some cases explicitly) to generations of our graduate students.
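To make the p-hacking parenthetical above concrete, here is a minimal simulation sketch of just one of those moves, cherry-picking dependent measures. The group sizes, the five-measure setup, and the one_null_study helper are illustrative assumptions of mine, not anything from the articles cited here: when two groups truly do not differ on any of five measures, reporting whichever measure happens to cross p < .05 yields a "significant" finding in roughly a fifth of null studies.

import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

def one_null_study(n_per_group=30, n_measures=5):
    # Both groups are drawn from the same population, so there is no true
    # effect on any measure. Return the smallest p value across measures,
    # i.e., the "best" result a p-hacker could cherry-pick and report.
    p_values = []
    for _ in range(n_measures):
        group_a = rng.normal(0.0, 1.0, n_per_group)
        group_b = rng.normal(0.0, 1.0, n_per_group)
        p_values.append(stats.ttest_ind(group_a, group_b).pvalue)
    return min(p_values)

n_studies = 10_000
rate = sum(one_null_study() < .05 for _ in range(n_studies)) / n_studies
print(f"Null studies with at least one p < .05: {rate:.1%}")
# Expected: about 1 - .95**5 = 22.6% with independent measures, more than
# four times the 5% rate a single preregistered test would produce.

Preregistration blocks exactly this inflation by forcing the analyst to commit to one dependent measure and one analysis before seeing the data.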
Second, opening our datasets and stimulus materials to other

References

Lindsay, D. S., Simons, D. J., & Lilienfeld, S. O. (2016, December). Research preregistration 101. https://www.psychologicalscience.org/observer/research-preregistration-101

Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349.

Tackett, J. L., Lilienfeld, S. O., Patrick, C. P., Johnson, S., Krueger, R., Miller, J., … Shrout, P. E. (in press). Clinical science and the replicability crisis. Perspectives on Psychological Science.

Tajika, A., Ogawa, Y., Takeshima, N., Hayasaka, Y., & Furukawa, T. A. (2015). Replication and contradiction of highly cited research papers in psychiatry: 10-year follow-up. British Journal of Psychiatry, 207, 357-362.

Wagenmakers, E. J., & Dutilh, G. (2016, November).