A Panel Presentation of the Center for Open Science Reproducibility


The Hesburgh Libraries' Center for Digital Scholarship and the Center for Research Computing present: A Panel Presentation of the Center for Open Science Reproducibility Projects, a hybrid presentation on the Center for Open Science (COS) reproducibility projects in psychology and the companion project in cancer biology.

Wednesday, September 9, 2015, Noon-1:00 p.m.
Center for Digital Scholarship (CDS), Hesburgh Library

The panel will be moderated onsite at Notre Dame by COS Partnerships Manager Andrew Sallans, who will engage Notre Dame's faculty, staff, and students in a discussion with the Reproducibility panelists.

Panelists, via webcast:
Tim Errington, Project Manager for Reproducibility Project: Cancer Biology and lead of the metascience efforts at COS
Johanna Cohoon, Project Coordinator for Reproducibility Project: Psychology
Mallory Kidwell, Project Coordinator for Reproducibility Project: Psychology

On site:
Andrew Sallans, Partnerships, Collaborations, & Funding Manager, COS

Lunch will be provided to on-site attendees and panelists. Please register here if you will be attending onsite at Notre Dame's CDS. You may also attend this event virtually via Google Hangouts; click here to join the hangout.

About: Launched nearly four years ago and coordinated by the Center for Open Science, the Reproducibility Project: Psychology has produced the most comprehensive investigation ever conducted of the rate and predictors of reproducibility in a field of science. On August 27, 2015, 270 researchers investigating the reproducibility of psychological science published their findings in Science. The project conducted replications of 100 published findings from three prominent psychology journals. They found that regardless of the analytic method or criteria used, fewer than half of the replications produced the same findings as the original study. Read more about the Reproducibility Project here.
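The project's headline number is a binomial proportion (replications that succeeded out of 100 attempted), so its uncertainty can be sketched with a standard confidence interval. A minimal Python sketch using the Wilson score interval; the count of 40 successful replications out of 100 is an illustrative placeholder, not the project's reported figure:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# Illustrative placeholder counts, not the project's reported numbers
lo, hi = wilson_interval(40, 100)
print(f"estimated replication rate 0.40, 95% CI [{lo:.2f}, {hi:.2f}]")
```

Even with only 100 studies, an interval like this stays well below the near-100% success rate the original publications implied, which is how a qualitative conclusion such as "fewer than half replicated" can hold regardless of the exact criterion used.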
This Reproducibility panel event is locally sponsored and co-organized by the Hesburgh Libraries' Center for Digital Scholarship and the Center for Research Computing.
Recommended publications
  • Promoting an Open Research Culture
    Promoting an Open Research Culture. Brian Nosek, University of Virginia, Center for Open Science. http://briannosek.com/ | http://cos.io/ The McGurk Effect: Ba Ba? Da Da? Ga Ga? (McGurk & MacDonald, 1976, Nature; Adelson, 1995). Norms vs. counternorms (Anderson, Martinson, & DeVries): communality (open sharing) vs. secrecy (closed); universalism (evaluate research on its own merit) vs. particularism (evaluate research by reputation); disinterestedness (motivated by knowledge and discovery) vs. self-interestedness (treat science as a competition); organized skepticism (consider all new evidence, even against one's prior work) vs. organized dogmatism (invest career promoting one's own theories, findings); quality vs. quantity.
  • "Fake Results": The Reproducibility Crisis in Research and Open Science Solutions
    University of Rhode Island, DigitalCommons@URI, Technical Services Faculty Presentations, Technical Services, 2017. "Fake Results": The Reproducibility Crisis in Research and Open Science Solutions. Andrée Rathemacher, University of Rhode Island, [email protected]. This work is licensed under a Creative Commons Attribution 4.0 License. Follow this and additional works at: http://digitalcommons.uri.edu/lib_ts_presentations Part of the Scholarly Communication Commons and the Scholarly Publishing Commons. Recommended Citation: Rathemacher, Andrée, ""Fake Results": The Reproducibility Crisis in Research and Open Science Solutions" (2017). Technical Services Faculty Presentations. Paper 48. http://digitalcommons.uri.edu/lib_ts_presentations/48 This speech is brought to you for free and open access by the Technical Services at DigitalCommons@URI. It has been accepted for inclusion in Technical Services Faculty Presentations by an authorized administrator of DigitalCommons@URI. For more information, please contact [email protected]. "Fake Results": The Reproducibility Crisis in Research and Open Science Solutions. "It can be proven that most claimed research findings are false." — John P. A. Ioannidis, 2005. Those are the words of John Ioannidis (yo-NEE-dees) in a highly cited article from 2005. Now based at Stanford University, Ioannidis is a meta-scientist who conducts "research on research" with the goal of making improvements. Sources: Ioannidis, John P. A. "Why Most
  • 2020 Impact Report
    Center for Open Science IMPACT REPORT 2020. Maximizing the impact of science together. COS Mission: Our mission is to increase the openness, integrity, and reproducibility of research. But we don't do this alone. COS partners with stakeholders across the research community to advance the infrastructure, methods, norms, incentives, and policies shaping the future of research to achieve the greatest impact on improving credibility and accelerating discovery.
    Letter from the Executive Director. "Show me" not "trust me": Science doesn't ask for trust, it earns trust with transparency.
    The credibility of science has center stage in 2020. A raging pandemic. Partisan interests. Economic and health consequences. Misinformation everywhere. An amplified desire for certainty on what will happen and how to address it. In this climate, all public health and economic research will be politicized. All findings are understood through a political lens. When the findings are against partisan interests, the scientists are accused of reporting the outcomes they want and avoiding the ones they don't. When the findings are aligned with partisan
    Science is trustworthy because it does not trust itself. Transparency is a replacement for trust. Transparency fosters self-correction when there are errors and increases confidence when there are not. Transparency is critical for maintaining science's credibility and earning public trust. The events of 2020 make clear the urgency and potential consequences of losing that credibility and trust. The Center for Open Science is profoundly grateful for all of the collaborators, partners, and supporters who have helped advance its mission to increase openness, integrity, and reproducibility of research. Despite the practical, economic, and health challenges, 2020 was a remarkable year for open science.
  • Maximizing the Reproducibility of Your Research
    Maximizing the Reproducibility of Your Research. Open Science Collaboration¹. Open Science Collaboration (in press). Maximizing the reproducibility of your research. In S. O. Lilienfeld & I. D. Waldman (Eds.), Psychological Science Under Scrutiny: Recent Challenges and Proposed Solutions. New York, NY: Wiley. Authors' Note: Preparation of this chapter was supported by the Center for Open Science and by a Veni Grant (016.145.049) awarded to Hans IJzerman. Correspondence can be addressed to Brian Nosek, [email protected]. ¹ Alexander A. Aarts, Nuenen, The Netherlands; Frank A. Bosco, Virginia Commonwealth University; Katherine S. Button, University of Bristol; Joshua Carp, Center for Open Science; Susann Fiedler, Max Planck Institute for Research on Collective Goods; James G. Field, Virginia Commonwealth University; Roger Giner-Sorolla, University of Kent; Hans IJzerman, Tilburg University; Melissa Lewis, Center for Open Science; Marcus Munafò, University of Bristol; Brian A. Nosek, University of Virginia; Jason M. Prenoveau, Loyola University Maryland; Jeffrey R. Spies, Center for Open Science. Commentators in this book and elsewhere describe evidence that modal scientific practices in design, analysis, and reporting are interfering with the credibility and veracity of the published literature (Begley & Ellis, 2012; Ioannidis, 2005; Miguel et al., 2014; Simmons, Nelson, & Simonsohn, 2011). The reproducibility of published findings is unknown (Open Science Collaboration, 2012a), but concern that it is lower than desirable is widespread
  • Using the OSF (Open Science Framework)
    Using the OSF. Center for Open Science at BITSS 2014. Johanna Cohoon & Caner Uguz. Pull out your laptop and visit www.osf.io. Norms vs. counternorms (Anderson, Martinson, & DeVries, 2007): communality (open sharing) vs. secrecy (closed); universalism (evaluate research on its own merit) vs. particularism (evaluate research by reputation); disinterestedness (motivated by knowledge and discovery) vs. self-interestedness (treat science as a competition); organized skepticism (consider all new evidence, even against one's prior work) vs. organized dogmatism (invest career promoting one's own theories, findings); quality vs. quantity. Incentives: incentives for individual success are focused on getting it published, not getting it right. Change the incentives. Center for Open Science: community, infrastructure, metascience. Community: foster discussion of transparency and reproducibility issues; support community efforts to increase openness in the sciences; promote the establishment of reporting standards. Infrastructure: tools for scientists that increase transparency and access to resources; connect services and allow users to easily organize and share their work. Metascience: How are scientists currently behaving? Do transparency and replication have a positive impact on research? Reproducibility Project: Psychology and Cancer Biology; Many Labs 1, 2, and 3; Archival Project; CREP. Open your laptop! www.osf.io. OSF: Nudge the incentive system (support pre-registration; create alternative means of measuring impact; enable sharing of materials). Facilitate good research (organize files; allow collaboration). Meanwhile: logs, commenting, tags, statistics, download counts, add-ons. In the future: checklists, notifications, messaging; Dataverse, Evernote, Trello, Plot.ly.
  • Researchers Overturn Landmark Study on the Replicability of Psychological Science
    Researchers overturn landmark study on the replicability of psychological science. By Peter Reuell, Harvard Staff Writer. Category: HarvardScience. Subcategory: Culture & Society. Keywords: psychology, psychological science, replication, replicate, reproduce, reproducibility, Center for Open Science, Gilbert, Daniel Gilbert, King, Gary King, Science, Harvard, FAS, Faculty of Arts and Sciences, Reuell, Peter Reuell. Summary: A 2015 study claiming that more than half of all psychology studies cannot be replicated turns out to be wrong. Harvard researchers have discovered that the study contains several statistical and methodological mistakes, and that when these are corrected, the study actually shows that the replication rate in psychology is quite high; indeed, it is statistically indistinguishable from 100%. According to two Harvard professors and their collaborators, a 2015 landmark study showing that more than half of all psychology studies cannot be replicated is actually wrong. In an attempt to determine the replicability of psychological science, a consortium of 270 scientists known as the Open Science Collaboration (OSC) tried to replicate the results of 100 published studies. More than half of them failed, creating sensational headlines worldwide about the "replication crisis" in psychology. But an in-depth examination of the data by Daniel Gilbert (Edgar Pierce Professor of Psychology at Harvard University), Gary King (Albert J. Weatherhead III University Professor at Harvard University), Stephen Pettigrew (doctoral student in the Department of Government at Harvard University), and Timothy Wilson (Sherrell J. Aston Professor of Psychology at the University of Virginia) has revealed that the OSC made some serious mistakes that make this pessimistic conclusion completely unwarranted: The methods of many of the replication studies turn out to be remarkably different from the originals and, according to Gilbert, King, Pettigrew, and Wilson, these "infidelities" had two important consequences.
  • Replicability, Robustness, and Reproducibility in Psychological Science
    1 Replicability, Robustness, and Reproducibility in Psychological Science Brian A. Nosek Center for Open Science; University of Virginia Tom E. Hardwicke University of Amsterdam Hannah Moshontz University of Wisconsin-Madison Aurélien Allard University of California, Davis Katherine S. Corker Grand Valley State University Anna Dreber Stockholm School of Economics; University of Innsbruck Fiona Fidler University of Melbourne Joe Hilgard Illinois State University Melissa Kline Struhl Center for Open Science Michèle Nuijten Meta-Research Center; Tilburg University Julia Rohrer Leipzig University Felipe Romero University of Groningen Anne Scheel Eindhoven University of Technology Laura Scherer University of Colorado Denver - Anschutz Medical Campus Felix Schönbrodt Ludwig-Maximilians-Universität München, LMU Open Science Center Simine Vazire University of Melbourne In press at the Annual Review of Psychology Final version Completed: April 6, 2021 Authors’ note: B.A.N. and M.K.S. are employees of the nonprofit Center for Open Science that has a mission to increase openness, integrity, and reproducibility of research. K.S.C. is the unpaid executive officer of the nonprofit Society for the Improvement of Psychological Science. This work was supported by grants to B.A.N. from Arnold Ventures, John Templeton Foundation, Templeton World Charity Foundation, and Templeton Religion Trust. T.E.H. received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Skłodowska-Curie grant agreement No. 841188. We thank Adam Gill for assistance creating Figures 1, 2, and S2. Data, materials, and code are available at https://osf.io/7np92/. B.A.N. Drafted the outline and manuscript sections, collaborated with section leads on conceptualizing and drafting their components, coded replication study outcomes for Figure 1, coded journal policies for Figure 5.
  • Open Science
    Running head: OPEN SCIENCE. Open Science. Barbara A. Spellman, University of Virginia; Elizabeth A. Gilbert, Katherine S. Corker, Grand Valley State University. Draft of: 20 September 2017. Abstract: Open science is a collection of actions designed to make scientific processes more transparent and results more accessible. Its goal is to build a more replicable and robust science; it does so using new technologies, altering incentives, and changing attitudes. The current movement toward open science was spurred, in part, by a recent series of unfortunate events within psychology and other sciences. These events include the large number of studies that have failed to replicate and the prevalence of common research and publication procedures that could explain why. Many journals and funding agencies now encourage, require, or reward some open science practices, including pre-registration, providing full materials, posting data, distinguishing between exploratory and confirmatory analyses, and running replication studies. Individuals can practice and promote open science in their many roles as researchers, authors, reviewers, editors, teachers, and members of hiring, tenure, promotion, and awards committees. A plethora of resources are available to help scientists, and science, achieve these goals. Keywords: data sharing, file drawer problem, open access, open science, preregistration, questionable research practices, replication crisis, reproducibility, scientific integrity. Thanks to Brent Donnellan (big thanks!), Daniël Lakens, Calvin Lai, Courtney Soderberg, and Simine Vazire. When we (the authors) look back a couple of years, to the earliest outline of this chapter, the open science movement within psychology seemed to be in its infancy.
  • Community of Open Science Grassroots Networks (COSGN) Project Summary
    Center for Open Science NSF 21-511 AccelNet-Implementation: Community of Open Science Grassroots Networks (COSGN) Project Summary Overview. The Community of Open Scholarship Grassroots Networks (COSGN) includes 107 grassroots networks, representing virtually every region of the world and every research discipline. These networks communicate and coordinate on topics of common interest. We propose, using an NSF 21-511 Implementation grant, to formalize governance and coordination of the networks to maximize impact and establish standard practices for sustainability. In the project period, we will increase the capacity of COSGN to advance the research and community goals of the participating networks individually and collectively, and establish governance, succession planning, shared resources, and communication pathways to ensure an active, community-sustained network of networks. By the end of the project period, we will have established a self-sustaining network of networks that leverages disciplinary and regional diversity, actively collaborates across networks for grassroots organizing, and shares resources for maximum impact on culture change for open scholarship. Intellectual Merit. The open scholarship community is fueled by recognition that the social structure and culture of research does not promote practices and reward behaviors in line with scholarly values. Networks promoting open scholarship represent a variety of aims, including: increasing the transparency and accessibility of research processes, content, and outputs; improving the rigor and reproducibility of research practices; and advancing inclusivity of who can contribute to scholarship and how to diversify reward systems to encourage their contributions. The challenges and opportunities to improve research practices exist in every scholarly discipline, every region of the world, and every stakeholder group (e.g., researchers, institutions, publishers, funders, consumers of science).
  • Task Force Report
    Global Engagement Task Force Report. Dr. Crystal N. Steltenpohl, EVANSVILLE, IN, USA; James Montilla Doble, METRO MANILA, PHILIPPINES; Dr. Dana M. Basnight-Brown, NAIROBI, KENYA; Dr. Natalia B. Dutra, NATAL, RN, BRAZIL; Anabel Belaus, CÓRDOBA, ARGENTINA; Dr. Chun-Chia Kung, TAINAN, TAIWAN; Dr. Sandersan Onie, SYDNEY, AUSTRALIA; Dr. Divya Seernani, FREIBURG, GERMANY; Dr. Sau-Chin Chen, HUALIEN, TAIWAN; Dr. D. I. Burin, BUENOS AIRES, ARGENTINA; Dr. Kohinoor Darda, PUNE, INDIA and SYDNEY, AUSTRALIA. February 14, 2021. Table of Contents: Acknowledgments; Executive Summary; Building Partnerships; SIPS Presence at Other Meetings; Diversifying Remote Meetings; Geographically Diverse Conference Locations; Membership and Financial Resources; Surveying Open Science Practitioners; Appendix: Potential Conference Locations. Acknowledgments: The Society for the Improvement of Psychological Science Global Engagement Task Force would like to thank Dr. Ljiljana B. Lazarević (University of Belgrade, Serbia), Dr. Joseph Hilgard (Illinois State University, USA), Dr. Koki Ikeda (Meiji Gakuin University, Japan), Linh Nguyễn (University of Minnesota, USA), Miguel Silan (University of the Philippines Diliman, Philippines), Neha Moopen (Utrecht University Library, the Netherlands), and Rizqy Amelia Zein (Universitas Airlangga, Indonesia) for their assistance with the task force formation and early information gathering. We would also like to thank Dr. Irma Serrano García (University of Puerto Rico, Puerto Rico), Dr. Lenny Jason (DePaul University, USA), Dr. Hu Chuan-Peng (Nanjing Normal University, China), Lou Shomette (Executive Director, Psychonomic Society, USA), Dr. Nurit Shnabel (Tel-Aviv University, Israel), Dr. Ola Shobowale (European Association of Social Psychology, the Netherlands), Dr. Katie Corker (Grand Valley State University, USA), and Dr.
  • Brian A. Nosek
    Last Updated: July 2, 2019. BRIAN A. NOSEK. University of Virginia, Department of Psychology, Box 400400, Charlottesville, VA 22904-4400. Center for Open Science, 210 Ridge McIntire Rd, Suite 500, Charlottesville, VA 22903-5083. http://briannosek.com/ | http://cos.io/ | [email protected]. Positions: 2014- Professor, University of Virginia; 2013- Executive Director, Center for Open Science; 2008-2014 Associate Professor, University of Virginia; 2003-2013 Executive Director, Project Implicit; 2011-2012 Visiting Scholar, CASBS, Stanford University; 2008-2011 Director of Graduate Studies, University of Virginia; 2002-2008 Assistant Professor, University of Virginia; 2005 Visiting Scholar, Stanford University; 2001-2002 Exchange Scholar, Harvard University. Education: Ph.D. 2002, Yale University, Psychology (Thesis: Moderators of the relationship between implicit and explicit attitudes; Advisor: Mahzarin R. Banaji); M.Phil. 1999, Yale University, Psychology (Thesis: Uses of response latency in social psychology); M.S. 1998, Yale University, Psychology (Thesis: Gender differences in implicit attitudes toward mathematics); B.S. 1995, California Polytechnic State University, San Luis Obispo, Psychology (Minors: Computer Science and Women's Studies). Center for Open Science: co-Founder, Executive Director. Web site: http://cos.io/ Primary infrastructure: http://osf.io/ A non-profit organization that aims to increase openness, integrity, and reproducibility of research. Building tools to facilitate scientists' workflow, project management, transparency, and sharing. Community-building for open science practices. Supporting metascience research. Project Implicit: co-Founder. Information site: http://projectimplicit.net/ Research and education portal: https://implicit.harvard.edu/ Project Implicit is a multidisciplinary collaboration and non-profit for research and education in the social and behavioral sciences, especially for research in implicit social cognition.
  • Teaching Good Research Practices: Protocol of a Research Master Course
    UvA-DARE (Digital Academic Repository). Teaching Good Research Practices: Protocol of a Research Master Course. Sarafoglou, A.; Hoogeveen, S.; Matzke, D.; Wagenmakers, E.-J. DOI: 10.1177/1475725719858807. Publication date: 2020. Document version: final published version. Published in: Psychology Learning and Teaching. License: CC BY-NC. Citation for published version (APA): Sarafoglou, A., Hoogeveen, S., Matzke, D., & Wagenmakers, E.-J. (2020). Teaching Good Research Practices: Protocol of a Research Master Course. Psychology Learning and Teaching, 19(1), 46-59. https://doi.org/10.1177/1475725719858807 General rights: It is not permitted to download or to forward/distribute the text or part of it without the consent of the author(s) and/or copyright holder(s), other than for strictly personal, individual use, unless the work is under an open content license (like Creative Commons). Disclaimer/Complaints regulations: If you believe that digital publication of certain material infringes any of your rights or (privacy) interests, please let the Library know, stating your reasons. In case of a legitimate complaint, the Library will make the material inaccessible and/or remove it from the website. Please Ask the Library: https://uba.uva.nl/en/contact, or write a letter to: Library of the University of Amsterdam, Secretariat, Singel 425, 1012 WP Amsterdam, The Netherlands. You will be contacted as soon as possible. UvA-DARE is a service provided by the library of the University of Amsterdam (https://dare.uva.nl). Download date: 26 Sep 2021.