Reinventing Technology Assessment: A 21st Century Model


Reinventing Technology Assessment: A 21st Century Model
STIP 01, April 2010, by Richard Sclove, Ph.D.
Using Citizen Participation, Collaboration and Expert Analysis to Inform and Improve Decision-Making on Issues Involving Science and Technology

Contents
ii List of Boxes
iii Abbreviations and Acronyms
iv About the Author
v Acknowledgments
vi Preface
vii Executive Summary
1 Introduction
2 1. Rationale for a New U.S. Technology Assessment Capability
10 2. The Former Office of Technology Assessment: Its Limitations
18 3. Technology Assessment for the U.S. Congress: Recent Political History and Context
21 4. Reflection on Post-1995 U.S. Politics of Technology Assessment
24 5. Virtues of Participatory Technology Assessment
31 6. Criteria for a New U.S. Technology Assessment Capacity
33 7. Can Existing Institutions Do the Job?
34 8. Is Participatory Technology Assessment Detrimental, Redundant or Too Costly?
37 9. Practical Options for Establishing a 21st-Century U.S. Technology Assessment Capability
42 Appendix
42 A. Additional Information about Danish Consensus Conferences
43 B. U.S. Experience with Participatory Technology Assessment: Four Examples
49 C. Ethics and Social Values in Expert vs. Participatory Technology Assessment: Two Comparative Case Studies
53 D. Experts, Laypeople and the Common Good
54 E. On Innovation in Expert and Participatory TA Concepts and Methods
57 Notes
77 References

Suggested citation: Richard Sclove, Reinventing Technology Assessment: A 21st Century Model (Washington, DC: Science and Technology Innovation Program, Woodrow Wilson International Center for Scholars, April 2010). Available for download free of charge at http://wilsoncenter.org/techassessment
List of Boxes
7 1. Danish-Style Consensus Conferences – A European Participatory Technology Assessment Method
8 2. Danish Consensus Conferences: Sample Results and Public Influence
13 3. Structural Bias among Expert Analysts
14 4. Examples of Technologies’ Indirect Consequences on Political Structure
16 5. Social Consequences, Synergisms and Value-Enriched Technology Assessment
17 6. Stakeholder vs. Layperson Participation
26 7. The Berger Inquiry: An Example of Lay Knowledge Contributing to Technology Assessment
28 8. On Deliberation and Collaboration in Participatory Technology Assessment
30 9. Designing and Evaluating pTA Projects
36 10. Comparative Costs of Large-Scale pTA and Other Citizen-Deliberation Exercises
40 11. Institutional Capabilities within the Proposed ECAST Network

Appendix Tables
42 A. Danish Consensus Conference Topics
46 B. Questions to Help Examine the Effects of Technologies upon a Community’s Democratic Structure

Abbreviations & Acronyms
CDC Centers for Disease Control and Prevention
CRS Congressional Research Service
CTA constructive technology assessment
DBT Danish Board of Technology
ECAST Expert & Citizen Assessment of Science & Technology Network
GAO Government Accountability Office
NAS National Academy of Sciences
NCTF National Citizens’ Technology Forum
NIH National Institutes of Health
NISE Net Nanoscale Informal Science Education Network
NRC National Research Council
OSTP White House Office of Science and Technology Policy
OTA Office of Technology Assessment
pTA participatory technology assessment
PTA Parliamentary Technology Assessment
R&D research and development
S&T science and technology
STIP Science and Technology Innovation Program
STS science, technology and society
TA technology assessment
WWViews World Wide Views on Global Warming

About the Author
Richard Sclove is Founder and Senior Fellow of The Loka Institute, a non-profit organization dedicated to making research, science and technology responsive to democratically decided priorities. He has been a U.S. pioneer in participatory technology assessment (pTA), having initiated the first U.S. adaptations of a Danish-style consensus conference (1997) and of a European scenario workshop (2002) – both funded in part by the National Science Foundation. He served recently as U.S. Advisor to the global secretariat of the World Wide Views on Global Warming project, the first globe-encompassing pTA exercise in world history. He has briefed U.S. and other national decision-makers on science and technology policy, and prepared testimony for the House Science Committee of the U.S. Congress. The American Political Science Association honored Sclove’s book, Democracy and Technology, with the Don K.
Price Award as “the year’s best book on science, technology and politics.” Sclove is also the senior author of The Loka Institute’s influential 1998 report on Community-Based Research in the United States. Dr. Sclove lectures widely around the world, and has published extensively in both scholarly and popular venues, including The Washington Post, The New York Times, Science magazine, The Huffington Post, The Christian Science Monitor, The Chronicle of Higher Education, Technology Review and Science, Technology & Human Values. He is a Fellow of the American Association for the Advancement of Science, and he has held the Ciriacy-Wantrup Postdoctoral Fellowship in Economics at the University of California at Berkeley and the Copeland Fellowship at Amherst College. Sclove earned a B.A. in environmental studies from Hampshire College and graduate degrees in nuclear engineering (M.S.) and political science (Ph.D.) from the Massachusetts Institute of Technology. Sclove lives with his family in Amherst, Massachusetts, and is active in social and environmental service projects in the ancient city of Varanasi, India. He may be contacted at [email protected].

Acknowledgments
The author is grateful to many people for their contributions to this study. Peer reviewers of the full report, Joanna Goven (University of Canterbury, New Zealand), David H. Guston (Consortium for Science, Policy and Outcomes, Arizona State University) and Paul C. Stern (National Research Council/National Academy of Sciences), offered insightful, constructive criticism. David Rejeski (Woodrow Wilson International Center for Scholars) shepherded the study toward publication and made important suggestions for improving the flow of the argument. Lars Klüver (Danish Board of Technology) reviewed a precursor document and patiently answered many questions about developments in European technology assessment.
Additional reviewers of early drafts, many of whom offered detailed comments, critique and suggestions, include Ida-Elisabeth Andersen (Danish Board of Technology), Vary Coates (Washington Academy of Sciences), Colleen Cordes (Psychologists for Social Responsibility), Daryl Chubin (Center for Advancing Science & Engineering Capacity, American Association for the Advancement of Science), Gerald L. Epstein (Center for Science, Technology & Security Policy, American Association for the Advancement of Science), Rachelle Hollander (Center for Engineering, Ethics and Society, National Academy of Engineering), Todd M. LaPorte (George Mason University), Ceasar McDowell (Massachusetts Institute of Technology and Dropping Knowledge, Inc.), Shawn Otto (Science Debate), Anne C. Petersen (University of Michigan), Pete Peterson (Public Agenda), John Porter (Hogan & Hartson), Andrew Pratt (ScienceProgress.org), David Rabkin (Museum of Science, Boston), Tind Shepper Ryen (University of Colorado), Doug Schuler (The Evergreen State College), Marcie Sclove (Green River Doula Network), Al Teich (Science and Policy Programs, American Association for the Advancement of Science), Frank von Hippel (Princeton University) and Rick Worthington (Pomona College and The Loka Institute). Patrick Hamlett (North Carolina State University), Daniel Lee Kleinman (University of Wisconsin-Madison), Clark A. Miller (Arizona State University) and Madeleine Kangsen Scammell (Boston University) kindly provided information about U.S. participatory technology assessment projects that they helped initiate and organize. Darlene Cavalier (ScienceCheerleader.com) encouraged initiation of this study, helped identify key actors and institutions interested in contemporary American politics of technology assessment and conducted interviews in Washington, D.C., that have informed the research. The author of course retains responsibility for the content, including all errors of fact and interpretation.
Preface
Five years ago, our program at the Wilson Center began the first of dozens of focus groups and national surveys to better understand public perceptions, aspirations and concerns around emerging areas of science such as nanotechnology and synthetic biology. Again and again, we found that informed groups of citizens could identify a wide and rich range of issues associated with new technologies, often adding nuances to the views of experts and the policy-making community. Over time, taking the public’s pulse became integrated into our work on understanding the risks and benefits of new technologies and convinced us that public policy can be improved through sustained and carefully crafted dialogue with laypeople. But it also became obvious that
Recommended publications
  • Disruptive Technology Assessment Game 'DTAG' Handbook V0.1
    Disruptive Technology Assessment Game ‘DTAG’ Handbook V0.1
    Executive Summary
    What is the DTAG? The Disruptive Technology Assessment Game (DTAG) is a table-top seminar wargame used to assess potential future technologies and their impact on military operations and the operating environment.
    How is it played? There are four distinct steps in executing a DTAG experiment. The first two are part of the planning process, prior to playing the game. Step 1: Identify possible future technologies that are under development and are of interest to the military. Step 2: Create Ideas of Systems (IoS) cards from the technologies identified. Specific technologies, or combinations of technologies, are combined with equipment to create new systems that could be employed by the military or by an adversary; these are described on cards. Step 3: Play the DTAG. Red and Blue teams both plan courses of action in the context of a scenario and vignette. After the initial confrontation of plans, which establishes the baseline, the teams plan again with the addition of future technology from the IoS cards. The second confrontation highlights the effect that the technology has on the plans and the wargame outcome (Figure 1: The DTAG Process). Step 4: Assess the results of the wargame through questions and analysis of the data captured.
    Who plays it? The DTAG brings together technology experts, military personnel and analysts, providing a broad perspective on the potential impacts of future technology on military operations. The approach allows for an element of unconstrained thinking, and the wargame-like rounds encourage open communication.
    Why, and when, should a DTAG be played? It is most useful when assessing technology that is a prototype or at an early stage of development, or technology that is not in widespread use by the military.
  • GAO-20-246G, Technology Assessment Design Handbook
    HANDBOOK
    Technology Assessment Design Handbook: Handbook for Key Steps and Considerations in the Design of Technology Assessments
    GAO-20-246G, December 2019

    Contents
    Preface 1
    Chapter 1: The Importance of Technology Assessment Design 6
    1.1 Reasons to Conduct and Uses of a Technology Assessment 6
    1.2 Importance of Spending Time on Design 8
    Chapter 2: Technology Assessment Scope and Design 8
    2.1 Sound Technology Assessment Design 9
    2.2 Phases and Considerations for Technology Assessment Design 9
    2.2.1 GAO Technology Assessment Design Examples 14
    Chapter 3: Approaches to Selected Technology Assessment Design and Implementation Challenges 18
    3.1 Ensuring Technology Assessment Products Are Useful for Congress and Others 19
    3.2 Determining Policy Goals and Measuring Impact 20
    3.3 Researching and Communicating Complicated Issues 20
    3.4 Engaging All Relevant Stakeholders 21
    Appendix I: Objectives, Scope, and Methodology 22
    Appendix II: Summary of Steps for GAO’s General Engagement Process 35
    Appendix III: Example Methods for Technology Assessment 38
    Appendix IV: GAO Contact and Staff Acknowledgments 44

    Tables
    Table 1: Summary of GAO’s Technology Assessment Process 3
    Table 2: Examples for Technology Assessment Objectives that Describe Status and Challenges to Development of a Technology 15
    Table 3: Examples for Technology Assessment Objectives that Assess Opportunities and Challenges that May Result from the Use of a Technology 16
    Table 4: Examples for Technology Assessment Objectives that Assess Cost-Effectiveness,
  • A Systematic Review of Methods for Health Care Technology Horizon Scanning
    AHRQ Health Care Horizon Scanning System
    A Systematic Review of Methods for Health Care Technology Horizon Scanning
    Prepared for: Agency for Healthcare Research and Quality, U.S. Department of Health and Human Services, 540 Gaither Road, Rockville, MD 20850. Contract No. 290-2010-00006-C
    Prepared by: Fang Sun, M.D., Ph.D., and Karen Schoelles, M.D., S.M., F.A.C.P., ECRI Institute, 5200 Butler Pike, Plymouth Meeting, PA 19462
    AHRQ Publication No. 13-EHC104-EF, August 2013
    This report incorporates data collected during implementation of the U.S. Agency for Healthcare Research and Quality (AHRQ) Health Care Horizon Scanning System by ECRI Institute under contract to AHRQ, Rockville, MD (Contract No. 290-2010-00006-C). The findings and conclusions in this document are those of the authors, who are responsible for its content, and do not necessarily represent the views of AHRQ. No statement in this report should be construed as an official position of AHRQ or of the U.S. Department of Health and Human Services. The information in this report is intended to identify resources and methods for improving the AHRQ Health Care Horizon Scanning System in the future. The purpose of the AHRQ Health Care Horizon Scanning System is to assist funders of research in making well-informed decisions in designing and funding comparative-effectiveness research. This report may periodically be assessed for the urgency to update. If an assessment is done, the resulting surveillance report describing the methodology and findings will be found on the Effective Health Care Program website at www.effectivehealthcare.ahrq.gov.
  • Horizon Scanning for New and Emerging Technologies in HealthTech: What Do the Present and Future Hold?
    www.pwc.com/sg
    Horizon Scanning for New and Emerging Technologies in HealthTech: What do the present and future hold?
    Foreword
    A collaborative, data-driven and evidence-based study. The last few years have seen an explosion of technology along with an increasing convergence of the Healthcare, Medical Devices, HealthTech, Pharma and Digital realms. It is imperative that in the midst of this, we keep the patients and their problems at the heart of it all. To effectively do so, understanding continuously evolving patient needs will be critical. And by doing so, we can better solve the real challenges they face and provide solutions to complement current clinical practices and technologies to improve what we at PwC call the 3 As in healthcare: Affordable, Accessible and A+ quality care. However, with the rapid and exponential pace of technological advancement, how do we keep track of the game-changing and clinically impactful developments? What are the present trends driving these developments, and what are likely future trends? Is there a fit-for-purpose framework that can be applied to assist in the assessment of these technologies? What will be the implications for regulators, health technology assessments (HTA), policy makers, payers, healthcare professionals and all other stakeholders? Horizon Scanning for New and Emerging Technologies in HealthTech aims to answer these questions. For the purposes of this paper, MedTech refers to the traditional innovation-led, fully integrated medical device industry. HealthTech, on the other hand, refers to information technology (IT)-led solutions which are more patient-focused and comprise start-ups and non-traditional players who are causing an industry paradigm shift.
  • Technology and Innovation Report 2021: Catching Technological Waves – Innovation with Equity
    UNITED NATIONS CONFERENCE ON TRADE AND DEVELOPMENT
    Technology and Innovation Report 2021: Catching Technological Waves – Innovation with Equity
    Geneva, 2021
    © 2021, United Nations. All rights reserved worldwide. Requests to reproduce excerpts or to photocopy should be addressed to the Copyright Clearance Center at copyright.com. All other queries on rights and licences, including subsidiary rights, should be addressed to: United Nations Publications, 405 East 42nd Street, New York, New York 10017, United States of America. Email: [email protected]. Website: https://shop.un.org/
    The designations employed and the presentation of material on any map in this work do not imply the expression of any opinion whatsoever on the part of the United Nations concerning the legal status of any country, territory, city or area or of its authorities, or concerning the delimitation of its frontiers or boundaries. This publication has been edited externally. United Nations publication issued by the United Nations Conference on Trade and Development. UNCTAD/TIR/2020. ISBN: 978-92-1-113012-6. eISBN: 978-92-1-005658-8. ISSN: 2076-2917. eISSN: 2224-882X. Sales No. E.21.II.D.8
    NOTE: Within the UNCTAD Division on Technology and Logistics, the STI Policy Section carries out policy-oriented analytical work on the impact of innovation and new and emerging technologies on sustainable development, with a particular focus on the opportunities and challenges for developing countries. It is responsible for the Technology and Innovation Report, which seeks to address issues in science, technology and innovation that are topical and important for developing countries, and to do so in a comprehensive way with an emphasis on policy-relevant analysis and conclusions.
  • A Science-Technology-Society Approach to Teacher Education for the Foundation Phase: Students’ Empiricist Views
    Lyn Kok & Rika van Schoor
    A science-technology-society approach to teacher education for the foundation phase: Students’ empiricist views
    Abstract: Teacher education for South African foundation phase education requires student teachers to be prepared for teaching science concepts in an integrated programme in a learning area known as life skills. This study examined the challenges faced by university teachers of foundation phase student teachers in the development of science modules/courses. The national curriculum for this subject aims to strengthen learner awareness of social relationships, technological processes and elementary science (DBE 2011a). We developed an integrated numeracy, science and technology module for foundation phase student teachers, based on the science-technology-society (STS) approach to teaching science concepts. Students’ understanding of science concepts was assessed using a project method in which they solved a problem derived from children’s literature. Then students’ views of this integrated approach to teaching science concepts were gathered. The negative views of the foundation phase student teachers towards the integrated STS approach were thought to indicate an empiricist view of the nature of science that could impede their future teaching.
    Keywords: foundation phase, science teacher education, science-technology-society, project method, nature of science
    Lyn Kok, University of Zululand. Email: [email protected]. Rika van Schoor, University of the Free State. Email: [email protected]
    South African Journal of Childhood Education | 2014 4(1): 95-110 | ISSN: 2223-7674 | © UJ SAJCE – June 2014
    Introduction
    In this study the object of inquiry was pre-service teachers’ knowledge of and views about an integrated pedagogy for science teaching in the first years of school.
  • Aalborg Universitet: Technology Assessment in Techno-Anthropological Perspective
    Aalborg Universitet
    Technology Assessment in Techno-Anthropological Perspective
    Botin, Lars; Børsen, Tom Holmgaard
    Creative Commons License: CC BY-NC-ND 4.0. Publication date: 2021. Document Version: Publisher's PDF, also known as Version of record. Link to publication from Aalborg University.
    Citation for published version (APA): Botin, L., & Børsen, T. H. (Eds.) (2021). Technology Assessment in Techno-Anthropological Perspective. (OA ed.) Aalborg Universitetsforlag. Serie om lærings-, forandrings- og organisationsudviklingsprocesser Vol. 10
    General rights: Copyright and moral rights for the publications made accessible in the public portal are retained by the authors and/or other copyright owners, and it is a condition of accessing publications that users recognise and abide by the legal requirements associated with these rights.
    • Users may download and print one copy of any publication from the public portal for the purpose of private study or research.
    • You may not further distribute the material or use it for any profit-making activity or commercial gain.
    • You may freely distribute the URL identifying the publication in the public portal.
    Take down policy: If you believe that this document breaches copyright please contact us at [email protected] providing details, and we will remove access to the work immediately and investigate your claim.
    Downloaded from vbn.aau.dk on: October 09, 2021
  • Describing the Technological Scope of Industry 4.0 – A Review of Survey Publications
    LogForum 2018, 14 (3), 341-353. Scientific Journal of Logistics. http://dx.doi.org/10.17270/J.LOG.2018.289, http://www.logforum.net, p-ISSN 1895-2038, e-ISSN 1734-459X
    ORIGINAL PAPER
    Describing the Technological Scope of Industry 4.0 – A Review of Survey Publications
    Sebastian Schlund (TU Wien, Vienna, Austria), Ferdinand Baaij (TRUMPF Werkzeugmaschinen GmbH + Co. KG, Ditzingen, Germany)
    ABSTRACT. Background: The paper addresses common difficulties in understanding the scope and the underlying technologies of “Industry 4.0”. Existing definitions comprise a variety of technologies and applications, processes as well as business models. Their difficult differentiation has led to a complicated understanding of the topic altogether. Therefore, this study aims at structuring the scope of “Industry 4.0” using the average importance of its underlying technologies, as represented in 38 survey publications dedicated to Industry 4.0. Methods: Based on a review of existing survey literature on Industry 4.0, relevant technologies are identified. Next, these technologies are grouped into five technology areas. Furthermore, all technologies are assessed according to their relevance to Industry 4.0 using citation indices of the respective publications. Finally, two-dimensional figures are used to present an overview of all cited technologies, their structural connections and their relevance. In summary, a structuring of “Industry 4.0” through the cited technologies and their evolution over the years 2013 to 2016 is displayed, to facilitate the understanding of significant research trends and promising application areas within “Industry 4.0”. Conclusion: Compared to existing reviews and empirical approaches on the topic, this paper focuses on a review of survey literature specifically dedicated to an understanding of the concept of Industry 4.0.
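The citation-index weighting described under Methods can be sketched in a few lines of Python. The survey names, citation counts and technology sets below are invented for illustration, and the scoring rule is one simple reading of "relevance weighted by citation indices", not the paper's exact formula.

```python
# Hypothetical data: each survey publication lists the Industry 4.0
# technologies it covers, plus a citation count used as a relevance weight.
surveys = [
    {"citations": 120, "technologies": {"IoT", "Cloud Computing", "Big Data"}},
    {"citations": 45,  "technologies": {"IoT", "Additive Manufacturing"}},
    {"citations": 10,  "technologies": {"Cloud Computing", "Big Data"}},
]

def citation_weighted_relevance(surveys):
    """Score each technology by the citation-weighted share of the
    survey literature that mentions it (scores fall between 0 and 1)."""
    total = sum(s["citations"] for s in surveys)
    scores: dict[str, float] = {}
    for s in surveys:
        for tech in s["technologies"]:
            scores[tech] = scores.get(tech, 0.0) + s["citations"] / total
    return scores

# Rank technologies by their weighted relevance, highest first.
ranking = sorted(citation_weighted_relevance(surveys).items(),
                 key=lambda kv: kv[1], reverse=True)
```

Under this rule a technology mentioned only in heavily cited surveys outranks one mentioned in many obscure ones, which is the intuition behind weighting by citation index rather than raw mention counts.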
  • International Perspectives on Technology Assessment
    TECHNOLOGICAL FORECASTING AND SOCIAL CHANGE 13, 213-233 (1979)
    International Perspectives on Technology Assessment
    KAN CHEN
    ABSTRACT: Technology assessment (TA) has attracted worldwide attention, but at present TA means different things to different countries. In industrialized market-economy countries, TA consists of policy studies that deal with the side effects of technology. In centrally planned economy countries, TA is considered another tool for social management of technology. For developing countries, TA is expected to help in selecting appropriate technologies for development. These different perspectives have significant implications not only for how TA is done differently in different countries, but also for how international and global TA can be done effectively.
    Introduction
    Technology assessment (TA) has attracted worldwide interest. Discourses in TA have appeared in the literature of many countries and have taken place within several parts of the United Nations system. The professional society in this field, the International Society for Technology Assessment (ISTA), has held conferences in a number of countries. Its Second International Congress on Technology Assessment was held at The University of Michigan, Ann Arbor (U.S.A.) in October 1976, and was attended by participants from over fifteen countries [1]. Subsequently, an international conference on “Technology Assessing: The Quest for Coherence” was sponsored by the East-West Center in Honolulu, Hawaii in May/June 1977, attended by a number of experts from Asian developing countries as well as from industrialized market-economy countries. Another international workshop on “Systems Assessment of New Technologies: International Perspectives” was later sponsored by the International Institute for Applied Systems Analysis (IIASA) in Laxenburg, Austria in July 1977.
  • Innovation Assessment: Governing Through Periods of Disruptive Technological Change
    Innovation assessment: governing through periods of disruptive technological change
    Jacob A. Hasselbalch
    Published in the Journal of European Public Policy: http://dx.doi.org/10.1080/13501763.2017.1363805
    ABSTRACT: Current regulatory approaches are ill-equipped to address the challenges of governing through periods of disruptive technological change. This article hones in on the use of assessment regimes at the level of the European Union, particularly in the work of the Commission, to argue for a missing middle between technology assessment and impact assessment. Technology assessment focuses on the upstream governance of science and technology, while impact assessment focuses on the downstream governance of the impacts of specific policy options. What is missing is a form of midstream governance, which I label innovation assessment, to steer polities through periods of disruptive technological change, during which innovations have taken concrete forms and are beginning to diffuse, but still exhibit much scope for rapid, unexpected change and alternative trajectories of development. By juxtaposing these three forms of assessment regimes, I define the main dimensions along which they vary.
    KEYWORDS: disruptive innovation, governance, impact assessment, innovation assessment, technology assessment, policy appraisal
    1 Introduction
    There is a growing sense of unease among the policymaking élite that they are increasingly placed on the back foot when it comes to addressing disruption, innovation and technological change, forced to suddenly react to such changes rather than shape them from the outset. We can observe this in, among other things, the increasing prevalence of the term ‘disruption’ in popular and political discourse. This observation challenges the governing logic of the relationship between regulation and innovation, namely that the purpose of regulation is to promote and support innovation, because more innovation is generally assumed to be a good thing.
  • Effect Sizes, Power, and Biases in Intelligence Research: A Meta-Meta-Analysis
    Journal of Intelligence, Article
    Effect Sizes, Power, and Biases in Intelligence Research: A Meta-Meta-Analysis
    Michèle B. Nuijten 1,*, Marcel A. L. M. van Assen 1,2, Hilde E. M. Augusteijn 1, Elise A. V. Crompvoets 1 and Jelte M. Wicherts 1
    1 Department of Methodology & Statistics, Tilburg School of Social and Behavioral Sciences, Tilburg University, Warandelaan 2, 5037 AB Tilburg, The Netherlands; [email protected] (M.A.L.M.v.A.); [email protected] (H.E.M.A.); [email protected] (E.A.V.C.); [email protected] (J.M.W.)
    2 Section Sociology, Faculty of Social and Behavioral Sciences, Utrecht University, Heidelberglaan 1, 3584 CS Utrecht, The Netherlands
    * Correspondence: [email protected]; Tel.: +31-13-466-2053
    Received: 7 May 2020; Accepted: 24 September 2020; Published: 2 October 2020
    Abstract: In this meta-study, we analyzed 2442 effect sizes from 131 meta-analyses in intelligence research, published from 1984 to 2014, to estimate the average effect size, median power, and evidence for bias. We found that the average effect size in intelligence research was a Pearson’s correlation of 0.26, and the median sample size was 60. Furthermore, across primary studies, we found a median power of 11.9% to detect a small effect, 54.5% to detect a medium effect, and 93.9% to detect a large effect. We documented differences in average effect size and median estimated power between different types of intelligence studies (correlational studies, studies of group differences, experiments, toxicology, and behavior genetics).
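The power figures quoted in the abstract can be made concrete with the textbook Fisher-z power approximation for a Pearson correlation. This is a standard-library sketch, not the authors' code: evaluated at the paper's median sample size of 60 it lands close to the reported 11.9% for a small effect, while the paper's medians are computed across studies with varying sample sizes.

```python
import math
from statistics import NormalDist

def power_pearson_r(r: float, n: int, alpha: float = 0.05) -> float:
    """Approximate two-sided power to detect a population correlation r
    with n pairs, via the Fisher z transform (z = atanh(r), SE = 1/sqrt(n-3))."""
    nd = NormalDist()
    shift = math.atanh(r) * math.sqrt(n - 3)  # noncentrality of the z test
    z_crit = nd.inv_cdf(1 - alpha / 2)        # two-sided critical value
    return (1 - nd.cdf(z_crit - shift)) + nd.cdf(-z_crit - shift)

# Cohen's benchmark correlations at the paper's median sample size of 60:
for label, r in [("small", 0.1), ("medium", 0.3), ("large", 0.5)]:
    print(f"{label} effect (r = {r}): power ≈ {power_pearson_r(r, 60):.2f}")
```

The small-effect result, roughly 12%, illustrates the paper's point: at typical sample sizes, primary studies in this literature have very little chance of detecting small correlations.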
  • Values, Ethics and Innovation: Rethinking Technological Development in the Fourth Industrial Revolution
    White Paper
    Values, Ethics and Innovation: Rethinking Technological Development in the Fourth Industrial Revolution
    August 2018
    Authors: Thomas Philbeck (Head of Technology, Society and Policy, World Economic Forum), Nicholas Davis (Head of Society and Innovation, Member of the Executive Committee, World Economic Forum), Anne Marie Engtoft Larsen (Knowledge Lead, Fourth Industrial Revolution, World Economic Forum)
    The views expressed in this White Paper are those of the author(s) and do not necessarily represent the views of the World Economic Forum or its Members and Partners. White Papers are submitted to the World Economic Forum as contributions to its insight areas and interactions, and the Forum makes the final decision on the publication of the White Paper. White Papers describe research in progress by the author(s) and are published to elicit comments and further debate. © 2018 – All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, including photocopying and recording, or by any information storage and retrieval system.

    Contents
    Introduction 4
    Towards a human-centred approach 5
    A. Adopting a systems view of technologies 6
    B. Appreciating and shaping the moral role of technologies 7
    C. Engaging with a wide variety of stakeholders 9
    D. The need for new disciplines 10
    Achieving transformative innovation 11
    A. New tools 12
    B. New skills 13
    C. New partnerships 14
    D. New institutions 14
    Conclusion 16
    Endnotes 17
    Bibliography 18

    Introduction
    Technologies enable us to live longer, healthier, more fulfilling lives. It recognizes that technologies will play a role in whether the Sustainable Development Goals are reached, and establishes a multistakeholder “Technology Facilitation Mechanism” to maximize the chances. The World Economic Forum is also pioneering a future-oriented agenda – one that promotes responsible