Meta-Analyses Were Supposed to End Scientific Debates. Often, They Only Cause More Controversy | Science | AAAS
Recommended publications
-
JOANNA BRIGGS INSTITUTE Annual Report 2018
JOANNA BRIGGS INSTITUTE Annual Report 2018 (joannabriggs.org). Better evidence. Better outcomes. Brighter future.
Contents: Message from the Executive Director; JBI Strategic Plan update; Our team; Governance; JBI Model of Evidence-based Healthcare; Global reach (Joanna Briggs Collaboration; JBC Regions: Africa, Americas, Asia, Australasia, Europe; JBI Colloquium 2018; Joanna Briggs Foundation; JBI Endorsement; Groups we work with); Education (CSRTP; Evidence-based Clinical Fellowship Program; Train the Trainer programs; Postgraduate Research Degrees); Research consultancy; EBP resources and publications (JBI EBP Database; JBI Journals; JBI Tools; Oral presentations by JBI staff; Publications by JBI staff).
JBI's impact in 2018: 4,365 evidence-based point-of-care resources in the JBI EBP Database; 45 scientific writers partnered with JBI; 4 organisations granted full JBI Endorsement status; 7 new Joanna Briggs Collaboration groups; 388 audit topics; 80+ projects; 8 completions of JBI's Master of Clinical Science and 3 completions of JBI's Doctor of Philosophy.
The widespread impact of JBI, both in Australia and worldwide, is attributable to our local and global partnerships, which ensure that evidence-based activities are context-specific and driven by individuals and groups that understand their specific health environments.
-
How to Make Experimental Economics Research More Reproducible: Lessons from Other Disciplines and a New Proposal
This article is © Emerald Group Publishing, and permission has been granted for this version to appear here (http://eprints.nottingham.ac.uk/id/eprint/32996). Emerald does not grant permission for this article to be further copied/distributed or hosted elsewhere without the express permission of Emerald Group Publishing Limited.

How to Make Experimental Economics Research More Reproducible: Lessons from Other Disciplines and a New Proposal
Zacharias Maniadis, University of Southampton; Fabio Tufano, University of Nottingham; John A. List, University of Chicago

Abstract: Efforts in the spirit of this special issue aim at improving the reproducibility of experimental economics, in response to the recent discussions regarding the "research reproducibility crisis." We put this endeavour in perspective by summarizing the main ways (to our knowledge) that have been proposed – by researchers from several disciplines – to alleviate the problem. We discuss the scope for economic theory to contribute to evaluating the proposals. We argue that a potential key impediment to replication is the expectation of negative reactions by the authors of the individual study, and we suggest that incentives for having one's work replicated should increase.

JEL Codes: B40, C90. Keywords: False-Positives, Reproducibility, Replication.

Introduction: Nearly a century ago, Ronald Fisher began laying the groundwork of experimental practice, which remains in place today. Fisher's work was highlighted by the experimental tripod: the concepts of replication, blocking, and randomization were the foundation on which the analysis of the experiment was based. The current volume of Research in Experimental Economics (REE for short) aims at helping experimental economists think more deeply about replication, which has attracted the attention of scholars in other disciplines for some time.
-
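The false-positive arithmetic that these replication debates turn on can be made concrete. Below is a minimal sketch (not from the paper; the prior, significance level, and power values are illustrative) of the standard post-study probability calculation, and of how an independent successful replication raises it:

```python
def ppv(prior, alpha=0.05, power=0.8):
    """Probability that a single 'significant' finding is true,
    given the prior probability that the tested effect is real."""
    true_positives = power * prior
    false_positives = alpha * (1 - prior)
    return true_positives / (true_positives + false_positives)

def ppv_after_successes(prior, n_successes, alpha=0.05, power=0.8):
    """Update the same prior through n independent significant results:
    each success multiplies the prior odds by the likelihood ratio power/alpha."""
    odds = prior / (1 - prior) * (power / alpha) ** n_successes
    return odds / (1 + odds)

# With a 10% prior, one significant result is true only ~64% of the time;
# a successful independent replication pushes that to ~97%.
print(round(ppv(0.10), 2))                     # -> 0.64
print(round(ppv_after_successes(0.10, 2), 2))  # -> 0.97
```

This is one way to see why the abstract stresses incentives for replication: under a modest prior, a single positive result is weak evidence, while one clean independent replication changes the picture substantially.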
The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: Guidelines for Reporting Observational Studies
Policy and practice
The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Statement: guidelines for reporting observational studies*
Erik von Elm, Douglas G Altman, Matthias Egger, Stuart J Pocock, Peter C Gøtzsche & Jan P Vandenbroucke, for the STROBE Initiative

Abstract: Much biomedical research is observational. The reporting of such research is often inadequate, which hampers the assessment of its strengths and weaknesses and of a study's generalizability. The Strengthening the Reporting of Observational Studies in Epidemiology (STROBE) Initiative developed recommendations on what should be included in an accurate and complete report of an observational study. We defined the scope of the recommendations to cover three main study designs: cohort, case-control and cross-sectional studies. We convened a two-day workshop, in September 2004, with methodologists, researchers and journal editors to draft a checklist of items. This list was subsequently revised during several meetings of the coordinating group and in e-mail discussions with the larger group of STROBE contributors, taking into account empirical evidence and methodological considerations. The workshop and the subsequent iterative process of consultation and revision resulted in a checklist of 22 items (the STROBE Statement) that relate to the title, abstract, introduction, methods, results and discussion sections of articles. Eighteen items are common to all three study designs and four are specific to cohort, case-control, or cross-sectional studies. A detailed Explanation and Elaboration document is published separately and is freely available on the web sites of PLoS Medicine, Annals of Internal Medicine and Epidemiology. We hope that the STROBE Statement will contribute to improving the quality of reporting of observational studies.
-
Drug Information Associates
BRIEF: INFORMATION SERVICES IN AN ERA OF UNPRECEDENTED EVIDENCE EXPANSION
Heath Ford, PharmD, PhD | May 23, 2020

Have you ever wondered … how effective is the statin drug your doctor prescribed? Or the anti-depressant, antipsychotic, or amphetamine? What about the expensive injectable medication your discount card or drug coupon allows you to get every month for a low copay? And what about medical education (eg, medicine, nursing, pharmacy, dentistry)? Have you ever considered the differences between your educational experience and that of today's professional students? How effective were educational strategies then compared to those employed today?

The "Evidence Base"

In all industries, especially medicine, the growing level of experimentation and the pervasive expectation for an "evidence base" is very well established. Municipalities and not-for-profit organizations, for instance, search for evidence-based interventions that address problems of homelessness, food insecurity, or public health. In medicine, can "the use of new, expensive therapies (such as specialty drugs, for example) be justified when compared to established practice?" In education, "can learners in certain outcome segments be expected to show improvement given new programs and substantial increases in investment?" And in manufacturing, "are recommended process improvements reasonable given projected output expectations and investment costs?"

These considerations validate the need for specialized information services, specifically those centering on both retrieving and appraising scientific evidence.

Information Retrieval & Appraisal Services in Medicine

In light of the extraordinary growth of the scientific literature, it is said that perhaps no industry is in greater need of specialized information services than medicine. According to estimates, approximately 2.5 million scientific journal manuscripts were published in 2009, in more than 29,000 journals.1 Between 1994 and 2001, approximately 400,000 medical manuscripts were published per year (on average),2 making medicine the …
-
Systematic Review Template
Protocol: The effect of interventions for women's empowerment on children's health and education: A systematic review of evidence from low- and middle-income countries
Sebastian Vollmer, Sarah Khan, Le Thi Ngoc Tu, Atika Pasha, Soham Sahoo

Submitted to the Coordinating Group of: Crime and Justice | Education | Disability | International Development | Nutrition | Social Welfare | Methods | Knowledge Translation and Implementation | Other. Plans to co-register: No | Yes (Cochrane / Other) | Maybe.
Date submitted: 4 November 2016. Date revisions submitted: 31 May 2017 and 17 August 2017. Publication date: 21 November 2017.
The Campbell Collaboration | www.campbellcollaboration.org

Background

The problem, condition or issue: While numerous facts and figures point towards the increasing opportunities for women and their growing participation in economic, political and public decision-making processes, women remain in disadvantaged positions compared to men in many places of the world (World Economic Forum, 2015). Gender bias is deeply embedded in cultures, economies, and political and social institutions around the world, denying women an equal part in society and decision-making. Gender-based discrimination in access to economic opportunities, lack of representation and participation in economic and social spheres, and limited opportunities to accumulate resources could perpetuate vulnerability to poverty among women, limit human capital accumulation and restrict economic growth. Women's economic involvement, their political participation, and their empowerment in the two crucial domains of education and health are often seen as the four fundamental factors needed to create just and equitable societies where both women and men have control over their lives and exert similar, if not equal, influence in society.
-
John P.A. Ioannidis | Stanford Medicine Profiles
John P.A. Ioannidis
Professor of Medicine (Stanford Prevention Research), of Epidemiology and Population Health, and, by courtesy, of Statistics and of Biomedical Data Science. Medicine - Stanford Prevention Research Center.

Bio: C.F. Rehnborg Chair in Disease Prevention, Professor of Medicine, of Epidemiology and Population Health, and (by courtesy) of Biomedical Data Science, and of Statistics; co-Director, Meta-Research Innovation Center at Stanford (METRICS). Born in New York City in 1965 and raised in Athens, Greece. Valedictorian (1984) at Athens College; National Award of the Greek Mathematical Society (1984); MD (top rank of medical school class) from the National University of Athens in 1990; also received a DSc in biopathology from the same institution. Trained at Harvard and Tufts (internal medicine and infectious diseases), then held positions at NIH, Johns Hopkins and Tufts. Chaired the Department of Hygiene and Epidemiology, University of Ioannina Medical School, in 1999-2010 while also holding adjunct professor positions at Harvard, Tufts, and Imperial College. Senior Advisor on Knowledge Integration at NCI/NIH (2012-6). Served as President, Society for Research Synthesis Methodology, and editorial board member of many leading journals (including PLoS Medicine, Lancet, Annals of Internal Medicine, JNCI among others) and as Editor-in-Chief of the European Journal of Clinical Investigation (2010-2019). Delivered ~600 invited and honorary lectures. Recipient of many awards (e.g. European Award for Excellence in Clinical Science [2007], Medal for Distinguished Service, Teachers College, Columbia University [2015], Chanchlani Global Health Award [2017], Epiphany Science Courage Award [2018], Einstein fellow [2018]).
-
Developing PRISMA-RR, a Reporting Guideline for Rapid Reviews of Primary Studies (Protocol)
Developing PRISMA-RR, a reporting guideline for rapid reviews of primary studies (Protocol), February 2018
Adrienne Stevens (1,2), Chantelle Garritty (1,2), Mona Hersi (1), David Moher (1,3,4)
1 Ottawa Methods Centre, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Canada; 2 TRIBE Graduate Program, University of Split School of Medicine, Croatia; 3 Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Canada; 4 School of Epidemiology and Public Health, University of Ottawa, Canada.
Funding: The development of PRISMA-RR is supported by a grant from the Canadian Institutes of Health Research (CIHR FRN-142310). The funder had no role in the development of this protocol and will not have a role in the data collection, analyses, interpretation of the data, or publication of the findings. Funding will be sought to develop the Explanation and Elaboration manuscript.

INTRODUCTION

Systematic reviews are known to be the best evidence upon which to make healthcare decisions, but they can take years to complete (1). Rapid reviews have emerged as accelerated and/or abbreviated versions of systematic reviews, with certain concessions made in the systematic review process to accommodate decision-making situations that require an expedited compilation of the evidence (2). Rapid reviews are usually conducted for a specific requestor and are typically understood to take six months or less to complete. Rapid reviews may differ from systematic reviews in a number of ways, such as in the number of abbreviated methods or shortcuts employed and in the breadth or scope of the question(s) posed (2–4). Some rapid reviews involve accelerated data mining processes and targeted screening of studies.
-
Environmental Scan and Evaluation of Best Practices for Online Systematic Review Resources
ORIGINAL INVESTIGATION. DOI: dx.doi.org/10.5195/jmla.2018.241
Environmental scan and evaluation of best practices for online systematic review resources
Robin M. N. Parker, MLIS; Leah Boulos; Sarah Visintini; Krista Ritchie; Jill Hayden. See end of article for authors' affiliations.

Objective: Online training for systematic review methodology is an attractive option due to flexibility and the limited availability of in-person instruction. Librarians often direct new reviewers to these online resources, so they should be knowledgeable about the variety of available resources. The objective of this project was to conduct an environmental scan of online systematic review training resources and evaluate the identified resources.

Methods: The authors systematically searched for electronic learning resources pertaining to systematic review methods. After screening for inclusion, we collected data about the characteristics of training resources and assigned scores in the domains of (1) content, (2) design, (3) interactivity, and (4) usability by applying a previously published evaluation rubric for online instruction modules. We described the characteristics and scores for each training resource and compared performance across the domains.

Results: Twenty training resources were evaluated. The average overall score of online instructional resources was 61%. Online courses (n=7) averaged 73%, web modules (n=5) 64%, and videos (n=8) 48%. The top five highest-scoring resources were in course or web module format, featured high interactivity, and required a longer (>5 hours) time commitment from users.

Conclusion: This study revealed that resources include appropriate content but are less likely to adhere to principles of online training design and interactivity. Awareness of these resources will allow librarians to make informed recommendations for training based on patrons' needs.
-
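The per-format averages quoted above are consistent with the reported overall score. A quick check (using the counts and rounded percentages given in the abstract, so the agreement is necessarily approximate):

```python
# (count, average score %) per resource format, as reported in the abstract
formats = {"online courses": (7, 73), "web modules": (5, 64), "videos": (8, 48)}

total = sum(n for n, _ in formats.values())                 # 20 resources
weighted = sum(n * score for n, score in formats.values())  # 1215
print(f"overall ≈ {weighted / total:.1f}%")  # -> overall ≈ 60.8%, i.e. ~61%
```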
C2 MPB Economics Methods 21-04-2008
The Campbell Collaboration Economics Methods Policy Brief
Prepared for the Campbell Collaboration Steering Committee by the co-convenors of the Campbell & Cochrane Economics Methods Group (CCEMG): Ian Shemilt (University of East Anglia, UK), Miranda Mugford (University of East Anglia, UK), Sarah Byford (King's College London, UK), Mike Drummond (University of York, UK), Eric Eisenstein (Duke University Medical Center, USA), Martin Knapp (London School of Economics and Political Science, UK), Jacqueline Mallender (The Matrix Knowledge Group, UK), Kevin Marsh (The Matrix Knowledge Group, UK), David McDaid (London School of Economics and Political Science, UK), Luke Vale (University of Aberdeen, UK), Damian Walker (Johns Hopkins Bloomberg School of Public Health, USA).
Correspondence to: Ian Shemilt, Research Coordinator, Campbell & Cochrane Economics Methods Group, Elizabeth Fry Building, University of East Anglia, Norwich NR4 7TJ, United Kingdom. Tel: +44 (0)1603 591086; e-mail: [email protected]; web: www.c-cemg.org
C2 Methods Policy Brief: Economics Methods, Version 1.0 - April 2008
Contents: Executive Summary; Preface; Process; Introduction; Key Issues; References; Acknowledgements.

Executive Summary: The overall aim of this Brief is to provide an initial source of guidance for authors of Campbell Collaboration Reviews on key issues concerning the use of economics methods. The core objective of The Campbell Collaboration (C2) is the preparation, maintenance and dissemination of systematic reviews in order to help people make well-informed decisions about the effects of criminal justice, education and social welfare interventions. In the face of scarce resources, decision-makers often need to consider not only whether an intervention works, but also whether its adoption will lead to a more efficient use of resources.
-
Mass Production of Systematic Reviews and Meta-Analyses: an Exercise in Mega-Silliness?
Commentary
Mass Production of Systematic Reviews and Meta-analyses: An Exercise in Mega-silliness?
Matthew J. Page (School of Public Health and Preventive Medicine, Monash University; School of Social and Community Medicine, University of Bristol) and David Moher (Centre for Practice Changing Research, Ottawa Hospital Research Institute; School of Epidemiology, Public Health and Preventive Medicine, Faculty of Medicine, University of Ottawa)

In 1978, the distinguished professor of psychology Hans Eysenck delivered a scathing critique of what was then a new method, that of meta-analysis, which he described as "an exercise in mega-silliness." A provocative article by John Ioannidis in this issue of the journal1 suggests that "mega-silliness" may be an appropriate characterization of what the meta-analysis literature has become. With surveys of the PubMed database and other empirical evaluations, Ioannidis paints a disturbing picture of the current state of affairs, in which researchers are producing, in epidemic proportions, systematic reviews and meta-analyses that are redundant, misleading, or serving vested interests. Ioannidis presents an astounding case of 21 different meta-analyses of statins for atrial fibrillation in cardiac surgery published within a period of 7 years, with some of these having practically identical results.1 Moreover, his findings are in line with our recent cross-sectional study of systematic reviews of biomedical research.2 We identified 682 systematic reviews indexed in MEDLINE in a single month (February 2014), which is equivalent to 22 reviews published per day. The majority of reviews did not consider study risk of bias or other reporting biases when drawing conclusions.
-
Sean Grant – Curriculum Vitae
SEAN PATRICK GRANT
Assistant Professor in Social & Behavioral Sciences, Richard M. Fairbanks School of Public Health, Indiana University

Contact Details: 1050 Wishard Blvd., Room 6046, Indianapolis, IN 46202, United States of America. Phone: (+1)-317-274-3245. Email: [email protected]. Website: https://fsph.iupui.edu/about/directory/grant-sean.html

EDUCATION
2011-14: DPhil (Ph.D.) in Social Intervention, University of Oxford (United Kingdom). Funding: Clarendon – Green Templeton College Annual Fund Scholarship. Thesis: Development of a CONSORT extension for social and psychological intervention trials. Supervisor: Professor Paul Montgomery. Examiners: Professor Robert West (University College London) and Professor Lucy Bowes (University of Oxford). Result: No corrections (highest possible mark).
2010-11: MSc (M.S.) in Evidence-Based Social Intervention, University of Oxford (United Kingdom). Thesis: Assessing the reporting quality of psychosocial intervention trials: A pilot study. Supervisor: Professor Paul Montgomery. Result: Distinction for thesis (highest mark in cohort) and overall degree.
2006-10: Honors B.A. in Psychology (with a minor in Economics), Loyola Marymount University (United States). Funding: Trustee Scholarship. Thesis: Event-level drinking among college students. Supervisor: Professor Joseph LaBrie. Result: Summa cum laude.
2006-10: Honors B.A. in Philosophy, Loyola Marymount University (United States). Funding: Trustee Scholarship. Thesis: More than a feeling: The significance of emotion in attaining the …
-
“A Formidable Share of the Research We Pay for Is Simply Wrong.” by JON KÅRE TIME, Appeared in MORGENBLADET, No
By JON KÅRE TIME, appeared in MORGENBLADET, No. 22/8–14 June 2018. Translated from the Norwegian by Heidi Sommerstad Ødegård, AMESTO/Semantix Translations Norway AS.

Torture of data, perverse reward systems, declining morals and false findings: science is in crisis, argues statistician Andrea Saltelli.

"There's been a lot of data torturing with this cool dataset," wrote Cornell University food scientist Brian Wansink in a message to a colleague. Wansink was famous, and his research was popular. It showed, among other things, how to lose weight without much effort: all anyone needed to do was make small changes to their habits and environment. We eat less, for example, if we put the cereal box in the cabinet instead of leaving it out on the counter. But after an in-depth investigation by Buzzfeed earlier this year, Wansink has fallen from grace: from social-science star to a textbook example of what respectable scientists do not do.

"Data torturing" is also referred to as "p-hacking" – the slicing and dicing of a dataset until you find a statistical relationship that is strong and exciting enough to publish. As Wansink himself formulated it in one of the uncovered emails, you can "tweak" the dataset to find what you want. "Think about all the different ways you can cut the data," he wrote to a new grad student about to join his lab.

It's a crisis!

"Is science in crisis? Well, it was perhaps debatable a few years ago, but now?" Andrea Saltelli seems almost ready to throw up his hands in despair at the question.
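The mechanics behind "think about all the different ways you can cut the data" are easy to demonstrate. The sketch below (illustrative only; it is not Wansink's data or methods) simulates studies of pure noise, slices each dataset twenty arbitrary ways, and keeps the best-looking comparison; chance alone then delivers "significant" findings in well over half of the studies, far above the nominal 5% rate of a single pre-registered test:

```python
import math
import random
import statistics

random.seed(42)

def two_sided_p(group_a, group_b):
    """Two-sided p-value for a difference in means (normal approximation)."""
    se = math.sqrt(statistics.pvariance(group_a) / len(group_a)
                   + statistics.pvariance(group_b) / len(group_b))
    z = abs(statistics.mean(group_a) - statistics.mean(group_b)) / se
    return 2.0 * (1.0 - 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def torture(data, n_cuts):
    """Slice the same null data n_cuts arbitrary ways; keep the smallest p."""
    best_p = 1.0
    for _ in range(n_cuts):
        labels = [random.random() < 0.5 for _ in data]  # an arbitrary subgroup split
        group_a = [y for y, in_a in zip(data, labels) if in_a]
        group_b = [y for y, in_a in zip(data, labels) if not in_a]
        best_p = min(best_p, two_sided_p(group_a, group_b))
    return best_p

n_studies, n_cuts = 500, 20
hits = sum(
    torture([random.gauss(0, 1) for _ in range(100)], n_cuts) < 0.05
    for _ in range(n_studies)
)
print(f"'significant' results in {hits / n_studies:.0%} of pure-noise studies")
```

With one test per study the false-positive share stays near the nominal 5%; with twenty roughly independent cuts per study it climbs toward 1 - 0.95^20, about 64%, which is why slicing until something "works" so reliably produces publishable-looking noise.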