BMJ — Research
Manuscript ID: BMJ-2019-053055.R1
Date submitted: 10-Feb-2020

Academic criteria for promotion and tenure in biomedical sciences faculties: a cross-sectional analysis of an international sample of universities

Danielle B Rice,1,2 Hana Raffoul,2,3 John PA Ioannidis,4,5,6,7 David Moher8,9

1Department of Psychology, McGill University, Montreal, Quebec, Canada; 2Ottawa Hospital Research Institute, Ontario, Canada; 3Faculty of Engineering, University of Waterloo, Waterloo, Ontario, Canada; 4Departments of Medicine, 5Health Research and Policy, 6Biomedical Data Science, and 7Statistics, and Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California, USA; 8Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ontario, Canada; 9School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada

Keywords: Promotion, Tenure, Faculty of Medicine, Incentives

Corresponding author: David Moher, [email protected]

ORCIDs: Danielle Rice 0000-0001-5615-7005; David Moher 0000-0003-2434-4206; John Ioannidis 0000-0003-3118-6859

Affiliations:
Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ontario, Canada, K1H 8L6: Danielle B Rice (doctoral student), Hana Raffoul (undergraduate student), David Moher (director)
Meta-Research Innovation Center at Stanford (METRICS), Stanford University, Stanford, California, USA, 94305: John PA Ioannidis (co-director)

ABSTRACT

Objectives: To determine the presence of a set of pre-specified traditional and non-traditional criteria used to assess scientists for promotion and tenure in faculties of biomedical sciences among universities worldwide.

Design: Cross-sectional study.

Setting: Not applicable.

Participants: 170 randomly selected universities from the Leiden Ranking of world universities.

Main outcome measures: Two independent reviewers searched for all guidelines applied when assessing scientists for promotion and tenure at institutions with biomedical faculties. Where faculty-level guidelines were not available, institution-level guidelines were sought. Available documents were reviewed, and the presence of five traditional (e.g., number of publications) and seven non-traditional (e.g., data sharing) criteria was noted in guidelines for assessing assistant professors, associate professors, professors, and the granting of tenure.

Results: A total of 146 institutions had faculties of biomedical sciences, 92 of which had eligible guidelines available to review. Among traditional criteria, peer-reviewed publications, authorship order, journal impact, grant funding, and national or international reputation were mentioned in 95%, 37%, 28%, 67%, and 48% of guidelines, respectively. Among non-traditional criteria, only citations (any mention, 26%) and accommodations for employment leave (37%) were mentioned relatively often; alternative metrics for sharing research (2%) and data sharing (1%) were rarely mentioned, and three criteria (publishing in open access mediums, registering research, and adhering to reporting guidelines) were not found in any institution reviewed. Traditional criteria were more commonly reported than non-traditional criteria (p=0.001). We observed notable differences across continents in whether guidelines were accessible (Australia 100%, North America 97%, Europe 50%, Asia 58%, South America 17%) and more subtle differences in the use of specific criteria.

Conclusions: This study demonstrates that the current evaluation of scientists emphasizes traditional criteria as opposed to non-traditional criteria.
This may reinforce research practices that are known to be problematic while insufficiently supporting the conduct of better-quality research and open science. Institutions should consider incentivizing non-traditional criteria.

Registration: Open Science Framework (https://osf.io/26ucp/)

What is already known on this topic:
• Academics tailor their research practices to the evaluation criteria applied within their academic institution.
• Given the clinical implications of biomedical research, ensuring that researchers are incentivized to adhere to best-practice guidelines is essential.
• While changes to the criteria used to assess professors and confer tenure have been recommended, a systematic assessment of the promotion and tenure criteria applied worldwide has not been conducted.

What this study adds:
• Across countries, university guidelines focus on rewarding traditional research criteria (peer-reviewed publications, authorship order, journal impact, grant funding, and national or international reputation).
• The minimum written requirements for promotion and tenure are predominantly objective in nature, although several are inadequate measures of a researcher's impact.
• Developing and evaluating more appropriate, non-traditional indicators of research may facilitate changes in evaluation practices for rewarding researchers.

INTRODUCTION

There are important deficiencies in the quality and transparency of research conducted across disciplines.1 2 Numerous efforts have been made to combat these inadequacies, for example by developing reporting guidelines (e.g., the CONSORT and PRISMA Statements), registering studies prior to data collection (e.g., clinicaltrials.gov), and promoting data sharing practices.3 4 Despite these strategies, poorly conducted and inadequately reported research remains highly prevalent.5 This has important consequences, especially in medicine, as research is heavily relied upon to inform clinical decision-making.

Institutions can influence large-scale improvements among researchers, as universities hire new faculty and promote and tenure existing faculty. Universities can provide incentives and rewards (e.g., promotions) for scholarly work that is conducted appropriately, reported transparently, and adheres to best publication practices. A recent survey conducted in the UK found that academics tailor their publication practices to align with their institutional evaluation criteria.6 These criteria, however, may include metrics that are known to be problematic for assessing researchers.7 Current incentives and rewards may also be misaligned with the needs of society.
Reward systems within universities typically include criteria in promotion and tenure documents such as the quantity of publications and novelty of findings rather than the reliability, accuracy, reproducibility, and transparent reporting of findings.8 Inappropriate criteria being applied for career advancement can inadvertently contribute to research waste,9 with billions of dollars invested in non-usable research.10 For example, universities that emphasize the quantity of published papers can increase undeserved authorship, salami slicing, and publication in very low-quality journals (e.g., predatory journals) without peer review, and contribute to problems of reproducibility.
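The tallying described in the abstract's Main outcome measures — reducing each institution's guideline document to the set of pre-specified criteria it mentions, then computing per-criterion prevalence — can be sketched as a short script. This is a minimal illustration only: the criterion labels and the three toy guideline records below are hypothetical placeholders, not the study's actual coding scheme or data.

```python
from collections import Counter

# Hypothetical criterion labels (5 "traditional", 7 "non-traditional"),
# loosely mirroring the categories named in the abstract.
TRADITIONAL = {"publications", "authorship_order", "journal_impact",
               "grant_funding", "reputation"}
NON_TRADITIONAL = {"citations", "leave_accommodation", "altmetrics",
                   "data_sharing", "open_access", "registration",
                   "reporting_guidelines"}

# Toy data: one set of mentioned criteria per reviewed guideline document.
guidelines = [
    {"publications", "grant_funding", "citations"},
    {"publications", "authorship_order", "reputation"},
    {"publications", "journal_impact", "grant_funding", "leave_accommodation"},
]

# Count how many documents mention each criterion; Counter returns 0
# for criteria never mentioned, so unseen criteria get prevalence 0.
counts = Counter(c for g in guidelines for c in g)
n = len(guidelines)
prevalence = {c: counts[c] / n for c in TRADITIONAL | NON_TRADITIONAL}

print(f"publications mentioned in {prevalence['publications']:.0%} of guidelines")
```

In this toy sample every document mentions publications while none mentions data sharing, echoing (in exaggerated form) the asymmetry the study reports between traditional and non-traditional criteria.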