The Hong Kong Principles for Assessing Researchers: Fostering Research Integrity

David Moher1,2, Lex Bouter3,4, Sabine Kleinert5, Paul Glasziou6, Mai Har Sham7, Virginia Barbour8, Anne-Marie Coriat9, Nicole Foeger10, Ulrich Dirnagl11

1Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute; 2School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada; 3Department of Epidemiology and Biostatistics, Amsterdam University Medical Centers, location VUmc; 4Department of Philosophy, Faculty of Humanities, Vrije Universiteit, Amsterdam, The Netherlands; 5The Lancet, London Wall Office, London, UK; 6Institute for Evidence-Based Healthcare, Bond University, Gold Coast, Qld, Australia; 7School of Biomedical Sciences, LKS Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; 8Queensland University of Technology (QUT), Brisbane, Australia; 9Wellcome Trust, London; 10Austrian Agency for Research Integrity, Vienna, Austria; 11Berlin Institute of Health, QUEST Center for Transforming Biomedical Research, Berlin, Germany

David Moher: ORCID 0000-0003-2434-4206
Lex Bouter: ORCID 0000-0002-2659-5482
Sabine Kleinert: ORCID 0000-0001-7826-1188
Paul Glasziou: ORCID 0000-0001-7564-073X
Mai Har Sham: ORCID 0000-0003-1179-7839
Virginia Barbour: ORCID 0000-0002-2358-2440
Anne-Marie Coriat: ORCID 0000-0003-2632-1745
Nicole Foeger: ORCID 0000-0001-7775-7325
Ulrich Dirnagl: ORCID 0000-0003-0755-6119

23rd November 2019

Abstract

The primary goal of research is to advance knowledge. For that knowledge to benefit research and society, it must be trustworthy. Trustworthy research is robust, rigorous and transparent at all stages of design, execution and reporting. Initiatives such as the San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto have led the way in bringing much-needed global attention to the importance of taking a considered, transparent and broad approach to assessing research quality. Since publication in 2012, the DORA principles have been signed by over 1,500 organizations and nearly 15,000 individuals. Despite this significant progress, assessment of researchers still rarely includes considerations related to trustworthiness, rigor and transparency. We have developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity with a specific focus on the need to drive research improvement by ensuring that researchers are explicitly recognized and rewarded (i.e., their careers are advanced) for behavior that leads to trustworthy research. The HKPs have been developed with the idea that their implementation could assist in how researchers are assessed for career advancement, with a view to strengthening research integrity. We present five principles: responsible research practices; transparent reporting; open science (open research); valuing a diversity of types of research; and recognizing all contributions to research and scholarly activity. For each principle we provide a rationale for its inclusion and give examples where these principles are already being adopted.

Introduction

In a quest to advance knowledge, researchers publish approximately 1.5 million journal articles each year.
The presumption is that this literature can be used by other researchers, stakeholders, and the wider society because it is trusted, robust, rigorous and complete.

The approach taken to validating research and its outcomes differs depending on the nature of the research. For example, to rigorously examine the effects of a health intervention, trial participants (human or animal) are typically required to be randomized to the intervention being studied. Many researchers advocate registration of protocols as a way to ensure transparency and to enable others to engage with their research. Subsequently, the use of reporting guidelines can help ensure complete and transparent reporting of the researchers’ methods and results. When the research is disseminated, the research team would ensure that the associated data, materials and any analytical code are made available as an integral part of publication. Such data sharing facilitates re-analysis of the data to check reproducibility and to perform secondary analyses.

Although some mechanisms exist to support researchers in ensuring transparency at all stages of design, execution and reporting, these practices have not been widely adopted. There are many interwoven reasons for this. One contributing factor, we argue, is that little emphasis is placed on the rigor of research when hiring, reviewing and promoting researchers. Working together across the research sector as a whole to address this systemic issue, we believe, offers a global opportunity to improve research and its impact.

We developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity (WCRI) specifically to drive greater recognition for researchers who commit to robust, rigorous and transparent practices (i.e., their careers are advanced) (see Figure). If implemented, the HKPs could play a critical role in evidence-based assessments of researchers, put research rigor at the heart of assessment, and open up research to the wider benefit of society.

We propose five principles, each with a rationale for its inclusion. We illustrate these principles with examples where we know they exist. These examples are not exhaustive, and many are relevant to more than one principle. Together, they illustrate a breadth of approaches through which these principles can operate at the very highest levels of international research.

Early drafts of the HKPs were circulated to the 700 participants registered for the 6th WCRI. Further discussions took place during two sessions at the 6th WCRI. A penultimate version was uploaded to the 6th WCRI website after the conference. More than 100 people provided input and feedback. We acknowledge all of these valuable contributions and the global leadership of those working on the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and other initiatives to promote the responsible use of metrics, which have laid the foundations for much of our work (1-4).

Principles

Principle 1: Assess researchers on responsible practices from conception to delivery, including the development of the research idea, research design, methodology, execution and effective dissemination.
Rationale

The numbers of publications, citations, and total volume of grants are often still the dominant metrics used by research institutions for assessing and rewarding their researchers (1-4). Providing bonuses to academics for publishing in certain journals (i.e., merit pay) is also common in many parts of the world (5-7). These assessment criteria tell assessors little about the researchers and the rigor of their work; thus they are not particularly ‘responsible’ metrics. They can also be unduly influenced by field and citation practices, and provide little information about a publication’s (and therefore a researcher’s) contributions to research and society. Other criteria are required to provide a broader view of markers of best practices: for example, the extent to which a researcher develops research questions with the involvement of appropriate members of the public (see Figure).

Current implementation

The Canadian Institutes of Health Research’s Strategy for Patient-Oriented Research (SPOR) is a multi-million-dollar initiative to bring patients into a broad range of research activities across Canadian provinces and territories (8). Patients are now active in the development of research projects, in setting priorities and in formulating study questions. The Ontario response (Ontario SUPPORT Unit) has included a series of articles with patients taking a leadership role in co-authoring the content (9). In the UK, the James Lind Alliance, funded by the UK National Institute for Health Research (NIHR), is a successful example of including patients, carers and clinicians to develop priority-setting partnerships (10) and question formulation (11).

With a focus on enhancing reproducibility, the US National Institutes of Health (NIH) have revised their application instructions and review criteria to strengthen scientific rigor and transparency (12). One of the resources they recommend is the Experimental Design Assistant (EDA), developed by the National Centre for the Replacement, Refinement & Reduction of Animals in Research (NC3Rs). This 10-module online tool was developed to assist researchers in the design and analysis of animal experiments. It includes dedicated support for randomization, blinding and sample size calculation. It can also be used to help researchers prepare the experimental design information and analysis plan requested for grant applications (13). The NC3Rs encourages use of the EDA so that applicants can generate an EDA report, which can be submitted in place of the experimental design and methodology appendix.

Other examples of preferred criteria include social media metrics as indicators of research dissemination (14), public lectures about the results of a research project, public engagement, and other types of events that bring together funders, researchers and other stakeholders to work on an effective communication plan for the research program (15). Organizations such as the Wellcome Trust are taking a holistic attitude to redefining their approach to engagement, explicitly to help people feel empowered to access, use, respond to and create health research (16).

Principle 2: Value the accurate and transparent reporting of all research, regardless of the results.

Rationale

Failure to publish all findings of all studies seriously distorts the evidence base for decision making.