Guidance for Reporting Outcomes in Clinical Trials: Scoping Review Protocol


Open access | Protocol

Nancy J Butcher,1 Emma J Mew,1 Leena Saeed,1 Andrea Monsour,1 Alyssandra Chee-a-tow,1 An-Wen Chan,2 David Moher,3 Martin Offringa1

To cite: Butcher NJ, Mew EJ, Saeed L, et al. Guidance for reporting outcomes in clinical trials: scoping review protocol. BMJ Open 2019;9:e023001. doi:10.1136/bmjopen-2018-023001. First published 5 February 2019. Received 16 March 2018; Revised 11 October 2018; Accepted 19 December 2018. Prepublication history and additional material for this paper are available online at http://dx.doi.org/10.1136/bmjopen-2018-023001. © Author(s) (or their employer(s)) 2019. Re-use permitted under CC BY-NC. No commercial re-use. See rights and permissions. Published by BMJ.

Affiliations: 1Child Health Evaluative Sciences, The Hospital for Sick Children Research Institute, Toronto, Ontario, Canada; 2Department of Medicine, Women's College Research Institute, Women's College Hospital, University of Toronto, Toronto, Ontario, Canada; 3Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Ontario, Canada.

Correspondence to Dr Nancy J Butcher; nancy.butcher@sickkids.ca

ABSTRACT

Introduction: Patients, families and clinicians rely on published research to help inform treatment decisions. Without complete reporting of the outcomes studied, evidence-based clinical and policy decisions are limited and researchers cannot synthesise, replicate or build on existing research findings. To facilitate harmonised reporting of outcomes in published trial protocols and reports, the Instrument for reporting Planned Endpoints in Clinical Trials (InsPECT) is under development. As one of the initial steps in the development of InsPECT, a scoping review will identify and synthesise existing guidance on the reporting of trial outcomes.

Methods and analysis: We will apply methods based on the Joanna Briggs Institute scoping review methods manual. Documents that provide explicit guidance on trial outcome reporting will be searched for using: (1) an electronic bibliographic database search; (2) a grey literature search; and (3) solicitation of colleagues for guidance documents using a snowballing approach. Reference list screening will be performed for included documents. Search results will be divided between two trained reviewers who will complete title and abstract screening, full-text screening and data charting. Captured trial outcome reporting guidance will be compared with candidate InsPECT items to support, refute or refine InsPECT content and to assess the need for the development of additional items. Data analysis will explore common features of guidance and use quantitative measures (eg, frequencies) to characterise guidance and its sources.

Ethics and dissemination: A paper describing the review findings will be published in a peer-reviewed journal. The results will be used to inform the InsPECT development process, helping to ensure that InsPECT provides an evidence-based tool for standardising trial outcome reporting.

Strengths and limitations of this study
► We will employ a comprehensive search strategy to identify guidance on trial outcome reporting published in the last 10 years, including electronic databases, grey literature sources, expert colleagues and reference list screening.
► Our methods are based on the methodologically rigorous Joanna Briggs Institute scoping review methods manual.
► To increase feasibility, two trained reviewers will perform screening and data charting, following training procedures and reliability assessments.
► As this is a scoping review of trial reporting guidance, quality of the evidence and risk of bias will not be systematically assessed.

INTRODUCTION

Inadequate reporting remains a major challenge in biomedical research.1 The quality of published research affects the ability of researchers and stakeholders to evaluate, replicate and build on study findings, which in turn impacts evidence-based clinical and policy decision making. Poor reporting of clinical trials, which provide the gold standard of evidence in research, has been well documented.2 3 In an effort to improve the quality of research reporting and reduce research waste, reporting guidelines have been developed to guide the preparation and publication of research studies.4 These include reporting guidelines for clinical trial reports (Consolidated Standards of Reporting Trials (CONSORT))5 and protocols (Standard Protocol Items: Recommendations for Interventional Trials (SPIRIT)).6

Although CONSORT and SPIRIT provide general guidance on how to report outcomes,5 6 more detail is needed to adequately describe trial outcomes.7–9 Key information about how trial outcomes were selected, defined, measured and analysed is often missing or poorly reported.2 3 7 10–14 Poor outcome reporting has been documented across a diversity of disciplines and populations.10–12 Notably, 40%–60% of trials have been found to have at least one primary outcome that was changed, introduced or omitted between protocol and publication.7 Without clear and complete reporting of trial outcomes, researchers cannot adequately appraise, consolidate or replicate findings, hindering the translation of evidence into clinical practice and policy.

Despite recommendations that a change in the reporting of trial outcomes is needed,10 15–17 there is currently no dedicated guidance that authors can follow when describing trial outcomes in published protocols and reports. To address this issue, an international group of experts and knowledge users are developing the Instrument for reporting Planned Endpoints in Clinical Trials (InsPECT)18 to standardise and harmonise trial outcome reporting.

To date, this team has developed an initial list of candidate InsPECT items through an environmental scan of academic and regulatory publications16 and through consultations with methodologists and knowledge users (including clinicians, guideline developers and trialists). The candidate InsPECT items can be found on the Open Science Framework at https://osf.io/arwy8/. Draft versions will be iteratively updated as the project progresses. The final product will be unique InsPECT extensions for the SPIRIT and CONSORT reporting guidelines.

No review of current guidance on how to adequately report outcomes in clinical trial protocols and final trial reports exists. The purpose of this scoping review is to identify and synthesise existing guidance for outcome reporting in clinical trials and protocols to inform InsPECT development. The results of this scoping review will be presented during the InsPECT Delphi process and consensus meeting. The specific research questions are:
1. What published guidance exists on the reporting of outcomes for clinical trial protocols and reports?
2. Does the identified guidance support or refute each candidate InsPECT item as a reporting item for clinical trial protocols and/or reports?
3. Does any identified guidance support the creation of additional candidate InsPECT items or the refinement of existing items?

METHODS AND ANALYSIS

Study design
A scoping review was considered to be the most suitable approach for addressing the broad aim of this study. A scoping review is a type of knowledge synthesis approach …

Eligibility criteria
The inclusion criteria are based on the Population, Concept and Context framework, as recommended by The Joanna Briggs Institute for scoping reviews as a less restrictive alternative to the Population, Intervention, Comparator, Outcome framework.

Population
Eligible documents will include those with content relevant to clinical trials performed in any human population, including any age, sex or health condition.

Concept
Eligible documents will include those that provide guidance (advice or formal recommendation) and/or a checklist describing outcome-specific information that should be included in clinical trial reports or protocols. Reporting guidance relevant to any type of outcome (eg, primary or secondary outcomes; biomarker/surrogate outcomes or clinical outcomes such as clinician-reported outcomes; harms) will be eligible. As there is considerable heterogeneity in the terms used in the published literature to describe outcomes,10 guidance that uses synonyms for outcomes will also be included, as appropriate (eg, endpoint, outcome measure, efficacy variable). To increase study feasibility and reliability, this review will only include explicit guidance ("stated clearly and in detail, leaving no room for confusion or doubt"26), such that the guidance must specifically state that the information should be included in a clinical trial protocol or report.27 An example of a statement of included guidance follows from the CONSORT patient-reported outcomes (PRO) extension: "Evidence of patient-reported outcome instrument validity and reliability should be provided or cited, if available".27

Context
Sources from any country or setting will be considered. Dates will be restricted to the last 10 years, as InsPECT will form extensions to CONSORT (published in 2010) and SPIRIT (published in 2013).
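The screening plan above relies on two trained reviewers whose consistency is checked through training procedures and reliability assessments. The protocol does not name a statistic, but agreement on a pilot screening sample is commonly summarised with Cohen's kappa; the following is a minimal sketch with hypothetical screening decisions, not the protocol's actual procedure:

```python
# Illustrative sketch only: Cohen's kappa for two reviewers' pilot
# title/abstract screening decisions. Data below are hypothetical.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' include/exclude decisions."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of records with identical decisions.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independence: sum over categories of the
    # product of each rater's marginal proportion for that category.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(counts_a) | set(counts_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical pilot screen of 10 records ("I" = include, "E" = exclude).
reviewer_1 = ["I", "I", "E", "E", "I", "E", "I", "E", "E", "I"]
reviewer_2 = ["I", "E", "E", "E", "I", "E", "I", "E", "I", "I"]
print(f"kappa = {cohens_kappa(reviewer_1, reviewer_2):.2f}")  # kappa = 0.60
```

A pre-specified kappa threshold on such a pilot round is one common way to decide when independent screening can proceed.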
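The charting and comparison step, in which captured guidance is matched against candidate InsPECT items to support, refute or refine them, can be pictured as a mapping from guidance statements to items. A sketch under assumed names (the item IDs, record fields and verdict labels are illustrative, not the protocol's actual charting form; the first quoted statement is the CONSORT-PRO example cited above):

```python
# Illustrative sketch only: a hypothetical data model for charting how
# each captured guidance statement bears on a candidate InsPECT item.
from collections import Counter
from dataclasses import dataclass

@dataclass
class ChartedGuidance:
    source: str         # document the guidance was captured from
    statement: str      # the explicit guidance statement
    inspect_item: str   # candidate item it maps to (hypothetical ID)
    verdict: str        # "support", "refute", or "refine"

records = [
    ChartedGuidance("CONSORT-PRO extension",
                    "Evidence of patient-reported outcome instrument "
                    "validity and reliability should be provided or "
                    "cited, if available", "InsPECT-04", "support"),
    ChartedGuidance("Hypothetical regulatory guidance",
                    "State the analysis metric for each outcome",
                    "InsPECT-07", "refine"),
]

# Tally verdicts per candidate item, e.g. to flag items needing
# discussion at the Delphi process and consensus meeting.
tally = Counter((r.inspect_item, r.verdict) for r in records)
for (item, verdict), n in sorted(tally.items()):
    print(f"{item}: {verdict} x{n}")
```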
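For the planned data analysis, the abstract mentions quantitative measures such as frequencies to characterise guidance and its sources. A small sketch of what that tabulation could look like; the column names and charted categories are assumptions for illustration:

```python
# Illustrative sketch only: frequency counts characterising captured
# guidance documents. The charting layout below is hypothetical.
import pandas as pd

# Hypothetical charted records: one row per guidance document.
charted = pd.DataFrame({
    "source_type":  ["journal article", "grey literature", "journal article",
                     "regulatory document", "grey literature"],
    "outcome_type": ["primary", "harms", "surrogate", "primary", "PRO"],
    "applies_to":   ["report", "protocol", "both", "report", "both"],
})

# Frequency and percentage of guidance documents per characteristic.
for column in ["source_type", "outcome_type", "applies_to"]:
    counts = charted[column].value_counts()
    summary = pd.DataFrame({
        "n": counts,
        "%": (100 * counts / len(charted)).round(1),
    })
    print(f"\n{column}\n{summary}")
```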
Recommended publications
  • Stephen P. Lock on "Journalology"
    Institute for Scientific Information, 3501 Market St, Philadelphia PA 19104. Stephen P. Lock on "Journalology". Number 3, January 15, 1990. I don't remember exactly how long I have known Stephen P. Lock. Our first encounter was probably at a meeting of the Council of Biology Editors (CBE) some 20 years ago. It is somewhat unsettling to realize that so much time has gone by. Equally disquieting is the notion that Steve will soon retire as editor of the venerable British Medical Journal (BMJ). But in the near future Steve will indeed step down, having helped train his successor, due to be appointed in the fall of 1990. It seems odd to refer to Lock as an elder statesman of biomedical editing, however accurate that description might be. The word that comes to mind when I think of Steve is youth. This youthfulness is reflected in his writing and in his approach to problems. I am amused that Steve opened his talk with the traditional, mythical reminder about estimates of the number of extant journals.
    Lock: Editor and Scholar. Stephen P. Lock was born in 1929 and received his education at Queen's College, Cambridge University, UK, and the Medical College of St. Bartholomew's Hospital. Trained as a hematologist, he served on the staffs of various London teaching hospitals, including St. Bartholomew's and the Hospital for Sick Children, before being appointed assistant editor of BMJ in 1964. After serving as senior assistant editor and deputy editor, Lock became editor in 1975.
  • Assessing the Utility of an Institutional Publications Officer: a Pilot Assessment
    Assessing the utility of an institutional publications officer: a pilot assessment Kelly D. Cobey1,2,3, James Galipeau1, Larissa Shamseer1,2 and David Moher1,2 1 Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada 2 School of Epidemiology, Public Health, and Preventative Medicine, University of Ottawa, Ottawa, Canada 3 School of Natural Sciences, Department of Psychology, University of Stirling, Stirling, United Kingdom ABSTRACT Background. The scholarly publication landscape is changing rapidly. We investigated whether the introduction of an institutional publications officer might help facilitate better knowledge of publication topics and related resources, and effectively support researchers to publish. Methods. In September 2015, a purpose-built survey about researchers' knowledge and perceptions of publication practices was administered at five Ottawa area research institutions. Subsequently, we publicly announced a newly hired publications officer (KDC) who then began conducting outreach at two of the institutions. Specifically, the publications officer gave presentations, held one-to-one consultations, developed electronic newsletter content, and generated and maintained a webpage of resources. In March 2016, we re-surveyed our participants regarding their knowledge and perceptions of publishing. Mean scores to the perception questions, and the percent of correct responses to the knowledge questions, pre and post survey, were computed for each item. The difference between these means or calculated percentages was then examined across the survey measures. Results. 82 participants completed both surveys. Of this group, 29 indicated that they had exposure to the publications officer, while the remaining 53 indicated they did not. Interaction with the publications officer led to improvements in half of the knowledge items (7/14 variables).
  • Core Competencies for Scientific Editors of Biomedical Journals
    Moher et al. BMC Medicine (2017) 15:167, DOI 10.1186/s12916-017-0927-0. CORRESPONDENCE, Open Access. Core competencies for scientific editors of biomedical journals: consensus statement. David Moher, James Galipeau, Sabina Alam, Virginia Barbour, Kidist Bartolomeos, Patricia Baskin, Sally Bell-Syer, Kelly D. Cobey, Leighton Chan, Jocalyn Clark, Jonathan Deeks, Annette Flanagin, Paul Garner, Anne-Marie Glenny, Trish Groves, Kurinchi Gurusamy, Farrokh Habibzadeh, Stefanie Jewell-Thomas, Diane Kelsall, José Florencio Lapeña Jr, Harriet MacLehose, Ana Marusic, Joanne E. McKenzie, Jay Shah, Larissa Shamseer, Sharon Straus, Peter Tugwell, Elizabeth Wager, Margaret Winker and Getu Zhaori. Abstract. Background: Scientific editors are responsible for deciding which articles to publish in their journals. However, we have not found documentation of their required knowledge, skills, and characteristics, or the existence of any formal core competencies for this role. Methods: We describe the development of a minimum set of core competencies for scientific editors of biomedical journals. Results: The 14 key core competencies are divided into three major areas, and each competency has a list of associated elements or descriptions of more specific knowledge, skills, and characteristics that contribute to its fulfillment. Conclusions: We believe that these core competencies are a baseline of the knowledge, skills, and characteristics needed to perform competently the duties of a scientific editor at a biomedical journal. Keywords: Core competencies, Scientific editor, Biomedical journal, Delphi, Expert consensus, Editor role.
  • A Systematic Examination of Preprint Platforms for Use in the Medical and Biomedical Sciences Setting
    bioRxiv preprint doi: https://doi.org/10.1101/2020.04.27.063578; this version posted April 28, 2020. The copyright holder for this preprint (which was not certified by peer review) is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY 4.0 International license. A systematic examination of preprint platforms for use in the medical and biomedical sciences setting. Jamie J Kirkham1*, Naomi Penfold2, Fiona Murphy3, Isabelle Boutron4, John PA Ioannidis5, Jessica K Polka2, David Moher6,7. 1Centre for Biostatistics, Manchester Academic Health Science Centre, University of Manchester, Manchester, United Kingdom; 2ASAPbio, San Francisco, CA, USA; 3Murphy Mitchell Consulting Ltd; 4Université de Paris, Centre of Research in Epidemiology and Statistics (CRESS), Inserm, Paris, F-75004 France; 5Meta-Research Innovation Center at Stanford (METRICS) and Departments of Medicine, of Epidemiology and Population Health, of Biomedical Data Science, and of Statistics, Stanford University, Stanford, CA, USA; 6Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Ottawa, Canada; 7School of Epidemiology and Public Health, Faculty of Medicine, University of Ottawa, Ottawa, Canada. *Corresponding author: Professor Jamie Kirkham, Centre for Biostatistics, Faculty of Biology, Medicine and Health, The University of Manchester, Jean McFarlane Building, Oxford Road, Manchester, M13 9PL, UK. Email: [email protected]; Tel: +44 (0)161 275 1135.
  • Developing PRISMA-RR, a Reporting Guideline for Rapid Reviews of Primary Studies (Protocol)
    Developing PRISMA-RR, a reporting guideline for rapid reviews of primary studies (Protocol). Adrienne Stevens1,2, Chantelle Garritty1,2, Mona Hersi1, David Moher1,3,4. 1Ottawa Methods Centre, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Canada; 2TRIBE Graduate Program, University of Split School of Medicine, Croatia; 3Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute, Canada; 4School of Epidemiology and Public Health, University of Ottawa, Canada. Funding: The development of PRISMA-RR is supported by a grant from the Canadian Institutes of Health Research (CIHR FRN-142310). The funder had no role in the development of this protocol and will not have a role in the data collection, analyses, interpretation of the data, and publication of the findings. Funding will be sought to develop the Explanation and Elaboration manuscript. February 2018. INTRODUCTION: Systematic reviews are known to be the best evidence upon which to make healthcare decisions, but can take years to complete (1). Rapid reviews have emerged as accelerated and/or abbreviated versions of systematic reviews with certain concessions made in the systematic review process to accommodate decision-making situations that require an expedited compilation of the evidence (2). Rapid reviews are usually conducted for a specific requestor and are typically understood to take 6 months or less to complete. Rapid reviews may differ from systematic reviews in a number of ways, such as the number of abbreviated methods or shortcuts employed and the breadth or scope of the question(s) posed (2–4). Some rapid reviews involve accelerated data mining processes and targeted screening of studies.
  • The Evolution of the Science Citation Index Search Engine to the Web of Science, Scientometric Evaluation and Historiography
    The Evolution of the Science Citation Index Search Engine to the Web of Science, Scientometric Evaluation and Historiography. Eugene Garfield, Chairman Emeritus, Thomson ISI, 3501 Market Street, Philadelphia PA 19104. Fax: 215-387-1266; Tel: 215-243-2205; [email protected]; www.eugenegarfield.org. Presented at the University of Barcelona, January 24, 2007. The Science Citation Index was proposed over fifty years ago to facilitate the dissemination and retrieval of scientific literature. Its unique search engine based on citation searching was not widely adopted until it was made available online in 1972. Its by-product, Journal Citation Reports, became available in 1975, including its rankings by impact factor. Impact factors were not widely adopted until about a decade ago, when they began to be used as surrogates for expected citation frequencies for recently published papers, a highly controversial application of scientometrics in evaluating scientists and institutions. The inventor of the SCI and its companion Social Sciences Citation Index will review its history and discuss their more recent use in graphically visualizing micro-histories of scholarly topics. Using the patented HistCite software for algorithmic historiographic analysis, the genealogy of the Watson-Crick discovery of the double helix structure of DNA and its relationship to the work of Heidelberger, Avery, and others will be discussed. It is now over fifty years since the idea of the Science Citation Index (SCI) was first promulgated in Science magazine in 1955. However, as the older generation of scientists will remember, Current Contents is the information service that proved to be the primordial revolutionary "idea" which made the practical realization of the SCI possible.
  • Where to Publish?
    Where (Not) to Publish? B. Mohan Kumar. Outline: what factors decide choice of journals; journal impact factors; predatory journals.
    WHERE TO REPORT. Choosing a journal: international or local. International journals are more prestigious (than local ones), have wider readerships, and have higher standards and rejection rates; publishing in international journals requires more effort, but is more rewarding in the long run.
    How do you choose a journal for publishing the paper?
    • Whoever will accept my article
    • Has friends on the editorial board
    • Has good artwork
    • Has (does not have) ads
    • Has fast turnaround time
    • Highest Journal Impact Factor (impact factors are used to rank and, therefore, identify the so-called best journal in which to publish one's work)
    • Journal most relevant to my field
    • Journal likely to be read by colleagues
    • Journal with reputation for timely, helpful review
    • Journal affiliated with professional society
    • Other factors: intuition about prestige, past experience
    • Ask a librarian or respected colleague
    Journal Impact Factor: the central dogma of scholarly publishing. Historical aspects:
    • Eugene Garfield, the founder of the Institute for Scientific Information (ISI, now Thomson Scientific) in 1955, envisioned creating a metric to evaluate journal performance.
    • Over the next 20 years, Garfield and the late Irving H. Sher worked to refine the Journal Impact Factor concept.
    • The Science Citation Index appeared in the 1960s, followed in 1975 by the Journal Citation Reports (JCR).
    • The impact factor is calculated each year for those journals which Thomson Reuters indexes, and is published in the JCR.
    • It is a highly influential ranking tool used by participants in all phases of the research publishing cycle: librarians, publishers, editors, authors and information analysts.
  • Potential Predatory and Legitimate Biomedical Journals
    Shamseer et al. BMC Medicine (2017) 15:28, DOI 10.1186/s12916-017-0785-9. RESEARCH ARTICLE, Open Access. Potential predatory and legitimate biomedical journals: can you tell the difference? A cross-sectional comparison. Larissa Shamseer1,2*, David Moher1,2, Onyi Maduekwe3, Lucy Turner4, Virginia Barbour5, Rebecca Burch6, Jocalyn Clark7, James Galipeau1, Jason Roberts8 and Beverley J. Shea9. Abstract. Background: The Internet has transformed scholarly publishing, most notably by the introduction of open access publishing. Recently, there has been a rise of online journals characterized as 'predatory', which actively solicit manuscripts and charge publication fees without providing robust peer review and editorial services. We carried out a cross-sectional comparison of characteristics of potential predatory, legitimate open access, and legitimate subscription-based biomedical journals. Methods: On July 10, 2014, scholarly journals from each of the following groups were identified: potential predatory journals (source: Beall's List), presumed legitimate, fully open access journals (source: PubMed Central), and presumed legitimate subscription-based (including hybrid) journals (source: Abridged Index Medicus). MEDLINE journal inclusion criteria were used to screen and identify biomedical journals from within the potential predatory journals group. One hundred journals from each group were randomly selected. Journal characteristics (e.g., website integrity, look and feel, editors and staff, editorial/peer review process, instructions to authors, …
  • Hong Kong Principles
    The Hong Kong Principles for Assessing Researchers: Fostering Research Integrity. David Moher1,2, Lex Bouter3,4, Sabine Kleinert5, Paul Glasziou6, Mai Har Sham7, Virginia Barbour8, Anne-Marie Coriat9, Nicole Foeger10, Ulrich Dirnagl11. 1Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute; 2School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada; 3Department of Epidemiology and Biostatistics, Amsterdam University Medical Centers, location VUmc; 4Department of Philosophy, Faculty of Humanities, Vrije Universiteit, Amsterdam, The Netherlands; 5The Lancet, London Wall Office, London, UK; 6Institute for Evidence-Based Healthcare, Bond University, Gold Coast, Qld, Australia; 7School of Biomedical Sciences, LKS Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; 8Queensland University of Technology (QUT), Brisbane, Australia; 9Wellcome Trust, London; 10Austrian Agency for Research Integrity, Vienna, Austria; 11Berlin Institute of Health, QUEST Center for Transforming Biomedical Research, Berlin, Germany. ORCIDs: David Moher 0000-0003-2434-4206; Lex Bouter 0000-0002-2659-5482; Sabine Kleinert 0000-0001-7826-1188; Paul Glasziou 0000-0001-7564-073X; Mai Har Sham 0000-0003-1179-7839; Virginia Barbour 0000-0002-2358-2440; Anne-Marie Coriat 0000-0003-2632-1745; Nicole Foeger 0000-0001-7775-7325; Ulrich Dirnagl 0000-0003-0755-6119. 23 November 2019. Abstract: The primary goal of research is to advance knowledge. For that knowledge to benefit research and society, it must be trustworthy.
  • Systematic Review of the Effectiveness of Training Programs in Writing for Scholarly Publication, Journal Editing, and Manuscript Peer Review
    Galipeau et al. Systematic Reviews 2013, 2:41 http://www.systematicreviewsjournal.com/content/2/1/41 PROTOCOL Open Access Systematic review of the effectiveness of training programs in writing for scholarly publication, journal editing, and manuscript peer review (protocol) James Galipeau1*, David Moher1,2, Becky Skidmore3, Craig Campbell4, Paul Hendry2, D William Cameron1,2, Paul C Hébert1,2 and Anita Palepu5 Abstract Background: An estimated $100 billion is lost to ‘waste’ in biomedical research globally, annually, much of which comes from the poor quality of published research. One area of waste involves bias in reporting research, which compromises the usability of published reports. In response, there has been an upsurge in interest and research in the scientific process of writing, editing, peer reviewing, and publishing (that is, journalology) of biomedical research. One reason for bias in reporting and the problem of unusable reports could be due to authors lacking knowledge or engaging in questionable practices while designing, conducting, or reporting their research. Another might be that the peer review process for journal publication has serious flaws, including possibly being ineffective, and having poorly trained and poorly motivated reviewers. Similarly, many journal editors have limited knowledge related to publication ethics. This can ultimately have a negative impact on the healthcare system. There have been repeated calls for better, more numerous training opportunities in writing for publication, peer review, and publishing. However, little research has taken stock of journalology training opportunities or evaluations of their effectiveness. Methods: We will conduct a systematic review to synthesize studies that evaluate the effectiveness of training programs in journalology.
  • Influential Journals in Health Research: a Bibliometric Study
    Merigó and Núñez, Globalization and Health (2016) 12:46, DOI 10.1186/s12992-016-0186-4. RESEARCH, Open Access. Influential journals in health research: a bibliometric study. José M. Merigó and Alicia Núñez*. Abstract. Background: There is a wide range of intellectual work written about health research, which has been shaped by the evolution of diseases. This study aims to identify the leading journals over the last 25 years (1990–2014) according to a wide range of bibliometric indicators. Methods: The study develops a bibliometric overview of all the journals that are currently indexed in the Web of Science (WoS) database in any of the four categories connected to health research. The work classifies health research into nine subfields: Public Health; Environmental and Occupational Health; Health Management and Economics; Health Promotion and Health Behavior; Epidemiology; Health Policy and Services; Medicine; Health Informatics, Engineering and Technology; and Primary Care. Results: The results indicate a wide dispersion between categories, with the American Journal of Epidemiology, Environmental Health Perspectives, American Journal of Public Health, and Social Science & Medicine being the journals that have received the highest number of citations over the last 25 years. According to other indicators, such as the h-index and citations per paper, some other journals, such as the Annual Review of Public Health and Medical Care, obtain better results, which shows the wide diversity and profiles of outlets available in the scientific community. The results are grouped and studied according to the nine subfields in order to identify the leading journals in each specific subdiscipline of health. Conclusions: The work identifies the leading journals in health research through a bibliometric approach.
  • A Protocol of a Cross-Sectional Study Evaluating an Online Tool for Early Career Peer Reviewers Assessing Reports of Randomised Controlled Trials
    Open Access Protocol. BMJ Open: first published as 10.1136/bmjopen-2017-017462 on 15 September 2017. A protocol of a cross-sectional study evaluating an online tool for early career peer reviewers assessing reports of randomised controlled trials. Anthony Chauvin,1,2,3 David Moher,4,5 Doug Altman,6 David L Schriger,7 Sabina Alam,8 Sally Hopewell,9 Daniel R Shanahan,10 Alessandro Recchioni,10 Philippe Ravaud,1,2 Isabelle Boutron1,2,11. To cite: Chauvin A, Moher D, Altman D, et al. BMJ Open 2017;7:e017462. doi:10.1136/bmjopen-2017-017462. Prepublication history and additional material for this paper are available online.
    ABSTRACT. Introduction: Systematic reviews evaluating the impact of interventions to improve the quality of peer review for biomedical publications highlighted that interventions were limited and have little impact. This study aims to compare the accuracy of early career peer reviewers who use an innovative online tool to the usual peer reviewer process in evaluating the completeness of reporting and switched primary outcomes in completed reports. Methods and analysis: This is a cross-sectional study of individual two-arm parallel-group randomised …
    Strengths and limitations of this study
    ► To the best of our knowledge, this will be the first study that will assess the efficacy of using an innovative online tool to help early career peer reviewers to assess the quality of reporting of randomised controlled trials.
    ► A randomised selection of trials from a large panel of biomedical journals.
    ► The absence of background for choosing the a priori …