Hong Kong Principles

The Hong Kong Principles for Assessing Researchers: Fostering Research Integrity

David Moher 1,2, Lex Bouter 3,4, Sabine Kleinert 5, Paul Glasziou 6, Mai Har Sham 7, Virginia Barbour 8, Anne-Marie Coriat 9, Nicole Foeger 10, Ulrich Dirnagl 11

1 Centre for Journalology, Clinical Epidemiology Program, Ottawa Hospital Research Institute; 2 School of Epidemiology and Public Health, University of Ottawa, Ottawa, Canada; 3 Department of Epidemiology and Biostatistics, Amsterdam University Medical Centers, location VUmc; 4 Department of Philosophy, Faculty of Humanities, Vrije Universiteit, Amsterdam, The Netherlands; 5 The Lancet, London Wall Office, London, UK; 6 Institute for Evidence-Based Healthcare, Bond University, Gold Coast, Qld, Australia; 7 School of Biomedical Sciences, LKS Faculty of Medicine, The University of Hong Kong, Pokfulam, Hong Kong SAR, China; 8 Queensland University of Technology (QUT), Brisbane, Australia; 9 Wellcome Trust, London, UK; 10 Austrian Agency for Research Integrity, Vienna, Austria; 11 Berlin Institute of Health, QUEST Center for Transforming Biomedical Research, Berlin, Germany

David Moher: ORCID 0000-0003-2434-4206
Lex Bouter: ORCID 0000-0002-2659-5482
Sabine Kleinert: ORCID 0000-0001-7826-1188
Paul Glasziou: ORCID 0000-0001-7564-073X
Mai Har Sham: ORCID 0000-0003-1179-7839
Virginia Barbour: ORCID 0000-0002-2358-2440
Anne-Marie Coriat: ORCID 0000-0003-2632-1745
Nicole Foeger: ORCID 0000-0001-7775-7325
Ulrich Dirnagl: ORCID 0000-0003-0755-6119

23 November 2019

Abstract

The primary goal of research is to advance knowledge. For that knowledge to benefit research and society, it must be trustworthy. Trustworthy research is robust, rigorous and transparent at all stages of design, execution and reporting. Initiatives such as the San Francisco Declaration on Research Assessment (DORA) and the Leiden Manifesto have led the way in bringing much-needed global attention to the importance of taking a considered, transparent and broad approach to assessing research quality. Since its publication in 2012, DORA has been signed by over 1,500 organizations and nearly 15,000 individuals. Despite this significant progress, the assessment of researchers still rarely includes considerations related to trustworthiness, rigor and transparency. We developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity with a specific focus on the need to drive research improvement by ensuring that researchers are explicitly recognized and rewarded (i.e., their careers are advanced) for behavior that leads to trustworthy research. The HKPs have been developed with the idea that their implementation could assist in how researchers are assessed for career advancement, with a view to strengthening research integrity. We present five principles: responsible research practices; transparent reporting; open science (open research); valuing a diversity of types of research; and recognizing all contributions to research and scholarly activity. For each principle we provide a rationale for its inclusion and examples of where it is already being adopted.

Introduction

In a quest to advance knowledge, researchers publish approximately 1.5 million journal articles each year.
The presumption is that this literature can be used by other researchers, stakeholders, and the wider society because it is trusted, robust, rigorous and complete.

The approach taken to validating research and its outcomes differs depending on the nature of the research. For example, to rigorously examine the effects of a health intervention, trial participants (human or animal) are typically required to be randomized to the intervention being studied. Many researchers advocate registration of protocols as a way to ensure transparency and to enable others to engage with their research. Subsequently, the use of reporting guidelines can help ensure complete and transparent reporting of the researchers' methods and results. When the research is disseminated, the research team would ensure that the associated data, materials and any analytical code are made available as an integral part of publication. Such data sharing facilitates re-analysis of the data to check reproducibility and to perform secondary analyses.

Although some mechanisms exist to support researchers in ensuring transparency at all stages of design, execution and reporting, these practices are not widely adopted. There are many interwoven reasons for this. One contributing factor, we argue, is that little emphasis is placed on the rigor of research when hiring, reviewing and promoting researchers. Working together across the research sector as a whole to address this systemic issue, we believe, offers a global opportunity to improve research and its impact.

We developed the Hong Kong Principles (HKPs) as part of the 6th World Conference on Research Integrity (WCRI) specifically to drive greater recognition for researchers who commit to robust, rigorous and transparent practices (i.e., their careers are advanced) (see Figure). If implemented, the HKPs could play a critical role in evidence-based assessments of researchers, put research rigor at the heart of assessment, and open up research to the wider benefit of society.

We propose five principles, each with a rationale for its inclusion. We illustrate these principles with examples where we know they exist. These examples are not exhaustive, and many are relevant to more than one principle. Together, they illustrate the breadth of approaches by which these principles can operate at the very highest levels of international research.

Early drafts of the HKPs were circulated to the 700 participants registered for the 6th WCRI. Further discussions took place during two sessions at the 6th WCRI. A penultimate version was uploaded to the 6th WCRI website after the conference. More than 100 people provided input and feedback. We acknowledge all of these valuable contributions and the global leadership of those working on the San Francisco Declaration on Research Assessment (DORA), the Leiden Manifesto and other initiatives to promote the responsible use of metrics, which have laid the foundations for much of our work (1-4).

Principles

Principle 1: Assess researchers on responsible practices from conception to delivery, including the development of the research idea, research design, methodology, execution and effective dissemination.
Rationale

The numbers of publications, citations, and total volume of grants are often still the dominant metrics used by research institutions for assessing and rewarding their researchers (1-4). Providing bonuses to academics for publishing in certain journals (i.e., merit pay) is also common in many parts of the world (5-7). These assessment criteria tell assessors little about the researchers and the rigor of their work; thus they are not particularly 'responsible' metrics. They can also be unduly influenced by field and citation practices, and they provide little information about a publication's (and therefore a researcher's) contributions to research and society. Other criteria are required to provide a broader view of markers of best practice: for example, the extent to which a researcher develops research questions with the involvement of appropriate members of the public (see Figure).

Current implementation

The Canadian Institutes of Health Research's Strategy for Patient-Oriented Research (SPOR) is a multi-million-dollar initiative to bring patients into a broad range of research activities across Canadian provinces and territories (8). Patients are now active in the development of research projects, in setting priorities and in formulating study questions. The Ontario response (Ontario SUPPORT Unit) has included a series of articles with patients taking a leadership role in co-authoring the content (9). In the UK, the James Lind Alliance, funded by the UK National Institute for Health Research (NIHR), is a successful example of bringing together patients, carers and clinicians to develop priority-setting partnerships (10) and formulate research questions (11).

With a focus on enhancing reproducibility, the US National Institutes of Health (NIH) have revised their application instructions and review criteria to strengthen scientific rigor and transparency (12). One of the resources they recommend is the Experimental Design Assistant (EDA), developed by the National Centre for the Replacement, Refinement & Reduction of Animals in Research (NC3Rs). This 10-module online tool was developed to assist researchers in the design and analysis of animal experiments. It includes dedicated support for randomization, blinding and sample size calculation. It can also be used to help researchers prepare the experimental design information and analysis plan requested for grant applications (13). The NC3Rs encourages use of the EDA so that applicants can generate an EDA report, which can be submitted in place of the experimental design and methodology appendix.

Other examples of preferred criteria include social media metrics as indicators of research dissemination (14), public lectures about the results of a research project, public engagement, and other types of events that bring together funders, researchers and other stakeholders to work on an effective communication plan for the research program (15). Organizations such as the Wellcome Trust are taking a holistic attitude to redefining their approach to engagement, explicitly to help people feel empowered to access, use, respond to and create health research (16).

Principle 2: Value the accurate and transparent reporting of all research, regardless of the results.

Rationale

Failure to publish all findings of all studies seriously distorts the evidence base for decision making.