Evaluating Research: Do Bibliometric Indicators Provide the Best Measures?


EUGENE GARFIELD
INSTITUTE FOR SCIENTIFIC INFORMATION®
3501 MARKET ST., PHILADELPHIA, PA 19104

Evaluating Research: Do Bibliometric Indicators Provide the Best Measures?

Number 14    April 3, 1989

In recent months we have reprinted several studies, originally published elsewhere, involving the use of biblio-scientometric measures to evaluate research. In one such study, Michael R. Halperin and Alok K. Chakrabarti, Drexel University, Philadelphia, examined the relationship between the volume of scientific and technical papers produced by industrial scientists and the characteristics of the corporations employing them.1 Another study, by economist Arthur M. Diamond, Jr., University of Nebraska, Omaha, evaluated the role that citations play in determining salaries.2 And we recently reprinted an article from the Times Higher Education Supplement (London) in which I discussed the strengths and weaknesses of citation analysis as a measure of research performance in individual university departments.3

From those specific studies, we move to a more general discussion by Jean King, then at the Agricultural and Food Research Council, London, UK, now affiliated with the Cancer Research Campaign, London. Her review originally appeared in the Journal of Information Science, sponsored by the Institute of Information Scientists. The article discusses a variety of bibliometric measures and the implications of their use in research evaluation.4 King kindly abridged her paper for publication here.

Beginning with an examination of the peer review system, King goes on to assess various bibliometric indicators, including publication counts, citation and co-citation analysis, journal impact factors, and a newer method known as co-word analysis. (For reasons of space, we have omitted King's discussion of other indicators, such as patent analysis, measures of esteem [which include attraction of outside funding, membership in professional societies, winning of international prizes, and so on], and input-output studies.) She concludes by recommending continued development of new methods, including online techniques for publication and citation retrieval.

Discussing citation and co-citation analysis, King cites various critical assessments of these techniques. Since we have addressed many such criticisms on previous occasions,5-7 I will not do so here. I will simply present King's paper, which is a thorough and valuable review. As budgets and funding for science come under increasing scrutiny at all levels, it is essential that evaluative methods be as accurate, as balanced, and as appropriately applied as possible.

*****

My thanks to Chris King for his help in the preparation of this essay.

REFERENCES

1. Garfield E. Measuring R&D productivity through scientometrics. Current Contents (30):3-9, 25 July 1988.
2. Garfield E. Can researchers bank on citation analysis? Current Contents (44):3-12, 31 October 1988.
3. Garfield E. The impact of citation counts - a UK perspective. Current Contents (37):3-5, 12 September 1988.
4. King J. A review of bibliometric and other science indicators and their role in research evaluation. J. Inform. Sci. 13:261-76, 1987.
5. Garfield E. Can criticism and documentation of research papers be automated? Essays of an information scientist. Philadelphia: ISI Press, 1977. Vol. 1, p. 83-90.
6. Garfield E. How to use citation analysis for faculty evaluations, and when is it relevant? Parts 1 & 2. Ibid., 1984. Vol. 6, p. 354-72.
7. Garfield E. Uses and misuses of citation frequency. Ibid., 1986. Vol. 8, p. 403-9.
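To make the arithmetic behind one of the indicators named above concrete, the short sketch below (written in Python; it is an illustration added to this edition, not part of the original essay) computes the standard two-year journal impact factor: citations received in a given year to a journal's items from the two preceding years, divided by the number of citable items the journal published in those two years. The journal and all of the figures are hypothetical.

    def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
        # Two-year impact factor: citations in year Y to items published in
        # years Y-1 and Y-2, divided by the citable items published in Y-1 and Y-2.
        if citable_items_prior_two_years == 0:
            raise ValueError("no citable items published in the two-year window")
        return citations_to_prior_two_years / citable_items_prior_two_years

    # Hypothetical journal: 240 citable items published in 1987-88,
    # drawing 600 citations during 1989.
    print(round(impact_factor(600, 240), 2))  # prints 2.5

Dividing by the number of citable items is what makes the impact factor a size-independent journal measure, in contrast to a raw publication or citation count.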
A review of bibliometric and other science indicators and their role in research evaluation

Jean King

Abridged version of an article that originally appeared in the Journal of Information Science 13(5), 1987, pp. 261-76. Reprinted with permission of Elsevier Publishers B.V.

Recent reductions in research budgets have led to the need for greater selectivity in resource allocation. Measures of past performance are still among the most promising means of deciding between competing interests. Bibliometry, the measurement of scientific publications and of their impact on the scientific community, assessed by the citations they attract, provides a portfolio of indicators that can be combined to give a useful picture of recent research activity. In this state-of-the-art review the various methodologies that have been developed are outlined in terms of their strengths, weaknesses and particular applications. The present limitations of science indicators in research evaluation are considered and some future directions for development in techniques are suggested.

1. Introduction

In recent years policy makers and research managers have become increasingly interested in the use of indicators of scientific output. The background to this development is considered, and in Sections 2 and 3, the need for accountability and the current pressures on the peer review process are outlined. The types of bibliometric indicators that may have value in assessment studies are reviewed in Section 4, ranging from well-documented and widely applied approaches such as publication counting, citation frequency analysis and co-citation mapping, to less-well-known methods such as journal-weighted citation analysis and co-word mapping. Directions for future research are suggested that may lead to more widely applicable and more routinely available indicators of scientific performance.

2. Accountability in science

Prior to the 1960s, funding for scientific research in the UK was relatively unrestricted. Resources were allocated largely on the basis of internal scientific criteria (criteria generated from within the scientific field concerned), which were evaluated by the peer review process. Various factors have resulted in a need for greater selectivity in the allocation of funds: an increase in the capital intensity of research (e.g. the 'sophistication' factor; the growth of 'Big Science'); expanding objectives and opportunities, with many new fields emerging; an increase in collaborative, often multidisciplinary, research projects which require coordination; a coalescence of basic and applied research, with much research now of a strategic nature (i.e. long-term basic research undertaken with fairly specific applications in view); finally, economic constraints require choices to be made between different disciplines, fields and research proposals.1 The peer review process has thus come under increasing pressure.

3. The peer review system

The peer review system has been criticized on several counts:

(1) The partiality of peers is an increasing problem as the concentration of research facilities in fewer, larger centres makes peers with no vested interest in the review increasingly difficult to find.

(2) The 'old boy' network often results in older, entrenched fields receiving greater recognition than new, emerging areas of research, while declining fields may be protected out of a sense of loyalty to colleagues. Thus the peer review process may often be ineffective as a mechanism for restructuring scientific activity.

(3) The 'halo' effect may result in a greater likelihood of funding for more 'visible' scientists and for higher status departments or institutes.

(4) Reviewers often have quite different ideas about what aspects of the research they are assessing, what criteria they should use and how these should be interpreted. The review itself may vary from a brief assessment by post to site visits by panels of reviewers.

(5) The peer review process assumes that a high level of agreement exists among scientists about what constitutes good quality work, who is doing it and where promising lines of enquiry lie, an assumption that may not hold in newer specialities.

(6) Resource inputs to the review process, both in terms of administrative costs and of scientists' time, are considerable but usually ignored.

Various studies have addressed these criticisms. For example, an analysis of National Science Foundation (NSF) review outcomes found little evidence of bias in favour of eminent researchers or institutions.2 However, a later study, in which independently selected panels re-evaluated funding decisions by NSF reviewers, found a high level of disagreement among peers over the same proposals. It concluded that the chances of receiving a grant were determined about 50% by the characteristics of the proposal and principal investigator and 50% by random elements, i.e. the 'luck of the reviewer draw'.3 The interpretation of this data has been questioned, however, and the fact that most disagreement was found in the mid-region between acceptance and rejection emphasized.4 In another study, [...]

[...] article with its accompanying list of citations has always been the accepted medium by which the scientific community reports the results of its investigations. However, while a publication count gives a measure of the total volume of research output, it provides no indication as to the quality of the work performed. Other objections include:

(1) Informal and formal, non-journal methods of communication in science are ignored.7

(2) Publication practices vary across fields and between journals. Also, the social and political pressures on a group to publish vary according to country, to the publication practices of the employing institution and to the emphasis placed on number of publications for obtaining tenure, promotion, grants etc.

(3) It is often very difficult to retrieve all the papers for a particular field, and to define the [...]