Research Evaluation Metrics
Total pages: 16
File type: PDF, size: 1,020 KB
Recommended publications
Citation Analysis for the Modern Instructor: an Integrated Review of Emerging Research
Chris Piotrowski, University of West Florida, USA
Abstract: While online instructors may be versed in conducting e-Research (Hung, 2012; Thelwall, 2009), today's faculty are probably less familiar with the rapidly advancing fields of bibliometrics and informetrics. One key feature of research in these areas is citation analysis, a rather intricate operational feature available in modern indexes such as Web of Science, Scopus, Google Scholar, and PsycINFO. This paper reviews the recent extant research on bibliometrics within the context of citation analysis. Particular focus is on empirical studies, review essays, and critical commentaries on citation-based metrics across interdisciplinary academic areas. Research that relates to the interface between citation analysis and applications in higher education is discussed. Some of the attributes and limitations of the citation operations of contemporary databases that offer citation searching or cited-reference data are presented. This review concludes that: a) citation-based results can vary widely, contingent on academic discipline or specialty area; b) databases that offer citation options rely on idiosyncratic methods, coverage, and transparency of functions; c) despite initial concerns, research from open access journals is being cited in traditional periodicals; and d) the field of bibliometrics remains perplexing with regard to functionality, and research is advancing at an exponential pace. Based on these findings, online instructors would be well served to stay abreast of developments in the field.
Keywords: Bibliometrics, informetrics, citation analysis, information technology, open resource and electronic journals
Introduction: In an ever-increasing manner, the educational field is inextricably linked to advances in information technology (Plomp, 2013).
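As an illustrative aside on the citation-based metrics reviewed above (not taken from the paper): the h-index is one of the simplest such measures, and the short Python sketch below shows how it is computed from a list of per-paper citation counts. The citation counts are invented for illustration.

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for one author's papers
example_counts = [25, 17, 12, 8, 8, 5, 3, 1, 0]
print(h_index(example_counts))  # -> 5 (five papers have at least 5 citations each)
```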
The Publish Or Perish Book
Your guide to effective and responsible citation analysis.
Anne-Wil Harzing. Edition: September 2010. For inquiries about this book, refer to the book's web page: http://www.harzing.com/popbook.htm
ISBN 978-0-9808485-0-2 (PDF); ISBN 978-0-9808485-1-9 (paperback, colour); ISBN 978-0-9808485-2-6 (paperback, black & white)
© 2010 by Anne-Wil Harzing. All rights reserved. No part of this book may be reproduced in any form or by any electronic or mechanical means (including electronic mail, photocopying, recording, or information storage and retrieval) without permission in writing from the publisher. As the sole exception to the above, if you purchased this book in its PDF edition, you are allowed to print one hard copy for your own use only, for each licence that you purchased.
Published by Tarma Software Research Pty Ltd, Melbourne, Australia.
National Library of Australia Cataloguing-in-Publication entry: Author: Harzing, Anne-Wil. Title: The publish or perish book [electronic resource]: Your guide to effective and responsible citation analysis / Anne-Wil Harzing. Edition: 1st ed. ISBN: 9780980848502 (pdf). Notes: Includes bibliographical references. Subjects: Publish or perish (Computer program), Authorship, Academic writing, Scholarly publishing. Dewey Number: 808.02
Table of contents (excerpt): Preface; Chapter 1: Introduction to Citation Analysis
Is Sci-Hub Increasing Visibility of Indian Research Papers? An Analytical Evaluation
Journal of Scientometric Research, 2021; 10(1):130-134. http://www.jscires.org. Perspective Paper.
Vivek Kumar Singh (1,*), Satya Swarup Srichandan (1), Sujit Bhattacharya (2)
1 Department of Computer Science, Banaras Hindu University, Varanasi, Uttar Pradesh, INDIA. 2 CSIR-National Institute of Science Technology and Development Studies, New Delhi, INDIA.
Abstract: Sci-Hub, founded by Alexandra Elbakyan in 2011 in Kazakhstan, has over the years emerged as a very popular source for researchers to download scientific papers. It is believed that Sci-Hub contains more than 76 million academic articles. However, three foreign academic publishers (Elsevier, Wiley and American Chemical Society) have recently filed a lawsuit against Sci-Hub and LibGen before the Delhi High Court and prayed for the complete blocking of these websites in India. It is in this context that this paper attempts to find out how many Indian research papers are available in Sci-Hub and who downloads them. The citation advantage of Indian research papers available on Sci-Hub is analysed, with results confirming that such an advantage does exist.
Correspondence: Vivek Kumar Singh, Department of Computer Science, Banaras Hindu University, Varanasi-221005, INDIA. Email id: [email protected]. Received: 16-03-2021; Revised: 29-03-2021; Accepted: 25-04-2021. DOI: 10.5530/jscires.10.1.16
Keywords: Indian Research, Indian Science, Black Open Access, Open Access, Sci-Hub.
Introduction: Responsible Research and Innovation (RRI) has become one … access publishing of their research output, and at the same time encouraging their researchers to publish in openly accessible forms.
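As a purely illustrative aside (not the authors' actual method or data): a citation advantage of the kind analysed above is typically tested by comparing the citation distributions of papers that are and are not available on the platform. The Python sketch below uses invented numbers and a Mann-Whitney U test from scipy, a common choice because citation counts are heavily skewed.

```python
# Illustrative only: compare citation counts of two hypothetical groups of papers.
from statistics import median
from scipy.stats import mannwhitneyu

on_scihub = [12, 7, 30, 4, 9, 15, 22, 3, 11, 8]   # invented citation counts
not_on_scihub = [5, 2, 9, 1, 4, 6, 0, 3, 7, 2]    # invented citation counts

# One-sided test: are papers available on the platform cited more?
stat, p_value = mannwhitneyu(on_scihub, not_on_scihub, alternative="greater")
print(f"median (available):     {median(on_scihub)}")
print(f"median (not available): {median(not_on_scihub)}")
print(f"Mann-Whitney U = {stat:.1f}, one-sided p = {p_value:.4f}")
```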
Publish Or Perish
Maximizing the impact of your next publication. Andrew Plume, Director – Scientometrics & Market Analysis. City Graduate School Welcome Event, 18th November 2013.
Slide – The research life: enable, do, share (diagram of the research cycle: recruit/evaluate researchers, secure funding, establish partnerships, manage facilities, set strategy; search, read & review, collaborate & network, experiment, analyze & synthesize; manage data, publish and disseminate, commercialise, promote, have impact).
Slides – The research life: Publish or Perish; Publish or Perish: Origin.
Publish or Perish: Evidence. 81% of published authors agree or strongly agree*: "My career depends on a history of publishing research articles in peer reviewed journals." Reasons for agreeing – Institutional: career advancement and funding; National: research assessment exercises; Global: research dissemination is a goal of research.
"Articles in peer-reviewed journals make the most important contribution to my career in terms of status, merit pay, and marketability, vs. teaching or service." – Engineering & Technology, UK (36-45)
"At my institution, there are defined thresholds of publications for academic promotions, at least during early career." – Social Science, USA (36-45)
"If I publish well (Impact Factor, h-index) I have more chance to get a better position and to have grants." – Medicine & Allied Health, Italy (46-55)
"Because the primary role of my job is to produce research, which is of no use if it does not get into the public domain." – Earth & Planetary Sciences, UK (56-65)
* Survey of 3,090 published authors in November 2012
Researcher ID/Publons – 2019 Instructions
ResearcherID has now been integrated into the Publons platform: https://publons.com/account/login/. Publons is similar to ResearcherID: it is a platform which records, verifies, and showcases researchers' peer review contributions. Publons integrates with Web of Science, and with records imported from ORCID, EndNote or manual uploads, and allows you to track citations to your research outputs that are indexed in Web of Science. Please note that citation metrics will not be tracked for publications manually entered into Publons from journals outside of Web of Science.
Your Account
Existing Account: If you already have an account for ResearcherID or Web of Science, you can log in at https://publons.com/account/login/ using the same username and password. The publications you previously added to ResearcherID have automatically been added to your Publons account.
Create a New Account: If you do not already have a ResearcherID, create a Publons account here: https://publons.com/account/register/. You will receive a ResearcherID overnight from Publons that you will be able to use to attach to your publications in Web of Science.
Adding Publications: Once logged into your account you can load your publications by importing them from Web of Science, your ORCID account, or an EndNote library, or manually using identifiers such as DOI numbers. (Note that citation metrics won't be tracked for records that do not appear in Web of Science.)
Loading Web of Science publications: Select "Publications" under My Records, then select "Import Publications". When you select "Import Publications", Publons will automatically search Web of Science for publications that match the email addresses and publishing names you listed in your profile.
ORCID: Connecting the Research Community
ORCID: Connecting the Research Community. April 30, 2020.
Introductions: Shawna Sadler (https://orcid.org/0000-0002-6103-5034), Engagement Manager, Americas, ORCID; Sheila Rabun (https://orcid.org/0000-0002-1196-6279), ORCID US Community Specialist, LYRASIS; Lori Ann M. Schultz (https://orcid.org/0000-0002-1597-8189), Sr. Director of Research, Innovation & Impact, University of Arizona.
Agenda: 1. What is ORCID? 2. ORCID US Community Consortium. 3. Research Impact & Global Connections. 4. ORCID for Research Administrators. 5. Questions.
What is ORCID? ORCID's vision is a world where all who participate in research, scholarship, and innovation are uniquely identified and connected to their contributions and affiliations across time, disciplines, and borders.
History:
● ORCID was first announced in 2009
● A collaborative effort by the research community "to resolve the author name ambiguity problem in scholarly communication"
● Independent nonprofit organization
● Offering services since 2012
ORCID is a non-profit organization that provides: 1. ORCID iDs for people; 2. ORCID records for people; 3. infrastructure to share research data between organizations.
ORCID for Researchers: a free unique identifier, e.g. Sofia Maria Hernandez Garcia, ORCID iD https://orcid.org/0000-0001-5727-2427 (example ORCID record shown on the slides).
What is ORCID? (video): https://vimeo.com/97150912
ORCID for Research Organizations (diagram: Researcher → ORCID → Your Organization): 1) the researcher creates an ORCID iD and 2) populates their record; all records are saved in the ORCID Registry, and the member API transfers data from ORCID to your CRIS system.
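To make the "infrastructure to share research data" point concrete, here is a small, hedged Python sketch of reading a public ORCID record programmatically. It assumes the public ORCID API (https://pub.orcid.org/v3.0/) and the requests package, uses the example iD from the slides rather than any presenter's record, and the JSON field names follow the v3.0 schema as I recall it, so they should be checked against the API documentation.

```python
# Minimal sketch (assumptions: public ORCID API v3.0, `requests` installed).
import requests

ORCID_ID = "0000-0001-5727-2427"  # example record referenced in the slides
url = f"https://pub.orcid.org/v3.0/{ORCID_ID}/record"

response = requests.get(url, headers={"Accept": "application/json"}, timeout=30)
response.raise_for_status()
record = response.json()

# Print the public name and the titles of any works on the record.
name = record["person"]["name"]
print(name["given-names"]["value"], name["family-name"]["value"])
for group in record["activities-summary"]["works"]["group"]:
    summary = group["work-summary"][0]
    print("-", summary["title"]["title"]["value"])
```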
On the Relation Between the WoS Impact Factor, the Eigenfactor, the SCImago Journal Rank, the Article Influence Score and the Journal h-Index
Ronald Rousseau (1) and the STIMULATE 8 Group (2)
1 KHBO, Dept. Industrial Sciences and Technology, Oostende, Belgium. [email protected]
2 Vrije Universiteit Brussel (VUB), Brussels, Belgium
The STIMULATE 8 Group consists of: Anne Sylvia ACHOM (Uganda), Helen Hagos BERHE (Ethiopia), Sangeeta Namdev DHAMDHERE (India), Alicia ESGUERRA (The Philippines), Nguyen Thi Ngoc HOAN (Vietnam), John KIYAGA (Uganda), Sheldon Miti MAPONGA (Zimbabwe), Yohannis MARTÍ-LAHERA (Cuba), Kelefa Tende MWANTIMWA (Tanzania), Marlon G. OMPOC (The Philippines), A.I.M. Jakaria RAHMAN (Bangladesh), Bahiru Shifaw YIMER (Ethiopia).
Abstract: Four alternatives to the journal Impact Factor (IF) indicator are compared to find out their similarities. Together with the IF, the SCImago Journal Rank indicator (SJR), the Eigenfactor™ score, the Article Influence™ score and the journal h-index of 77 journals from more than ten fields were collected. Results show that although these indicators are calculated with different methods and even use different databases, they are strongly correlated with the WoS IF and with each other. These findings corroborate results published by several colleagues and show the feasibility of using free alternatives to the Web of Science for evaluating scientific journals.
Keywords: WoS impact factor, Eigenfactor, SCImago Journal Rank (SJR), Article Influence Score, journal h-index, correlations
Introduction: STIMULATE stands for Scientific and Technological Information Management in Universities and Libraries: an Active Training Environment. It is an international training programme in information management, supported by the Flemish Interuniversity Council (VLIR), aimed at young scientists and professionals from developing countries.
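As an illustrative aside (not the paper's code or data): once indicator values for a set of journals are in a table, correlations of the kind reported above are straightforward to compute. The Python sketch below uses invented values for five hypothetical journals and the pandas package to produce a Spearman rank correlation matrix; Spearman is a common choice because the indicators sit on different scales and are not normally distributed.

```python
# Hypothetical indicator values for five journals (invented, for illustration only).
import pandas as pd

journals = pd.DataFrame(
    {
        "IF":  [3.2, 1.1, 7.5, 0.8, 2.4],   # WoS impact factor
        "SJR": [1.9, 0.5, 4.2, 0.3, 1.2],   # SCImago Journal Rank
        "AIS": [1.4, 0.4, 3.8, 0.2, 0.9],   # Article Influence score
        "h":   [45, 18, 120, 12, 33],       # journal h-index
    },
    index=["J1", "J2", "J3", "J4", "J5"],
)

# Spearman rank correlations between the indicators
print(journals.corr(method="spearman").round(2))
```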
Do You Speak Open Science? Resources and Tips to Learn the Language
Paola Masuzzo (1, 2), ORCID: 0000-0003-3699-1195; Lennart Martens (1, 2), ORCID: 0000-0003-4277-658X
Author affiliations: 1 Medical Biotechnology Center, VIB, Ghent, Belgium; 2 Department of Biochemistry, Ghent University, Ghent, Belgium
Abstract: The internet era, large-scale computing and storage resources, mobile devices, social media, and their high uptake among different groups of people have all deeply changed the way knowledge is created, communicated, and further deployed. These advances have enabled a radical transformation of the practice of science, which is now more open, more global and collaborative, and closer to society than ever. Open science has therefore become an increasingly important topic. Moreover, as open science is actively pursued by several high-profile funders and institutions, it has fast become a crucial matter to all researchers. However, because this widespread interest in open science has emerged relatively recently, its definition and implementation are constantly shifting and evolving, sometimes leaving researchers in doubt about how to adopt open science and which best practices to follow. This article therefore aims to be a field guide for scientists who want to perform science in the open, offering resources and tips to make open science happen in the four key areas of data, code, publications and peer review.
The Rationale for Open Science: Standing on the Shoulders of Giants. One of the most widely used definitions of open science originates from Michael Nielsen [1]: "Open science is the idea that scientific knowledge of all kinds should be openly shared as early as is practical in the discovery process".
Tipping Points: Cancelling Journals When arXiv Access Is Good Enough
Tony Aponte, Sciences Collection Coordinator, UCLA Library. ASEE ELD Lightning Talk, June 17, 2019.
Preprint explosion! (Brian Resnick and Julia Belluz. (2019). The war to free science. Vox. https://www.vox.com/the-highlight/2019/6/3/18271538/open-access-elsevier-california-sci-hub-academic-paywalls; arXiv. (2019). arXiv submission rate statistics. https://arxiv.org/help/stats/2018_by_area/index)
Case study: two physics journals and arXiv.
● UCLA: heavy users of arXiv; not so heavy users of the version of record
● Decent UC authorship
● No UC editorial board members

             2017 usage   Annual cost   Cost per use   2017 Impact Factor
  Journal A     103         $8,315         ~$80             1.291
  Journal B      72         $6,344         ~$88             0.769

Just how many of these articles are OA? OAISSN.py: enter a journal ISSN and a year, and this Python program will tell you how many DOIs from that year have an open access version (Ryan Regier. (2018). OAISSN.py. https://github.com/ryregier/OAcounts).

              % OA articles from 2017   % OA articles from 2018
  Journal A            68%                      64%
  Journal B            11%                       8%

arXiv e-prints are becoming closer to publisher versions of record, according to a UCLA similarity study of arXiv articles vs. versions of record (Martin Klein, Peter Broadwell, Sharon E. Farb, Todd Grappone. 2018. Comparing Published Scientific Journal Articles to Their Pre-Print Versions -- Extended Version).
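The OAISSN.py approach can be approximated with public metadata services. The hedged Python sketch below is not Regier's script; it is a minimal illustration under the assumption that the public Crossref REST API (to list a journal's DOIs for a given year) and the Unpaywall API (to check each DOI's open access status) are used, both of which ask only for an email address for polite use. The ISSN and email in the example are placeholders.

```python
# Rough sketch of an OAISSN.py-style check (assumptions: public Crossref and
# Unpaywall APIs, `requests` installed). Not the original script.
import requests

EMAIL = "you@example.org"  # placeholder: replace with your own email address

def oa_fraction(issn: str, year: int, max_dois: int = 100) -> float:
    """Return the fraction of a journal's DOIs from `year` that Unpaywall
    reports as having an open access version (sampled up to `max_dois`)."""
    crossref = requests.get(
        f"https://api.crossref.org/journals/{issn}/works",
        params={
            "filter": f"from-pub-date:{year}-01-01,until-pub-date:{year}-12-31",
            "rows": max_dois,
            "mailto": EMAIL,
        },
        timeout=30,
    ).json()
    dois = [item["DOI"] for item in crossref["message"]["items"]]

    oa_count = 0
    for doi in dois:
        unpaywall = requests.get(
            f"https://api.unpaywall.org/v2/{doi}",
            params={"email": EMAIL},
            timeout=30,
        ).json()
        if unpaywall.get("is_oa"):
            oa_count += 1
    return oa_count / len(dois) if dois else 0.0

# Example call with a placeholder ISSN:
# print(f"{oa_fraction('1234-5678', 2018):.0%} of sampled 2018 DOIs have an OA version")
```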
Web of Science, Scopus, & Altmetrics
Measure Your Research Impact: Author Impact Indices & Other Tools. Open Access Week 2017 – Theme: "Open in Order To…". October 25, 2017.
About this session: We will explore the importance of academic reputation, explaining your research to a wider audience, measuring the impact of your activities related to spreading the word about your publications, and what you can do to enhance yours.
Defining impact: "… the beneficial application of research to achieve social, economic, environmental and/or cultural outcomes … impact in the academic domain, which is seen more as an indicator of the intrinsic quality of the research on scholarly or academic measures" (Australian Research Quality Framework, 2006). A host of social networking platforms designed specifically for scholars also abound.
Author identification and measuring impact – Authorship and researcher ID: Establishing a unique author/researcher identity is an important step towards improving your research visibility and impact. There are a variety of options for creating a unique identity, with ORCID being the latest development. ORCID is well supported by many publishers. ORCID is a central registry of unique identifiers for individual researchers and an open and transparent linking mechanism between ORCID and other current author ID schemes, which include:
● ResearcherID – linked to Thomson Reuters' Web of Knowledge platform
● My Citations – Google Scholar
● Author identifier – Scopus
● arXiv – arXiv author identifiers
ORCID iDs are unique researcher identifiers designed to provide a transparent method for linking researchers and contributors to their activities and outputs. arXiv allows you to link your ORCID iD with your arXiv account. Organizational names suffer from similar ambiguities to those described for researcher names.
Microsoft Academic Search and Google Scholar Citations
A comparative analysis of author profiles.
José Luis Ortega, VICYT-CSIC, Serrano 113, 28006 Madrid, Spain, [email protected]; Isidro F. Aguillo, Cybermetrics Lab, CCHS-CSIC, Albasanz 26-28, 28037 Madrid, Spain, [email protected]
Abstract: This paper aims to make a comparative analysis between the personal profiling capabilities of the two most important free citation-based academic search engines, namely Microsoft Academic Search (MAS) and Google Scholar Citations (GSC). Author profiles can be very useful for evaluation purposes once the advantages and shortcomings of these services are described and taken into consideration. A total of 771 personal profiles appearing in both the MAS and the GSC databases are analysed. Results show that the GSC profiles include more documents and citations than those in MAS, but with a strong bias towards the information and computing sciences, while the MAS profiles are better balanced across disciplines. MAS shows technical problems, such as a higher number of duplicated profiles and a lower updating rate than GSC. It is concluded that both services could be used for evaluation purposes only if they are applied along with other citation indexes as a way to supplement that information.
Keywords: Microsoft Academic Search; Google Scholar Citations; Web bibliometrics; Academic profiles; Research evaluation; Search engines
Introduction: In November 2009, Microsoft Research Asia started a new web search service specialized in scientific information. Even though Google had already introduced an academic search engine (Google Scholar) in 2004, the proposal of Microsoft Academic Search (MAS) went beyond a mere document retrieval service that counts citations.
QUALIS: The Journal Ranking System Undermining the Impact of Brazilian Science
bioRxiv preprint, doi: https://doi.org/10.1101/2020.07.05.188425 (version posted July 6, 2020; not certified by peer review; made available under a CC-BY-NC-ND 4.0 International license).
Rodolfo Jaffé (1)
1 Instituto Tecnológico Vale, Belém-PA, Brazil. Email: [email protected]
Abstract: A journal ranking system called QUALIS was implemented in Brazil in 2009, intended to rank graduate programs from different subject areas and promote selected national journals. Since this system uses a complicated suite of criteria (differing among subject areas) to group journals into discrete categories, it could potentially create incentives to publish in low-impact journals ranked highly by QUALIS. Here I assess the influence of the QUALIS journal ranking system on the global impact of Brazilian science. Results reveal a steeper decrease in the number of citations per document since the implementation of this QUALIS system, compared to the top Latin American countries publishing more scientific articles. All the subject areas making up the QUALIS system showed some degree of bias, with social sciences usually being more biased than natural sciences. Lastly, the decrease in the number of citations over time proved steeper in a more biased area, suggesting a faster shift towards low-impact journals ranked highly by QUALIS.
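As a hedged illustration only (invented numbers, not the preprint's data or code): the comparison described above amounts to fitting a trend to each country's citations-per-document series and comparing the slopes. The Python sketch below shows one simple way to do this with an ordinary least-squares fit; the country names and values are placeholders.

```python
# Illustrative sketch: compare trends in citations per document for two
# countries using least-squares slopes. All data below are invented.
import numpy as np

years = np.arange(2009, 2019)
cites_per_doc = {
    "Country A": np.array([1.05, 1.02, 0.98, 0.95, 0.90, 0.87, 0.84, 0.80, 0.77, 0.74]),
    "Country B": np.array([1.00, 0.99, 0.98, 0.97, 0.96, 0.95, 0.95, 0.94, 0.93, 0.92]),
}

for country, series in cites_per_doc.items():
    slope, intercept = np.polyfit(years, series, deg=1)  # slope in cites/doc per year
    print(f"{country}: {slope:+.3f} citations per document per year")
```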