A Matter of Time: Publication Dates in Web of Science Core Collection
Recommended publications
-
Deciding Where to Publish
LIBRARY INFORMATION SHEET
NCSU LIBRARIES – Wm. R. Kenan, Jr. Library of Veterinary Medicine

Deciding Where to Publish

What are your goals as an author?
• Change practice of clinicians or scientists within a discipline you influence
• Earn citations in others' work
• Be recognized for your research by clients, donors, or professional associations

AUDIENCE
Who are you trying to get to read your article?
• Researchers in your specialized domain
• Specialty clinicians
• General practitioners
• The public
Consider how your target journal is disseminated. Open Access (total or embargoed) is important if you desire an audience beyond academic peer institutions. UlrichsWeb (see Libraries Databases) provides the number of subscribers for many journals.

REPUTATION
Journal Impact Factor is one well-known metric for scientific journals indexed by Web of Science. Impact factors can be found in Journal Citation Reports (see Libraries Databases). It is calculated using 2 years of data, e.g.: # of 2014 citations / # of eligible articles in 2012-2013. Journals not indexed by Web of Science do not have an impact factor in Journal Citation Reports; they may be in SciMago (www.scimagojr.com/journalrank.php). For NCSU CVM faculty publications (2014), the median IF was 2.641 and the highest was 12.996 (American Journal of Respiratory and Critical Care Medicine). The 2014 IF of the Journal of Veterinary Internal Medicine was 1.879, i.e. the average paper 1-2 years old had earned about 2 citations. It can be helpful for article-level metrics: did your paper get more citations in the past two years than the impact factor?
-
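The two-year impact-factor arithmetic described in the sheet above can be sketched in a few lines. The citation and article counts here are hypothetical, chosen only so the result lands near the JVIM figure quoted in the sheet; they are not the journal's actual counts.

```python
# The sheet defines the two-year impact factor as
#   IF(year) = citations in `year` to items from the two prior years
#              / number of citable items published in those two years.
# All counts below are invented, for illustration only.

def impact_factor(citations_in_year: int, citable_items: int) -> float:
    """Classic two-year Journal Impact Factor."""
    return citations_in_year / citable_items

# A journal whose 2012-2013 articles drew 470 citations in 2014:
print(round(impact_factor(470, 250), 3))  # 1.88
```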
Is Sci-Hub Increasing Visibility of Indian Research Papers? an Analytical Evaluation Vivek Kumar Singh1,*, Satya Swarup Srichandan1, Sujit Bhattacharya2
Journal of Scientometric Res. 2021; 10(1):130-134 http://www.jscires.org
Perspective Paper

Is Sci-Hub Increasing Visibility of Indian Research Papers? An Analytical Evaluation
Vivek Kumar Singh1,*, Satya Swarup Srichandan1, Sujit Bhattacharya2
1Department of Computer Science, Banaras Hindu University, Varanasi, Uttar Pradesh, INDIA.
2CSIR-National Institute of Science Technology and Development Studies, New Delhi, INDIA.

Correspondence: Vivek Kumar Singh, Department of Computer Science, Banaras Hindu University, Varanasi-221005, INDIA. Email id: [email protected]
Received: 16-03-2021; Revised: 29-03-2021; Accepted: 25-04-2021. DOI: 10.5530/jscires.10.1.16

ABSTRACT
Sci-Hub, founded by Alexandra Elbakyan in 2011 in Kazakhstan, has over the years emerged as a very popular source for researchers to download scientific papers. It is believed that Sci-Hub contains more than 76 million academic articles. However, recently three foreign academic publishers (Elsevier, Wiley and the American Chemical Society) filed a lawsuit against Sci-Hub and LibGen before the Delhi High Court, praying for the complete blocking of these websites in India. It is in this context that this paper attempts to find out how many Indian research papers are available in Sci-Hub and who downloads them. The citation advantage of Indian research papers available on Sci-Hub is analysed, with results confirming that such an advantage does exist.

Keywords: Indian Research, Indian Science, Black Open Access, Open Access, Sci-Hub.

INTRODUCTION
Responsible Research and Innovation (RRI) has become one […] access publishing of their research output, and at the same time encouraging their researchers to publish in openly accessible forms.
-
A Comprehensive Framework to Reinforce Evidence Synthesis Features in Cloud-Based Systematic Review Tools
Applied Sciences, Article

A Comprehensive Framework to Reinforce Evidence Synthesis Features in Cloud-Based Systematic Review Tools
Tatiana Person 1,*, Iván Ruiz-Rube 1, José Miguel Mota 1, Manuel Jesús Cobo 1, Alexey Tselykh 2 and Juan Manuel Dodero 1
1 Department of Informatics Engineering, University of Cadiz, 11519 Puerto Real, Spain; [email protected] (I.R.-R.); [email protected] (J.M.M.); [email protected] (M.J.C.); [email protected] (J.M.D.)
2 Department of Information and Analytical Security Systems, Institute of Computer Technologies and Information Security, Southern Federal University, 347922 Taganrog, Russia; [email protected]
* Correspondence: [email protected]

Abstract: Systematic reviews are powerful methods used to determine the state of the art in a given field from existing studies and literature. They are critical but time-consuming in research and decision making across various disciplines. When conducting a review, a large volume of data is usually generated from relevant studies, and computer-based tools are often used to manage such data and to support the systematic review process. This paper describes a comprehensive analysis to gather the required features of a systematic review tool, in order to support the complete evidence synthesis process. We propose a framework, elaborated by consulting experts in different knowledge areas, to evaluate significant features and thus reinforce existing tool capabilities. The framework will be used to enhance the currently available functionality of CloudSERA, a cloud-based systematic review tool focused on Computer Science, to implement evidence-based systematic review processes.
-
Sci-Hub Provides Access to Nearly All Scholarly Literature
Sci-Hub provides access to nearly all scholarly literature

A DOI-citable version of this manuscript is available at https://doi.org/10.7287/peerj.preprints.3100. This manuscript was automatically generated from greenelab/scihub-manuscript@51678a7 on October 12, 2017. Submit feedback on the manuscript at git.io/v7feh or on the analyses at git.io/v7fvJ.

Authors
• Daniel S. Himmelstein (0000-0002-3012-7446 · dhimmel · dhimmel), Department of Systems Pharmacology and Translational Therapeutics, University of Pennsylvania. Funded by GBMF4552.
• Ariel Rodriguez Romero (0000-0003-2290-4927 · arielsvn · arielswn), Bidwise, Inc.
• Stephen Reid McLaughlin (0000-0002-9888-3168 · stevemclaugh · SteveMcLaugh), School of Information, University of Texas at Austin.
• Bastian Greshake Tzovaras (0000-0002-9925-9623 · gedankenstuecke · gedankenstuecke), Department of Applied Bioinformatics, Institute of Cell Biology and Neuroscience, Goethe University Frankfurt.
• Casey S. Greene (0000-0001-8713-9213 · cgreene · GreeneScientist), Department of Systems Pharmacology and Translational Therapeutics, University of Pennsylvania. Funded by GBMF4552.

PeerJ Preprints | https://doi.org/10.7287/peerj.preprints.3100v2 | CC BY 4.0 Open Access | rec: 12 Oct 2017, publ: 12 Oct 2017

Abstract
The website Sci-Hub provides access to scholarly literature via full-text PDF downloads. The site enables users to access articles that would otherwise be paywalled. Since its creation in 2011, Sci-Hub has grown rapidly in popularity. However, until now, the extent of Sci-Hub's coverage was unclear. As of March 2017, we find that Sci-Hub's database contains 68.9% of all 81.6 million scholarly articles, which rises to 85.2% for those published in toll access journals.
-
Searching the Evidence in Web of Science
CAMBRIDGE UNIVERSITY LIBRARY, MEDICAL LIBRARY
Supporting Literature Searching
Searching the Evidence in Web of Science
September 2015

Contents: How to access Web of Science (and what is it?) · Planning your Search · Searching Web of Science · Displaying your results · Refine Results · Citing Articles and Cited References · Accessing the full-text · Marked List (Email/Print/Export Your Results) · Save your Strategy · More options · Help

To help you use this guide, a step icon indicates a step in the process of searching and retrieving articles; ! indicates a tip, or an extra piece of information.

How to access Web of Science – and what is it?
Go to http://wok.mimas.ac.uk and click on the central orange button.

Logging On
! If you are accessing Web of Science from a non-University computer, you will need to log in with your RAVEN password. When you are presented with an ATHENS login screen, click "Alternative/Institutional Login", and search or browse for University of Cambridge. If you have problems logging on, contact the Medical Library.

Web of Science is made up of several different sections, including:
• Web of Science Core Collection, covering citation indexes in Science, Social Science, Arts & Humanities, Books, and Conference Proceedings
• SciELO, focusing on literature from Latin American sources
• Data Citation Index
• Zoological Record
• Medline
This guide will concentrate on the Web of Science Core Collection.

! What's the difference between a citation index and a database like PubMed? The key element that differentiates citation databases from other searchable databases is the way references are linked across time.
-
Scientometrics
Scientometrics

Loet Leydesdorff a and Staša Milojević b
a Amsterdam School of Communication Research (ASCoR), University of Amsterdam, Kloveniersburgwal 48, 1012 CX Amsterdam, The Netherlands; [email protected]
b School of Informatics and Computing, Indiana University, Bloomington 47405-1901, United States; [email protected]

Abstract
The paper provides an overview of the field of scientometrics, that is: the study of science, technology, and innovation from a quantitative perspective. We cover major historical milestones in the development of this specialism from the 1960s to today and discuss its relationship with the sociology of scientific knowledge, the library and information sciences, and science policy issues such as indicator development. The disciplinary organization of scientometrics is analyzed both conceptually and empirically. A state-of-the-art review of five major research threads is provided.

Keywords: scientometrics, bibliometrics, citation, indicator, impact, library, science policy, research management, sociology of science, science studies, mapping, visualization

Cross References: Communication: Electronic Networks and Publications; History of Science; Libraries; Networks, Social; Merton, Robert K.; Peer Review and Quality Control; Science and Technology, Social Study of: Computers and Information Technology; Science and Technology Studies: Experts and Expertise; Social network algorithms and software; Statistical Models for Social Networks, Overview

Forthcoming in: Michael Lynch (Editor), International […]
-
PUBLICATIONS and PUBLICATION POLICY (FLW, September 2016)
PUBLICATIONS AND PUBLICATION POLICY (FLW, September 2016)

General: Why publish? · A publication strategy · Publish where? · Aggregation level · Social impact
Publication types: UGent publication typology · Web of Science · Vabb publication typology · FWO publication typology
Relevant publication channels: How to choose?
-
Sci-Hub Downloads Lead to More Article Citations
THE SCI-HUB EFFECT: SCI-HUB DOWNLOADS LEAD TO MORE ARTICLE CITATIONS

Juan C. Correa*, Faculty of Business Administration, University of Economics, Prague, Czechia ([email protected])
Henry Laverde-Rojas, Faculty of Economics, Universidad Santo Tomás, Bogotá, Colombia ([email protected])
Fernando Marmolejo-Ramos, Centre for Change and Complexity in Learning, University of South Australia ([email protected])
Julian Tejada, Departamento de Psicologia, Universidade Federal de Sergipe ([email protected])
Štěpán Bahník, Faculty of Business Administration, University of Economics, Prague, Czechia ([email protected])

ABSTRACT
Citations are often used as a metric of the impact of scientific publications. Here, we examine how the number of downloads from Sci-Hub as well as various characteristics of publications and their authors predicts future citations. Using data from 12 leading journals in economics, consumer research, neuroscience, and multidisciplinary research, we found that articles downloaded from Sci-Hub were cited 1.72 times more than papers not downloaded from Sci-Hub and that the number of downloads from Sci-Hub was a robust predictor of future citations. Among other characteristics of publications, the number of figures in a manuscript consistently predicts its future citations. The results suggest that limited access to publications may limit some scientific research from achieving its full impact.

Keywords: Sci-Hub · Citations · Scientific Impact · Scholar Consumption · Knowledge dissemination

Introduction
Science and its outputs are essential in daily life, as they help to understand our world and provide a basis for better decisions. Although scientific findings are often cited in social media and shared outside the scientific community [1], their primary use is what we could call "scholar consumption." This phenomenon includes using websites that provide subscription-based access to massive databases of scientific research [2].
-
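The kind of downloads-predict-citations relationship reported in the abstract above can be illustrated with a toy model. This is not the paper's actual model or data: the data here are synthetic, and the fit is a plain log-log least-squares line rather than the authors' full specification.

```python
# Illustrative sketch only: fit a log-log least-squares line,
# citations ~ downloads, on made-up data, in pure Python.
import math
import random

random.seed(1)
downloads = [random.randint(1, 500) for _ in range(200)]
# synthetic "future citations": a power law of downloads plus noise
citations = [max(0, round(0.5 * d ** 0.8 * math.exp(random.gauss(0, 0.3))))
             for d in downloads]

# slope of log(1 + citations) on log(1 + downloads)
xs = [math.log1p(d) for d in downloads]
ys = [math.log1p(c) for c in citations]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
print(f"elasticity of citations w.r.t. downloads ~ {slope:.2f}")
```

With the synthetic exponent of 0.8, the recovered slope lands near that value; on real data the magnitude is an empirical question.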
Thomson Reuters Journal Selection Process
THOMSON REUTERS JOURNAL SELECTION PROCESS
Mariana Boletta, Intellectual Property & Science

SEARCHING IN A WORLD OF INFORMATION EXPLOSION
• 110,000 Science, Social Science, and Arts & Humanities journals
• 10,000+ conferences, yielding 5,000,000+ conference papers
• 50,000+ academic books
• 12,000,000+ patents (per year)
• 80 million+ gene sequences

WHY BE SELECTIVE? Information ≠ Knowledge. Journals must be selected:
• A researcher will read 200+ papers in a year, on average.
• On average, a scientist is capable of reading only 0.4% of the 50,000+ journals in a year.
(Tenopir C. What Scientists Really Need. In: American Association for the Advancement of Science Meeting (AAAS). Washington D.C.; 2005.)

Only journals included in the core collection are evaluated. For decades, Thomson Reuters has carefully evaluated journals for potential inclusion in the database. Most important articles are published in a relatively small number of journals: the GOLD STANDARD, used by 5,000+ institutions in 100+ countries. Our mission is to keep the coverage of journals in Web of Science up to date.

JOURNAL SELECTION: MAIN OBJECTIVES
1. To evaluate and select the best scholarly journal content available today for coverage in Web of Science. As a result, the Web of Science is known as the worldwide source for top-tier scholarly research published in the best international and regional journals.
2. To provide the worldwide publishing community with objective standards useful in building world-class publications. Thomson Reuters has built lasting partnerships with the global scholarly publishing community. We work together to improve the quality of scholarly communication everywhere in the world.
-
Google Scholar: the Democratization of Citation Analysis?
Google Scholar: the democratization of citation analysis?
Anne-Wil Harzing 1 and Ron van der Wal 2
Version November 2007. Accepted for Ethics in Science and Environmental Politics.
Copyright © 2007 Anne-Wil Harzing and Ron van der Wal. All rights reserved.

1 Department of Management, University of Melbourne, Parkville Campus, Parkville, Victoria 3010, Australia. Email: [email protected] Web: www.harzing.com
2 Tarma Software Research, GPO Box 4063, Melbourne, Victoria 3001, Australia

Running head: citation analysis with Google Scholar
Key words: Google Scholar, citation analysis, publish or perish, h-index, g-index, journal impact factor

Abstract
Traditionally, the most commonly used source of bibliometric data is Thomson ISI Web of Knowledge, in particular the (Social) Science Citation Index and the Journal Citation Reports (JCR), which provide the yearly Journal Impact Factors (JIF). This paper presents an alternative source of data (Google Scholar, GS) as well as three alternatives to the JIF to assess journal impact (the h-index, the g-index, and the number of citations per paper). Because of its broader range of data sources, the use of GS generally results in more comprehensive citation coverage in the area of Management and International Business. The use of GS particularly benefits academics publishing in sources that are not (well) covered in ISI. Among these are: books, conference papers, non-US journals, and in general journals in the field of Strategy and International Business.
-
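The three JIF alternatives named in the abstract above are simple to compute from a list of per-paper citation counts. A minimal sketch, with invented counts for illustration:

```python
# h-index, g-index, and citations per paper, from one journal's (or
# author's) per-paper citation counts. The counts below are invented.

def h_index(cites):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(cites, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def g_index(cites):
    """Largest g such that the top g papers together have >= g^2 citations."""
    ranked = sorted(cites, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

cites = [42, 17, 9, 6, 5, 3, 1, 0]
print(h_index(cites))            # 5: five papers have >= 5 citations each
print(g_index(cites))            # 8: top 8 papers total 83 >= 8^2 citations
print(sum(cites) / len(cites))   # 10.375 citations per paper
```

Note how the g-index rewards the highly cited top papers that the h-index ignores: here one 42-citation paper lifts g well above h.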
On the Relation Between the Wos Impact Factor, the Eigenfactor, the Scimago Journal Rank, the Article Influence Score and the Journal H-Index
On the relation between the WoS impact factor, the Eigenfactor, the SCImago Journal Rank, the Article Influence Score and the journal h-index

Ronald Rousseau 1 and the STIMULATE 8 Group 2
1 KHBO, Dept. Industrial Sciences and Technology, Oostende, Belgium. [email protected]
2 Vrije Universiteit Brussel (VUB), Brussels, Belgium

The STIMULATE 8 Group consists of: Anne Sylvia Achom (Uganda), Helen Hagos Berhe (Ethiopia), Sangeeta Namdev Dhamdhere (India), Alicia Esguerra (The Philippines), Nguyen Thi Ngoc Hoan (Vietnam), John Kiyaga (Uganda), Sheldon Miti Maponga (Zimbabwe), Yohannis Martí-Lahera (Cuba), Kelefa Tende Mwantimwa (Tanzania), Marlon G. Ompoc (The Philippines), A.I.M. Jakaria Rahman (Bangladesh), and Bahiru Shifaw Yimer (Ethiopia).

Abstract
Four alternatives to the journal Impact Factor (IF) indicator are compared to find out their similarities. Together with the IF, the SCImago Journal Rank indicator (SJR), the Eigenfactor™ score, the Article Influence™ score and the journal h-index of 77 journals from more than ten fields were collected. Results show that although these indicators are calculated with different methods and even use different databases, they are strongly correlated with the WoS IF and among each other. These findings corroborate results published by several colleagues and show the feasibility of using free alternatives to the Web of Science for evaluating scientific journals.

Keywords: WoS impact factor, Eigenfactor, SCImago Journal Rank (SJR), Article Influence Score, journal h-index, correlations

Introduction
STIMULATE stands for Scientific and Technological Information Management in Universities and Libraries: an Active Training Environment. It is an international training programme in information management, supported by the Flemish Interuniversity Council (VLIR), aiming at young scientists and professionals from developing countries.
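Correlations of the kind this abstract reports can be checked with a rank correlation, which is robust to the indicators being on different scales. A pure-Python Spearman sketch on two hypothetical indicator vectors (the journal values below are invented, not the study's 77-journal data):

```python
# Spearman rank correlation between two journal indicators,
# assuming no tied values (the no-ties formula is used).

def ranks(values):
    """Rank positions, 1 = largest value (assumes no ties)."""
    order = sorted(range(len(values)), key=lambda i: values[i], reverse=True)
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(xs, ys):
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(ranks(xs), ranks(ys)))
    return 1 - 6 * d2 / (n * (n * n - 1))

impact_factors = [12.9, 5.2, 3.1, 2.6, 1.9, 0.8]  # hypothetical journals
sjr_scores     = [6.1, 2.3, 1.4, 1.6, 0.9, 0.3]
print(round(spearman(impact_factors, sjr_scores), 3))  # 0.943
```

The two vectors disagree on only one adjacent pair of journals, so the rank correlation is close to 1, the qualitative pattern the study reports across all five indicators.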
A “Citation Surplus” Should Be Added to the H-Index
From the SelectedWorks of Sergio Da Silva, 2017
Available at: https://works.bepress.com/sergiodasilva/178/

A "Citation Surplus" Should Be Added to the h-Index
Sergio Da Silva, Department of Economics, Federal University of Santa Catarina, Florianopolis, Brazil

Open Access Library Journal, 2017, Volume 4, e3959. ISSN Online: 2333-9721; ISSN Print: 2333-9705. DOI: https://doi.org/10.4236/oalib.1103959
How to cite this paper: Da Silva, S. (2017) A "Citation Surplus" Should Be Added to the h-Index. Open Access Library Journal, 4: e3959. Received: September 21, 2017; Accepted: October 22, 2017; Published: October 25, 2017.

Abstract
The h-index is the largest number h such that h publications have at least h citations. The index reflects both the number of publications and the number of citations per publication. One unperceived deficiency of this metric is that it is Pareto-inefficient. A "citation surplus" would be absent, and thus the h-index would be efficient, for a researcher if all his h papers at or above his h-index received exactly h citations. This inefficiency would not be of great concern if those h papers were normally distributed. However, the rank from top to bottom does not decay exponentially; the decay follows the power law known in the literature as Lotka's law. To remedy this deficiency, I […]

Copyright © 2017 by the author and Open Access Library Inc. This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).
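On one reading of the definition sketched in the abstract above, the "citation surplus" is the number of citations the top-h papers hold in excess of h, so it is zero exactly when each of those h papers has exactly h citations. A minimal sketch with invented counts (the abstract is truncated, so this is an interpretation, not the paper's exact formula):

```python
# Citation surplus of the h-core: excess citations above h among the
# h papers at or above the h-index. Counts below are illustrative.

def h_index(cites):
    """Largest h such that h papers each have at least h citations."""
    ranked = sorted(cites, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def citation_surplus(cites):
    h = h_index(cites)
    return sum(c - h for c in sorted(cites, reverse=True)[:h])

cites = [42, 17, 9, 6, 5, 3, 1, 0]   # h = 5
print(citation_surplus(cites))        # (42-5)+(17-5)+(9-5)+(6-5)+(5-5) = 54
```

The heavy-tailed (Lotka-style) profile used here yields a large surplus: the h-index alone reports 5 while 54 citations in the h-core go uncounted.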