Citation Analysis: Web of Science, Scopus
Recommended publications
Transitive Reduction of Citation Networks (arXiv:1310.8224v2)
Transitive reduction of citation networks. James R. Clough, Jamie Gollings, Tamar V. Loach, Tim S. Evans. Complexity and Networks Group, Imperial College London, South Kensington Campus, London, SW7 2AZ, United Kingdom. March 28, 2014.

Abstract: In many complex networks the vertices are ordered in time, and edges represent causal connections. We propose methods of analysing such directed acyclic graphs that take into account the constraints of causality and highlight the causal structure. We illustrate our approach using citation networks formed from academic papers, patents, and US Supreme Court verdicts. We show how transitive reduction reveals fundamental differences in the citation practices of different areas, how it highlights particularly interesting work, and how it can correct for the effect that the age of a document has on its citation count. Finally, we transitively reduce null models of citation networks with similar degree distributions and show the difference in degree distributions after transitive reduction, illustrating the lack of causal structure in such models.

Keywords: directed acyclic graph, academic paper citations, patent citations, US Supreme Court citations

1 Introduction. Citation networks are complex networks that possess a causal structure. The vertices are documents ordered in time by their date of publication, and the citations from one document to another are represented by directed edges. However, unlike other directed networks, the edges of a citation network are also constrained by causality: the edges must always point backwards in time. Academic papers, patent documents, and court judgements all form natural citation networks. Citation networks are examples of directed acyclic graphs, which appear in many other contexts, from scheduling problems [1] to theories of the structure of space-time [2].
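The core operation in this paper is transitive reduction of a directed acyclic graph: removing every edge that is already implied by a longer causal path. A minimal sketch of that idea in Python using networkx; the toy citation DAG and paper labels below are invented for illustration, not taken from the paper's data.

```python
import networkx as nx

# Toy citation DAG: an edge X -> Y means "X cites Y" (Y is older than X).
# The direct citation C -> A is redundant because the path C -> B -> A
# already implies a causal connection from C back to A.
G = nx.DiGraph()
G.add_edges_from([
    ("B", "A"),  # B cites the original paper A
    ("C", "B"),  # C cites B
    ("C", "A"),  # C also cites A directly (transitively implied)
])

assert nx.is_directed_acyclic_graph(G)

# Transitive reduction keeps only edges that are not implied by longer paths.
TR = nx.transitive_reduction(G)

print(sorted(G.edges()))   # [('B', 'A'), ('C', 'A'), ('C', 'B')]
print(sorted(TR.edges()))  # [('B', 'A'), ('C', 'B')] -- the edge C -> A is removed
```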
Citation Analysis for the Modern Instructor: an Integrated Review of Emerging Research
CITATION ANALYSIS FOR THE MODERN INSTRUCTOR: AN INTEGRATED REVIEW OF EMERGING RESEARCH. Chris Piotrowski, University of West Florida, USA.

Abstract: While online instructors may be versed in conducting e-Research (Hung, 2012; Thelwall, 2009), today's faculty are probably less familiar with the rapidly advancing fields of bibliometrics and informetrics. One key feature of research in these areas is citation analysis, a rather intricate operational feature available in modern indexes such as Web of Science, Scopus, Google Scholar, and PsycINFO. This paper reviews the recent extant research on bibliometrics within the context of citation analysis. Particular focus is on empirical studies, review essays, and critical commentaries on citation-based metrics across interdisciplinary academic areas. Research that relates to the interface between citation analysis and applications in higher education is discussed. Some of the attributes and limitations of the citation operations of contemporary databases that offer citation searching or cited-reference data are presented. This review concludes that: a) citation-based results can vary widely, contingent on academic discipline or specialty area; b) databases that offer citation options rely on idiosyncratic methods, coverage, and transparency of functions; c) despite initial concerns, research from open access journals is being cited in traditional periodicals; and d) the field of bibliometrics is rather perplexing with regard to functionality, and research is advancing at an exponential pace. Based on these findings, online instructors would be well served to stay abreast of developments in the field.

Keywords: bibliometrics, informetrics, citation analysis, information technology, open resource and electronic journals

INTRODUCTION. In an ever-increasing manner, the educational field is inextricably linked to advances in information technology (Plomp, 2013).
Citation Analysis as a Tool in Journal Evaluation: Journals Can Be Ranked by Frequency and Impact of Citations for Science Policy Studies
Citation Analysis as a Tool in Journal Evaluation. Journals can be ranked by frequency and impact of citations for science policy studies. Eugene Garfield.

[NOTE: The article reprinted here was referenced in the essay which begins on pg. 409 in Volume 1. Its inadvertent omission was discovered too late to include it at its proper location, immediately following the essay.]

As a communications system, the network of journals that play a paramount role in the exchange of scientific and technical information is little understood. Periodically since 1927, when Gross and Gross published their study (1) of references in 1 year's issues of the Journal of the American Chemical Society, pieces of the network have been illuminated by the work of Bradford (2), Allen (3), Gross and Woodford (4), Hooker (5), Henkle (6), Fussler (7), Brown (8), and others (9). Nevertheless, there is still no map of the journal network as a whole. To date, studies of the network and of […]

[…] quinquennially, but the data base from which the volumes are compiled is maintained on magnetic tape and is updated weekly. At the end of 1971, this data base contained more than 27 million references to about 10 million different published items. These references appeared over the past decade in the footnotes and bibliographies of more than 2 million journal articles, communications, letters, and so on. The data base is, thus, not only multidisciplinary, it covers a substantial period of time and, being in machine-readable form, is amenable to extensive manipulation by computer.
How Can Citation Impact in Bibliometrics Be Normalized?
RESEARCH ARTICLE: How can citation impact in bibliometrics be normalized? A new approach combining citing-side normalization and citation percentiles. Lutz Bornmann, Division for Science and Innovation Studies, Administrative Headquarters of the Max Planck Society, Hofgartenstr. 8, 80539 Munich, Germany.

Keywords: bibliometrics, citation analysis, citation percentiles, citing-side normalization

Citation: Bornmann, L. (2020). How can citation impact in bibliometrics be normalized? A new approach combining citing-side normalization and citation percentiles. Quantitative Science Studies, 1(4), 1553–1569. https://doi.org/10.1162/qss_a_00089. Received: 8 May 2020; Accepted: 30 July 2020. Corresponding Author: Lutz Bornmann, [email protected]. Handling Editor: Ludo Waltman.

ABSTRACT: Since the 1980s, many different methods have been proposed to field-normalize citations. In this study, an approach is introduced that combines two previously introduced methods: citing-side normalization and citation percentiles. The advantage of combining two methods is that their advantages can be integrated in one solution. Based on citing-side normalization, each citation is field weighted and, therefore, contextualized in its field. The most important advantage of citing-side normalization is that it is not necessary to work with a specific field categorization scheme for the normalization procedure. The disadvantages of citing-side normalization—the calculation is complex and the numbers are elusive—can be compensated for by calculating percentiles based on weighted citations that result from citing-side normalization. On the one hand, percentiles are easy to understand: they are the percentage of papers published in the same year with a lower citation impact.
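A rough Python sketch of the two building blocks described in the abstract. The weighting rule used here (each citation counted as 1 divided by the citing paper's number of references) is only one simple citing-side variant, and the toy data are invented; the paper's exact procedure may differ.

```python
from collections import defaultdict

# Toy data: each citation is (cited_paper, number_of_references_in_citing_paper).
# Citing-side idea: a citation from a paper with a short reference list counts
# for more than one from a paper that cites hundreds of sources.
citations = [("P1", 10), ("P1", 50), ("P2", 20), ("P3", 5), ("P3", 40), ("P3", 25)]
pub_year = {"P1": 2020, "P2": 2020, "P3": 2020, "P4": 2020}

weighted = defaultdict(float)
for cited, n_refs in citations:
    weighted[cited] += 1.0 / n_refs   # simple illustrative citing-side weight

def percentile(paper, scores, years):
    """Share of same-year papers with strictly lower weighted citation impact."""
    cohort = [p for p in years if years[p] == years[paper]]
    lower = sum(1 for p in cohort if scores.get(p, 0.0) < scores.get(paper, 0.0))
    return 100.0 * lower / len(cohort)

for p in sorted(pub_year):
    print(p, round(weighted.get(p, 0.0), 3),
          f"{percentile(p, weighted, pub_year):.0f}th percentile")
```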
A Citation Analysis of the Quantitative/Qualitative Methods Debate's Reflection in Sociology Research: Implications for Library Collection Development
Georgia State University, ScholarWorks @ Georgia State University, University Library Faculty Publications, January 2004. Part of the Library and Information Science Commons and the Sociology Commons.

Recommended Citation: Swygart-Hobaugh, A. J. (2004). A citation analysis of the quantitative/qualitative methods debate's reflection in sociology research: Implications for library collection development. Library Collections, Acquisitions, and Technical Services, 28, 180–195.

A Citation Analysis of the Quantitative/Qualitative Methods Debate's Reflection in Sociology Research: Implications for Library Collection Development. Amanda J. Swygart-Hobaugh, M.L.S., Ph.D., Georgia State University, [email protected]; Consulting Librarian for the Social Sciences, Russell D. Cole Library, Cornell College, 600 First Street West, Mt. Vernon, IA 52314-1098, [email protected].

NOTICE: This is the author's version of a work that was accepted for publication in Library Collections, Acquisitions, and Technical Services. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms, may not be reflected in this document. Changes may have been made to this work since it was submitted for publication.
How Frequently Are Articles in Predatory Open Access Journals Cited
Publications, Article: How Frequently Are Articles in Predatory Open Access Journals Cited. Bo-Christer Björk 1,*, Sari Kanto-Karvonen 2 and J. Tuomas Harviainen 2. 1 Hanken School of Economics, P.O. Box 479, FI-00101 Helsinki, Finland; 2 Department of Information Studies and Interactive Media, Tampere University, FI-33014 Tampere, Finland; Sari.Kanto@ilmarinen.fi (S.K.-K.); tuomas.harviainen@tuni.fi (J.T.H.); * Correspondence: bo-christer.bjork@hanken.fi. Received: 19 February 2020; Accepted: 24 March 2020; Published: 26 March 2020.

Abstract: Predatory journals are Open Access journals of highly questionable scientific quality. Such journals pretend to use peer review for quality assurance, and spam academics with requests for submissions in order to collect author payments. In recent years predatory journals have received a lot of negative media coverage. While much has been said about the harm that such journals cause to academic publishing in general, an overlooked aspect is how much articles in such journals are actually read and, in particular, cited, that is, whether they have any significant impact on the research in their fields. Other studies have already demonstrated that only some of the articles in predatory journals contain faulty and directly harmful results, while a lot of the articles present mediocre and poorly reported studies. We studied citation statistics over a five-year period in Google Scholar for 250 random articles published in such journals in 2014 and found an average of 2.6 citations per article, and that 56% of the articles had no citations at all. For comparison, a random sample of articles published in the approximately 25,000 peer-reviewed journals included in the Scopus index had an average of 18.1 citations in the same period, with only 9% receiving no citations.
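The comparison in the abstract reduces to two summary statistics per sample: mean citations per article and the share of uncited articles. A tiny sketch of that calculation in Python, with invented citation counts standing in for the Google Scholar data used in the study.

```python
# Hypothetical per-article citation counts for two samples (not the study's data).
predatory_sample = [0, 0, 3, 1, 0, 7, 2, 0, 5, 0]
scopus_sample = [12, 0, 34, 8, 21, 5, 17, 40, 3, 25]

def summarize(citation_counts):
    mean = sum(citation_counts) / len(citation_counts)
    uncited_share = 100.0 * sum(1 for c in citation_counts if c == 0) / len(citation_counts)
    return mean, uncited_share

for name, sample in [("predatory", predatory_sample), ("scopus", scopus_sample)]:
    mean, uncited = summarize(sample)
    print(f"{name}: {mean:.1f} citations/article, {uncited:.0f}% uncited")
```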
Google Scholar, Web of Science, and Scopus
Journal of Informetrics, vol. 12, no. 4, pp. 1160–1177, 2018. https://doi.org/10.1016/J.JOI.2018.09.002

Google Scholar, Web of Science, and Scopus: a systematic comparison of citations in 252 subject categories. Alberto Martín-Martín 1, Enrique Orduna-Malea 2, Mike Thelwall 3, Emilio Delgado López-Cózar 1. Version 1.6, March 12, 2019.

Abstract: Despite citation counts from Google Scholar (GS), Web of Science (WoS), and Scopus being widely consulted by researchers and sometimes used in research evaluations, there is no recent or systematic evidence about the differences between them. In response, this paper investigates 2,448,055 citations to 2,299 English-language highly-cited documents from 252 GS subject categories published in 2006, comparing GS, the WoS Core Collection, and Scopus. GS consistently found the largest percentage of citations across all areas (93%–96%), far ahead of Scopus (35%–77%) and WoS (27%–73%). GS found nearly all the WoS (95%) and Scopus (92%) citations. Most citations found only by GS were from non-journal sources (48%–65%), including theses, books, conference papers, and unpublished materials. Many were non-English (19%–38%), and they tended to be much less cited than citing sources that were also in Scopus or WoS. Despite the many unique GS citing sources, Spearman correlations between citation counts in GS and WoS or Scopus are high (0.78–0.99). They are lower in the Humanities, and lower between GS and WoS than between GS and Scopus. The results suggest that in all areas GS citation data is essentially a superset of WoS and Scopus, with substantial extra coverage.
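The headline numbers above are coverage percentages plus Spearman rank correlations between citation counts from different sources. A small sketch of how such a correlation is computed, with made-up per-document counts in place of the study's data; it assumes scipy is available.

```python
from scipy.stats import spearmanr

# Hypothetical citation counts for the same five documents in three sources.
gs     = [120, 45, 300, 10, 75]
wos    = [ 60, 20, 150,  2, 30]
scopus = [ 50, 30, 180,  4, 25]

# Spearman compares rankings, so it is robust to GS simply finding more
# citations everywhere; what matters is whether the ordering of documents agrees.
rho_gs_wos, _ = spearmanr(gs, wos)          # 1.00: identical ordering
rho_gs_scopus, _ = spearmanr(gs, scopus)    # 0.90: two documents swap ranks
print(f"GS vs WoS:    rho = {rho_gs_wos:.2f}")
print(f"GS vs Scopus: rho = {rho_gs_scopus:.2f}")
```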
A New Index for the Citation Curve of Researchers
A New Index for the Citation Curve of Researchers. Claes Wohlin, School of Engineering, Blekinge Institute of Technology, Box 520, SE-372 25 Ronneby, Sweden. Email: [email protected].

Abstract: The Internet has made it possible to move towards researcher and article impact instead of solely focusing on journal impact. To support citation measurement, several indexes have been proposed, including the h-index. The h-index, however, provides only a point estimate. To address this, a new index is proposed that takes the citation curve of a researcher into account. This article introduces the index, illustrates its use, and compares it to rankings based on the h-index as well as rankings based on publications. It is concluded that the new index provides an added value, since it balances citations and publications through the citation curve.

Keywords: impact factor, citation analysis, h-index, g-index, Hirsch, w-index

1. Introduction. The impact of research is essential. In attempts to quantify impact, different measures have been introduced. This includes the impact factor for journals, introduced 45 years ago by Garfield and Sher [1963] and Garfield [2006]. However, it may be argued that the impact of a journal becomes less important as research articles become available online and can easily be located even if they are not published in a high-impact journal. In general, high-impact articles are more important than the journal itself. Historically, highly cited articles have been published in high-impact journals, but this pattern may change as, for example, conference articles are as easily accessible as journal articles.
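The h-index referenced in the abstract is the largest h such that the researcher has h papers with at least h citations each; it compresses the whole citation curve into one point, which is the limitation the paper addresses. A minimal h-index sketch in Python; the paper's own curve-based index is not specified in this excerpt, so it is not implemented here, and the citation counts are hypothetical.

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i        # the i most-cited papers all have at least i citations
        else:
            break        # the sorted citation curve has dropped below the diagonal
    return h

# Hypothetical citation counts for one researcher's papers.
papers = [25, 18, 12, 9, 6, 3, 1, 0]
print(h_index(papers))   # 5: five papers have at least 5 citations each
```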
Google Scholar: the Democratization of Citation Analysis?
Google Scholar: the democratization of citation analysis? Anne-Wil Harzing 1,*, Ron van der Wal 2. Version November 2007; accepted for Ethics in Science and Environmental Politics. Copyright © 2007 Anne-Wil Harzing and Ron van der Wal. All rights reserved. 1 Department of Management, Faculty of Economics & Commerce, University of Melbourne, Parkville Campus, Parkville, Victoria 3010, Australia; * Email: [email protected]; Web: www.harzing.com. 2 Tarma Software Research, GPO Box 4063, Melbourne, Victoria 3001, Australia. Running head: citation analysis with Google Scholar.

Keywords: Google Scholar, citation analysis, publish or perish, h-index, g-index, journal impact factor

Abstract: Traditionally, the most commonly used source of bibliometric data is Thomson ISI Web of Knowledge, in particular the (Social) Science Citation Index and the Journal Citation Reports (JCR), which provide the yearly Journal Impact Factors (JIF). This paper presents an alternative source of data (Google Scholar, GS) as well as three alternatives to the JIF to assess journal impact (the h-index, the g-index, and the number of citations per paper). Because of its broader range of data sources, the use of GS generally results in more comprehensive citation coverage in the area of Management and International Business. The use of GS particularly benefits academics publishing in sources that are not (well) covered in ISI. Among these are books, conference papers, non-US journals, and, in general, journals in the field of Strategy and International Business.
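Of the three JIF alternatives listed in the abstract, the g-index is the least familiar: it is the largest g such that the g most-cited papers together have at least g squared citations. A short sketch complementing the h-index sketch above, again with invented citation counts.

```python
from itertools import accumulate

def g_index(citations):
    """Largest g such that the top g papers together have >= g^2 citations."""
    g = 0
    for i, total in enumerate(accumulate(sorted(citations, reverse=True)), start=1):
        if total >= i * i:
            g = i
    return g

# Hypothetical citation counts for one journal's (or researcher's) papers.
papers = [25, 18, 12, 9, 6, 3, 1, 0, 0, 0]
print("g-index:", g_index(papers))                       # 8: top 8 papers have 74 >= 64 citations
print("citations per paper:", sum(papers) / len(papers)) # 7.4
```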
The Journal Impact Factor Denominator: Defining Citable (Counted) Items
COMMENTARIES. The Journal Impact Factor Denominator: Defining Citable (Counted) Items. Marie E. McVeigh, MS; Stephen J. Mann.

Over its 30-year history, the journal impact factor has been the subject of much discussion and debate.1 From its first release in 1975, bibliometricians and library scientists discussed its value and its vagaries. In the last decade, discussion has shifted to the way in which impact factor data are used. In an environment eager for objective measures of productivity, relevance, and research value, the impact factor has been applied broadly and indiscriminately.2,3 The impact factor has gone from being a measure of a journal's citation influence in the broader literature to a surrogate that assesses the scholarly value of work published in that journal.

The items counted in the denominator of the impact factor are identifiable in the Web of Science database by having the index field document type set as "Article," "Review," or "Proceedings Paper" (a specialized subset of the article document type). These document types identify the scholarly contribution of the journal to the literature and are counted as "citable items" in the denominator of the impact factor. A journal accepted for coverage in the Thomson Reuters citation database is reviewed by experts6 who consider the bibliographic and bibliometric characteristics of all article types published by that journal (eg, items), which are covered by that journal in the context of other materials in the journal, the subject, and the database as a whole. This journal-specific analysis identifies the journal sections, subsections, or both that contain materials likely to […]
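The denominator question matters because of how the two-year impact factor is computed: citations received in year Y to anything the journal published in years Y-1 and Y-2, divided by the number of citable items (Articles, Reviews, Proceedings Papers) published in those two years. A worked toy example in Python, with invented numbers for a hypothetical journal.

```python
def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Classic two-year JIF: citations in year Y to items from years Y-1 and Y-2,
    divided by the count of 'citable items' published in Y-1 and Y-2."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal: citations counted in 2024 to content published in 2022-2023.
citations_2024_to_2022_2023 = 1500  # numerator counts citations to ALL item types
citable_items_2022_2023 = 400       # denominator counts only Articles/Reviews/Proceedings Papers

print(round(impact_factor(citations_2024_to_2022_2023, citable_items_2022_2023), 2))  # 3.75
```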
Finding Seminal Scientific Publications with Graph Mining
DEGREE PROJECT IN COMPUTER SCIENCE, SECOND LEVEL, STOCKHOLM, SWEDEN 2015. Finding seminal scientific publications with graph mining. Martin Runelöv. KTH Royal Institute of Technology, School of Computer Science and Communication. Master's Thesis at CSC. Supervisor at CSC: Olov Engwall. Supervisor at SICS: John Ardelius. August 16, 2015.

Abstract: We investigate the applicability of network analysis to the problem of finding seminal publications in scientific publishing. In particular, we focus on the network measures betweenness centrality, the so-called backbone graph, and the burstiness of citations. The metrics are evaluated using precision-related scores with respect to gold standards based on fellow programmes and manual annotation. Citation counts, PageRank, and random selection are used as baselines. We find that the backbone graph provides us with a way to possibly discover seminal publications with low citation count, and combining betweenness and burstiness gives results on par with citation count.

Referat (Swedish abstract): Using graph analysis to find seminal scientific articles. This thesis investigates whether analysis of citation graphs can be used to find seminal scientific publications. In particular, betweenness centrality, the so-called backbone graph, and the burstiness of citations are studied. These measures are evaluated using precision scores with respect to gold standards based on fellow programmes and manual annotation. Citation counts, PageRank, and random selection are used for comparison. The results show that the backbone graph can contribute to the possible discovery of seminal publications with a low number of citations, and that a combination of betweenness and burstiness gives results on par with those obtained by counting citations.
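Betweenness centrality, one of the measures evaluated in the thesis, scores a node by how many shortest paths between other nodes pass through it. A minimal sketch on a toy citation graph using networkx; the graph is invented, and the thesis's backbone-graph and burstiness computations are not reproduced here.

```python
import networkx as nx

# Toy citation graph: an edge u -> v means "u cites v".
G = nx.DiGraph()
G.add_edges_from([
    ("E", "C"), ("D", "C"),   # two newer papers cite C
    ("C", "A"), ("C", "B"),   # C builds on two older papers A and B
    ("E", "D"),
])

# Fraction of shortest paths between other node pairs that pass through each node.
bc = nx.betweenness_centrality(G, normalized=True)
for paper, score in sorted(bc.items(), key=lambda kv: -kv[1]):
    print(paper, round(score, 3))
# C scores highest: it bridges the newer papers and the older literature.
```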
Mining Citation Information from CiteSeer Data
Mining citation information from CiteSeer data. Dalibor Fiala, University of West Bohemia, Univerzitní 8, 30614 Plzeň, Czech Republic. Phone: 00420 377 63 24 29, fax: 00420 377 63 24 01, email: [email protected].

Abstract: The CiteSeer digital library is a useful source of bibliographic information. It allows for retrieving citations, co-authorships, addresses, and affiliations of authors and publications. In spite of this, it has been relatively rarely used for automated citation analyses. This article describes our findings after extensively mining the CiteSeer data. We explored citations between authors and determined rankings of influential scientists using various evaluation methods including citation and in-degree counts, HITS, PageRank, and its variations based on both the citation and collaboration graphs. We compare the resulting rankings with lists of computer science award winners and find that award recipients are almost always ranked high. We conclude that CiteSeer is a valuable, yet not fully appreciated, repository of citation data and is appropriate for testing novel bibliometric methods.

Keywords: CiteSeer, citation analysis, rankings, evaluation.

Introduction. Data from CiteSeer have been surprisingly little explored in the scientometric literature. One of the reasons for this may have been fears that the data gathered in an automated way from the Web are inaccurate: incomplete, erroneous, ambiguous, redundant, or simply wrong. Also, the uncontrolled and decentralized nature of the Web is said to simplify manipulating and biasing Web-based publication and citation metrics. However, there have been a few attempts at processing the CiteSeer data, which we will briefly mention. Zhou et al.
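One of the ranking methods named in the abstract is PageRank applied to the author citation graph. A small sketch with networkx, using an invented author-level graph rather than the CiteSeer data, just to show the mechanics.

```python
import networkx as nx

# Toy author citation graph: an edge u -> v means "author u cites author v's work".
# Edge weights could carry citation counts between author pairs; they default to 1 here.
G = nx.DiGraph()
G.add_edges_from([
    ("Alice", "Bob"), ("Carol", "Bob"), ("Dave", "Bob"),
    ("Bob", "Carol"), ("Dave", "Carol"), ("Eve", "Alice"),
])

# Standard PageRank with damping factor 0.85; authors cited by other
# well-cited authors float to the top of the ranking.
ranks = nx.pagerank(G, alpha=0.85)
for author, score in sorted(ranks.items(), key=lambda kv: -kv[1]):
    print(f"{author}: {score:.3f}")
```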