Citation Metrics Overview


Contents

  • Article-Level Citation Metrics
  • Author-Level Metrics
  • Journal-Level Metrics

Article-Level Citation Metrics

These metrics can be used as indicators of the performance of individual articles or publications.

| Metric | Brief Description | Data Source(s) | Access via | Field-Weighted* | Timeframe |
| --- | --- | --- | --- | --- | --- |
| Citation Count | Number of citations to the document captured within the data source. | Scopus, Web of Science, Google Scholar | Scopus, Web of Science, Google Scholar | No | Customisable |
| Field-Weighted Citation Impact (FWCI) | Measure of the citations received by a document relative to the world average of citations received by documents of similar type, year of publication, and research field over a three-year period. An FWCI of 1.00 indicates that the document has been cited as expected based on the global average. | Scopus | Scopus | Yes | 3 years |
| Citation benchmarking (percentile) | Measure of the citations received by a document compared with the average for documents of similar type, date of publication, and field of research within an 18-month period. A document in the 99th percentile is in the top 1% globally. | Scopus | Scopus | Yes | 1.5 years |
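The FWCI and citation-benchmarking rows above are both ratios of observed citations to a field-, year- and document-type-specific expectation. As an illustration only, here is a minimal sketch with hypothetical numbers; it is not Scopus's implementation, just the ratio the table describes.

```python
# Illustrative FWCI-style ratio: actual citations divided by the citations
# expected for documents of the same field, publication year and type.
# The expected value here is a hypothetical input, not a Scopus figure.

def field_weighted_impact(actual_citations: float, expected_citations: float) -> float:
    """Ratio of citations received to citations expected for similar documents."""
    if expected_citations <= 0:
        raise ValueError("expected_citations must be positive")
    return actual_citations / expected_citations

# Example: a paper cited 12 times where comparable papers average 8 citations.
print(field_weighted_impact(12, 8))  # 1.5, i.e. cited 50% more than the field average
```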
Author-Level Metrics

These metrics can be calculated for a whole career or across a specific time period. Many can also be calculated for all subjects or for specific FoR codes.

| Metric | Brief Description | Data Source(s) | Access via | Field-Weighted* | Timeframe |
| --- | --- | --- | --- | --- | --- |
| Scholarly output | The total number of documents attributed to an author. | Scopus, Web of Science, Google Scholar | Scopus, Web of Science, Google Scholar | No | Customisable |
| Total citation count | The total number of citations received by documents attributed to an author. Can be calculated with and without self-citations. | Scopus, Web of Science, Google Scholar | Scopus, Web of Science, Google Scholar | No | Customisable |
| Citations per publication | The average number of citations received by an author per publication. | SciVal, Web of Science | SciVal, Web of Science | No | Customisable |
| Field-weighted citation impact (FWCI) | Measure of the citations received by an author's publications relative to the world average of citations received by publications of similar type, year of publication, and research field. An FWCI of 1.00 indicates that the author's publications have been cited as expected based on the global average. | SciVal | SciVal | Yes | Customisable |
| Category-normalised citation impact (CNCI) | Number of citations received compared to the expected citation rate for publications of similar document type, publication year, and subject area. A CNCI of 1.00 indicates citation performance at par with the world average. | Web of Science, InCites | InCites | Yes | Customisable |
| H-index | Measure of the whole-career productivity and impact of an author's publications, based on the total number of publications and the citations received per publication. | Scopus, Web of Science, Google Scholar | Scopus, Web of Science, Google Scholar | No | Whole career |
| H5-index | The h-index calculated for the past 5 years rather than for a whole career. | Scopus | SciVal | No | 5 years |
| Publications in top percentiles (most-cited journals) | Percentage of outputs in the most-cited journals, ranked by journal-level metrics such as Source Normalized Impact per Paper or SCImago Journal Rank. Can be calculated for the top 1%, 5%, 10% or 25% of journals. | SciVal | SciVal | No | Customisable |
| International collaboration | The volume of an author's scholarly output that features international collaboration, calculated from co-author affiliations. | SciVal | SciVal | No | Customisable |
| i10-index | The number of an author's publications with at least 10 citations. | Google Scholar | Google Scholar | No | Whole career, last 5 years |

Journal-Level Metrics

Rankings based on these metrics can be used to compare journals.

| Metric | Brief Description | Data Source(s) | Access via | Field-Weighted* | Timeframe |
| --- | --- | --- | --- | --- | --- |
| Journal Impact Factor (JIF) | The number of citations in the most recent reported year to items the journal published in the previous two years, divided by the number of citable items published in those two years. | Journal Citation Reports (Web of Science) | Journal Citation Reports, Web of Science | No | 3 years |
| SCImago Journal Rank (SJR) | Calculated from both the number of citations to the journal and a measure of the prestige of the citing publications. | Scopus | Scopus, SCImago Journal and Country Rank | Yes | 4 years |
| Source Normalized Impact per Paper (SNIP) | The ratio of the journal's citations per paper to the expected number of citations for its subject field. | Scopus | Scopus | Yes | 4 years |
| CiteScore | Elsevier metric calculated from the number of citations received within one year to documents published in the previous three years. | Scopus | Scopus | No | 4 years |
| Eigenfactor Score | Calculated from the number of citations in the most recent year to items from the past five years, accounting for the citation rates of the citing journals. | Journal Citation Reports (Web of Science) | Journal Citation Reports | Yes | 6 years |
| 5-Year Impact Factor | The number of citations in the most recent reported year to items the journal published in the previous five years, divided by the number of citable items published in those five years. | Journal Citation Reports (Web of Science) | Journal Citation Reports, Web of Science | No | 5 years |
| % Documents Cited | The percentage of a journal's publications that have been cited at least once. | InCites | InCites | No | Customisable |
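Several of the journal-level metrics above (Journal Impact Factor, 5-Year Impact Factor, CiteScore) are citation-window ratios: citations received in a counting year divided by the items published in a preceding window. The sketch below is illustrative arithmetic with made-up numbers, not the official JCR or Scopus calculation.

```python
# Citation-window ratio used by JIF-style and CiteScore-style metrics.
# All figures below are hypothetical examples for illustration.

def citation_window_ratio(citations_received: int, items_published: int) -> float:
    """Citations counted in the reporting year divided by the number of
    (citable) items published in the preceding publication window."""
    return citations_received / items_published

# JIF-style example: 600 citations in the JCR year to 250 citable items
# published in the previous two years -> 2.4
print(citation_window_ratio(600, 250))

# CiteScore-style example: 900 citations in one year to 400 documents
# published in the previous three years -> 2.25
print(citation_window_ratio(900, 400))
```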
Recommended publications
  • Australian Business Deans Council 2019 Journal Quality List Review Final Report 6 December 2019
    Australian Business Deans Council 2019 Journal Quality List Review, Final Report, 6 December 2019.
    About the Australian Business Deans Council: The Australian Business Deans Council (ABDC) is the peak body of Australian university business schools. Our 38 members graduate one-third of all Australian university students and more than half of the nation's international tertiary students. ABDC's mission is to make Australian business schools better, and to foster the national and global impact of Australian business education and research. ABDC does this by:
      • Being the collective and collegial voice of university business schools
      • Providing opportunities for members to share knowledge and best practice
      • Creating and maintaining strong, collaborative relationships with affiliated national and international peak industry, higher education, professional and government bodies
      • Engaging in strategic initiatives and activities that further ABDC's mission.
    Australian Business Deans Council Inc., UNSW Business School, Deans Unit, Level 6, West Lobby, College Road, Kensington, NSW, Australia 2052. T: +61 (0)2 6162 2970, E: [email protected]
    Table of contents: Acknowledgements; Background and Context; Method and Approach; Outcomes; Beyond the 2019 Review; Appendix 1 – Individual Panel Reports (Information Systems; Economics; Accounting; Finance including Actuarial Studies; Management, Commercial Services and Transport and Logistics (and Other, covering 1599); Marketing and Tourism; Business and Taxation Law); Appendix 2 – Terms
  • New Perspectives: Welcome to the First Research Trends Magazine, Which Accompanies Our 15th Issue of Research Trends Online
    Research Trends magazine: New perspectives. Welcome to the first Research Trends magazine, which accompanies our 15th issue of Research Trends online. Research Trends 15 is a special issue devoted to the role of bibliometric indicators in journal evaluation and research assessment. Over the past 40 years, academic evaluation has changed radically, in both its objectives and methods. Bibliometrics has grown to meet emerging challenges, and new metrics are frequently introduced to measure different aspects of academic performance and improve upon earlier, pioneering work. It is becoming increasingly apparent that assessment and benchmarking are here to stay, and bibliometrics are an intrinsic aspect of today's evaluation landscape, whether of journals, researchers or institutes. This is not uncontroversial, and we have featured many critics of bibliometric analysis in general and of specific tools or their applications in past issues (please visit www.researchtrends.com). The bibliometric community acknowledges these shortcomings and is tirelessly campaigning for better understanding and careful use of the metrics they produce. Therefore, in this issue, we speak to three of the world's leading researchers in this area, who all agree that bibliometric indicators should always be used with expertise and care, and that they should never completely supersede more traditional methods of evaluation, such as peer review. In fact, they all advocate a wider choice of more carefully calibrated metrics as the only way to ensure fair assessment. If you would like to read any of our back issues or subscribe to our bi-monthly e-newsletter, please visit www.researchtrends.com. Kind regards, The Research Trends Editorial Board. Research Trends is a bi-monthly online newsletter providing objective, up-to-the-minute insights into scientific trends based on bibliometric analysis.
  • Research Metrics Definitions
    University Library, Research Metrics Definitions. 1. Author metrics. h-index: The h-index was developed by J.E. Hirsch and published in PNAS in 2005. The h-index is a method of measuring the productivity and impact of a researcher's work and is calculated using the number of publications with a citation number greater than or equal to h. Example: an author with 20 publications that have at least 20 citations each will have an h-index of 20. Your h-index will vary, depending on which research outputs are included in the calculation. Tools: Aurora, Google Scholar, Scopus, Web of Science. i10-index: The number of publications with at least 10 citations. Tool: Google Scholar. Total citations: Tools: Google Scholar, Scopus, Web of Science. Altmetrics: Altmetrics are alternative metrics such as social media mentions, downloads etc. Tools: Altmetric Explorer (aggregated and article-level). 2. Article metrics. Citation Count: The number of times an article has been cited. Tools: Aurora, Google Scholar, Scopus, Web of Science. Field Weighted Citation Impact (FWCI): This shows how well cited this article is when compared to similar articles. An FWCI greater than 1.00 means the article is more cited than expected according to the world average. FWCI takes into account the year of publication, document type, and the disciplines associated with its source. Tools: Scopus. Altmetrics: Altmetrics are alternative metrics such as social media mentions, downloads etc. Tools: Altmetric Explorer, PlumX Metrics (Scopus), journal web pages. 3. Journal metrics. Journal Impact Factor (JIF): The Journal Impact Factor is the average number of times articles from the journal published in the past two years have been cited in the JCR year.
  • 2017 Metrics Tools - Current Strengths and Issues
    2017 Metrics Tools - Current Strengths and Issues. Author evaluation.
    h-index
      Strengths:
        • Combines a measure of quantity (publications) and impact (citations)
        • Provides high objectivity
        • Easily obtained
        • Measures the impact of the researcher rather than of any particular publication
      Known issues:
        • Differs between fields of research; may be more useful for Science/Engineering, less suitable for Education and the Social Sciences
        • Depends on the duration of the researcher's career
        • Does not highlight highly cited papers
        • Difficult to obtain for complete output including all material types
        • Different sources may give different values, e.g. Google Scholar vs Web of Science
        • Authors need to check that the publications counted belong to them
    i10-index
      Strengths:
        • Simple to calculate
      Known issues:
        • Used only in Google Scholar
        • Authors need to check that the publications counted belong to them
    Publication count
      Strengths:
        • Easily obtained - most databases and Google Scholar give publication counts
      Known issues:
        • Does not distinguish between different document types and duration of the publication process
        • Dependent on material types indexed by particular
  • Advanced Scientometrics Workshop
    Advanced Scientometrics Workshop. Nilofar Barahmand, PhD student of Library and Information Sciences, [email protected]. Author-level metrics in Scopus cover productivity (document count) and impact (citation count, with and without self-citations; h-index; Field-Weighted Citation Impact (FWCI)). The h-index is an author-level metric that attempts to measure both the productivity and citation impact of the publications of a scientist or scholar: “A scientist has index h if h of his or her Np papers have at least h citations each and the other (Np – h) papers have ≤ h citations each.” Scopus also allows merging of duplicate author profiles. Field-Weighted Citation Impact (FWCI): if Researcher A (Clinical Medicine) averages 3 citations per paper and Researcher B (Mathematics) averages 1, whose performance is better? FWCI is a ratio that takes into account the differences in research behavior across disciplines: the total citations actually received by an output divided by the total citations that would be expected based on the average for the subject field, year and document type. On that basis Researcher A's FWCI is 1.5 and Researcher B's is 3, so Researcher B performs better relative to field expectations. Article-level metrics in Scopus include citation count and percentile benchmark, Field-Weighted Citation Impact (FWCI), and altmetrics. Altmetrics is a quantitative measure of the quality and quantity of attention that a scholarly work is receiving through social media, citations, and article downloads; scholarly activity covers downloads and posts in common research tools such as Mendeley and CiteULike, while mentions are characterized by rapid, brief engagement.
  • CiteScore Metrics: Creating Journal Metrics from the Scopus Citation Index
    CiteScore metrics: Creating journal metrics from the Scopus citation index. Authors: Chris James 0000-0003-1619-5204, Lisa Colledge 0000-0002-8921-7147, Wim Meester 0000-0001-9350-3448, Norman Azoulay 0000-0003-1453-4882, Andrew Plume 0000-0002-4942-1426. Affiliations: Chris James, Lisa Colledge and Wim Meester: Elsevier, Radarweg 29, 1043 NX Amsterdam, Netherlands; Norman Azoulay: Elsevier, Suite 800, 230 Park Ave, New York, NY 10169, United States; Andrew Plume: Elsevier, The Boulevard, Langford Lane, Kidlington OX5 1GB, UK. Corresponding author: Andrew Plume ([email protected]). Competing interests statement: At the time of this work being conducted, all authors were employees of Elsevier, which owns the Scopus database and produces the CiteScore metrics. NA is now no longer employed by Elsevier.
    Highlights:
      • Journal citation metrics remain important in the age of responsible metrics
      • CiteScore metrics were designed on the principles of transparency, comprehensiveness and currency
      • CiteScore metrics are calculated from snapshots of a dynamic database
      • CiteScore Tracker is updated monthly to show progress towards the static CiteScore value as it builds
      • Feedback after the initial release of CiteScore is acknowledged and resulting developments are discussed
    Abstract: In December 2016, after several years of development, Elsevier launched a set of transparent, comprehensive, current, and freely available journal citation metrics called CiteScore metrics. Most of the CiteScore metrics are static, annual indicators calculated from the dynamic Scopus citation index. In the spirit of recent public statements on the responsible use of metrics, we outline the desirable characteristics of journal citation metrics, discuss how we decided on the cited and citing publication years and document types to be used for CiteScore metrics, and detail the precise method of calculation of each metric.
  • <User Training Session> The World's Largest Abstract and Citation Database
    <User training session> The world's largest abstract and citation database, www.scopus.com. December 2019 @ Gifu University, Elsevier Japan KK. Agenda: What is Scopus?; basic search workflow (find the latest articles about your research topic); find high-impact articles (find important articles based on citations); search for authors (check the author profile of a top researcher); check journal metrics (select a journal to publish with); personal log-in; alerts. What is Scopus? All Scopus-indexed titles are peer-reviewed. As an abstract and citation database, Scopus indexes each article's title, authors, abstract, keywords and references (not the full text) and can rank and sort articles by the number of citations. It complements an OPAC or e-journal list (check availability of a title) and full-text databases (read and search a particular publisher's full-text articles), and covers more than 5,000 publishers worldwide; its purpose is to show what an article is about, what (or whom) it cites, and by what (or whom) it is cited. Basic search workflow: start searching, select search mode, search terms, search fields, date range and
  • CiteScore FAQs
    CiteScore FAQs, December 2016. Contents: 1. About CiteScore and its derivative metrics
      1.1 What is CiteScore?
      1.2 Why don't you include articles-in-press in CiteScore?
      1.3 Why don't you include abstracts in CiteScore?
      1.4 Are documents that were published in supplements and/or special issues included in the CiteScore calculation?
      1.5 How did you calculate the static CiteScore metrics from Scopus.com?
      1.6 What is CiteScore Tracker?
      1.7 Which database are CiteScore metrics calculated from?
      1.8 How do I feed back to Scopus if I find an error in the data?
      1.9 Why is the CiteScore value that I've calculated from Scopus different from the one that's published?
      1.10 How much do CiteScore metrics cost?
      1.11 Can I export CiteScore metrics?
      1.12 Which titles are CiteScore metrics available for?
      1.13 When do new serials get CiteScore metrics?
      1.14 Why doesn't a serial title have CiteScore metrics?
      1.15 How is the CiteScore calculation different from the Impact Factor calculation?
      1.16 Why does this serial title have a lower CiteScore than Impact Factor?
      1.17 Why does this serial title have a higher CiteScore than Impact Factor?
      1.18 Why is this serial title's rank different than its rank with the Impact Factor?
      1.19 Why does CiteScore have a 3 year publication window?
      1.20 Why do you include all types of documents in the CiteScore calculation? (Recognizing the total influence of a serial title on the scholarly community; reducing the ease of manipulation of the metric; disagreements over document type classification)
      1.21 How do I compare between subject fields using CiteScore?
      1.22 What is the baseline so I know whether a CiteScore of 3.64 is good in the serial title's subject field?
      1.23 What are the benefits of CiteScore Tracker?
      1.24 How are self-citations handled?
      1.25 How should I refer to CiteScore metrics?
  • Relationship Between Journal-Ranking Metrics for a Multidisciplinary Set of Journals
    Upeksha Perera and Manjula Wijewickrema. Relationship between Journal-Ranking Metrics for a Multidisciplinary Set of Journals. Abstract: Ranking of scholarly journals is important to many parties. Studying the relationships among various ranking metrics is key to understanding the significance of one metric based on another. This research investigates the relationship among four major journal-ranking indicators: the impact factor (IF), the Eigenfactor score (ES), the h-index (hI), and SCImago Journal Rank (SJR). The authors used 519 journals from a diverse range of subjects and executed a correlation analysis. The results show that a higher correlation exists between impact factor and SCImago Journal Rank (0.796). However, a very strong positive correlation between two metrics does not necessarily imply that journals are ranked according to the same pattern. Introduction: It is important for a researcher to know the rank of the journal to which he or she intends to submit a manuscript for numerous reasons, including to identify the most appropriate journals for publishing, to form an idea about the level of journals, and to identify the publishing outlets that could advance the author's career.1 Bibliometrics, the statistical analysis of journals and similar publications, has introduced journal-ranking indicators for evaluating publication outlets. These metrics are defined by how they reflect the properties of the considered journals. Having an idea about the relationships among different ranking indicators is important to predict the behavior of one metric based on another. For instance, some metrics emphasize popularity, while others indicate prestige.2 Hence, a relationship between the metrics of popularity and prestige could be used to illustrate the performance of one metric in comparison to another.
  • The Bibliometric Literature on Scopus and WoS: The Medicine and Environmental Sciences Categories as Case of Study
    International Journal of Environmental Research and Public Health, Article. The Bibliometric Literature on Scopus and WoS: The Medicine and Environmental Sciences Categories as Case of Study. Mila Cascajares 1, Alfredo Alcayde 1, Esther Salmerón-Manzano 2,* and Francisco Manzano-Agugliaro 1. 1 Department of Engineering, University of Almeria, ceiA3, 04120 Almeria, Spain; [email protected] (M.C.); [email protected] (A.A.); [email protected] (F.M.-A.). 2 Faculty of Law, Universidad Internacional de La Rioja (UNIR), Av. de la Paz, 137, 26006 Logroño, Spain. * Correspondence: [email protected]. Abstract: In a broad sense, science can be understood as the knowledge contained in scientific manuscripts published in scientific journals. Scientific databases index only those journals that reach certain quality standards. Therefore, research and dissemination of scientific knowledge are essential activities for the growth of science itself. The aim of this manuscript is to assess the situation of medicine and environmental sciences among the bibliometric literature and to put it in perspective with the overall bibliometric publications in all scientific fields. The main countries publishing bibliometric manuscripts are China, USA and Spain. The latter country is ranked three out of the top five institutions according to the Scopus and WoS databases. In both databases, the average scientific collaboration of the top 20 institutions offers the same result, 41%. According to Scopus, the main subject categories in which this research falls are social sciences (38%), computer science (26%) and medicine (23%), while the environmental sciences category has 8%. In the analysis of
  • Research Intelligence Research Metrics Guidebook Table of Contents
    Research Intelligence: Research Metrics Guidebook, table of contents.
      1.0 Introduction: Elsevier's approach to research metrics
        1.1 Journal metrics
        1.2 Article-level metrics
        1.3 Author and institutional metrics
      2.0 Scopus: the primary data source for Elsevier's research metrics
        2.1 Scopus content
          2.1.1 Scopus content and SciVal
          2.1.2 Books and SciVal
      3.2 Factors besides performance that affect the value of a metric
        3.2.1 Size
        3.2.2 Discipline
        3.2.3 Publication-type
        3.2.4 Database coverage
        3.2.5 Manipulation
        3.2.6 Time
      4.0 SciVal and research metrics
        4.1 Groups of metrics in SciVal
  • Scholarly Metrics - Qianjin (Marina) Zhang, Engineering & Informatics Librarian, Lichtenberger Engineering Library
    Scholarly Metrics. Qianjin (Marina) Zhang, Engineering & Informatics Librarian, Lichtenberger Engineering Library, [email protected]. Learning objectives: be able to interpret some standard citation metrics with caution and know where to find them; explore three major sources of citation data, including Web of Science (JCR), Scopus and Google Scholar; identify best practices for broadening your scholarly impact. h-index (J. E. Hirsch, 2005): the number of papers (h) published that have each been cited at least h times; it can be applied to any level of aggregation, such as author, institution or journal. Image from Hirsch, J. E. "An Index to Quantify an Individual's Scientific Research Output." Proceedings of the National Academy of Sciences 102.46 (2005): 16569-16572. http://web.cs.ucla.edu/~palsberg/hirsch05.pdf. Worked example: an author has 8 papers that have been cited 34, 29, 20, 15, 7, 6, 5 and 4 times; what is the author's h-index? Ranking the papers by citations, the sixth paper has at least 6 citations but the seventh does not, so the h-index is 6 (example from https://subjectguides.uwaterloo.ca/calculate-academic-footprint/YourHIndex). John's h-index = 10, Sara's h-index = 5. h-index issues: it is not influenced by the citation counts of papers that have already been cited at least h times (the slide compares two citation distributions across 10 papers, 34, 28, 25, 23, 20, 17, 16, 14, 13, 10 versus 62, 50, 40, 32, 5, 4, 3, 2, 1, 1, to make this point); its value can only increase, never decrease, over time; and it advantages senior researchers while disadvantaging early-career researchers. Finding your h-index in Web of Science; finding your h-index in Scopus. i10-index: the number of papers with at least 10 citations. A minimal code sketch of the h-index calculation follows below.
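The h-index arithmetic used in the worked example above is easy to reproduce. The sketch below is illustrative only, written in plain Python with the hypothetical citation counts quoted in the excerpt; it is not code from any of the tools mentioned on this page.

```python
# Minimal h-index sketch. The citation counts are the hypothetical figures
# quoted in the excerpt above, not real author data.

def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([34, 29, 20, 15, 7, 6, 5, 4]))  # 6
print(h_index([20] * 20))                     # 20 papers cited 20 times each -> 20
```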