Introduction to Altmetrics


Chapter 1. Introduction to Altmetrics

Robin Chin Roemer and Rachel Borchardt, Altmetrics. Library Technology Reports, July 2015. alatechsource.org

In today's modern era of analytics, electronics, and scholarly competition, metrics are an important part of the everyday lives and workflows of people across the higher education community. From researchers applying for federal grants to faculty members preparing their tenure and promotion files, metrics have become an increasingly visible part of how academics and administrators are expected, if not required, to talk about impact and value. However, just as what it means to do research has changed drastically over the last fifteen years with advances in information technology, so have the qualifications for what constitutes a useful impact metric begun to evolve and expand with changes in scholarly communication. The most significant of these expansions is arguably the development of altmetrics, a strictly twenty-first-century approach to impact measurement that relies heavily on the connection between scholarly activity and the opportunities afforded by the Social Web.

In this Library Technology Report, we introduce the most important features of the current altmetrics movement, from its origins in scholarly communication and citation-based bibliometrics to its recent flourishing in partnership with academic innovators and a growing population of academic librarians. Within each chapter, we highlight key players and issues that have arisen alongside the altmetrics movement, including the uncertainties and opportunities that have alternately stymied and encouraged its acceptance in certain higher education circles. By presenting the facts surrounding the growth and development of altmetrics, particularly as they overlap with the concerns of academic libraries, we seek to give today's library leaders the context necessary to make decisions and take actions pertaining to the future of this quickly changing field of research and practice.

We begin this first chapter with a review of the recent origins of altmetrics, as well as a look at how the altmetrics approach relates to more established practices for measuring scholarly impact, such as citation-based bibliometrics.

Defining Altmetrics

Altmetrics as a term was coined in September 2010 by Jason Priem, a doctoral student at UNC-Chapel Hill's School of Information and Library Science (see figure 1.1).1 A firm believer in the power of online scholarly tools to help researchers filter information and identify relevant sources, Priem was interested in identifying a set of metrics that could describe relationships between the social aspects of the web and the spread of scholarship online. With few terms available to encompass this diverse-yet-specific group of analytics, Priem decided to popularize one of his own making. The result, altmetrics, is a shortened version of the phrase alternative metrics, presumably because it offered scholars an alternative to metrics derived from a purely print-based understanding of scholarly research and communication.

Figure 1.1. The first recorded use of the term altmetrics, in a tweet posted by Jason Priem on September 28, 2010.

For practical purposes, the best-known definition of altmetrics comes from altmetrics.org, a website set up by Priem and three of his colleagues in October 2010 to promote their more detailed Altmetrics Manifesto (see figure 1.2). There, the altmetrics approach is described as "the creation and study of new metrics based on the Social Web for analyzing, and informing scholarship."2 In the years following the release of this resource, however, new questions have arisen about exactly what this definition encompasses and what it actually means to calculate altmetrics in different scholarly contexts. We discuss these issues in the third chapter of this report.

Figure 1.2. The Altmetrics Manifesto, authored by Jason Priem, Dario Taraborelli, Paul Groth, and Cameron Neylon, provided the first comprehensive online description of altmetrics. http://altmetrics.org/manifesto

To better understand the early history of altmetrics, we look now at a few of the more significant events leading up to its development, beginning with the changes in information technology and scholarly communication at work toward the end of the twentieth century.

Development of Altmetrics

As the definition of altmetrics makes clear, one of the first prerequisites for its development was the growth of the Social Web: the part of the Internet focused on social relationships and activities. Between the late 1990s and early 2000s, the texture of the Internet underwent a dramatic shift as innovative toolmakers began offering users more and more ways to create and share original, personal content on the web. Free online journaling platforms such as LiveJournal (figure 1.3) led to an explosion in the number of blogs and bloggers, while early social networking sites such as MySpace and Friendster broadened the scope of online social sharing to include shorter updates, media, and more. By 2004, the year of the first Web 2.0 Conference, the Social Web had officially blossomed from a possible fad into a real and significant part of the Internet.

Figure 1.3. Screenshot of the LiveJournal home page, circa 2000. (Source: Internet Archive)

The technological changes of the late 1990s and early-to-mid 2000s were also important from the perspective of academia, although not entirely in the same ways. For the first time, researchers at colleges and universities were beginning to see the widespread availability of scholarship online. "Big Deals" struck by librarians with certain scholarly publishers resulted in new electronic access to thousands of articles, often from journals previously outside libraries' print collections. This sudden spike in the range and availability of electronic scholarly material quickly altered the ways users searched for and found academic information. In response, most academic libraries continued to pursue bundled subscriptions to scholarly e-journals. At the start of the twenty-first century, however, mounting evidence began to suggest that such deals do little to solve the long-term problem of increasing costs for serials access.

In December 2002, at the height of the serials crisis, the attendees of a small conference in Budapest convened by the Open Society Institute released a short public statement in which they proposed using the Internet to make research literature free for anyone to use "for any … lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself."3 Later known as the Budapest Open Access Initiative, this powerful statement became a founding document of the open-access (OA) movement, for which many libraries and librarians have since become champions.

While the history of the open-access movement is too rich a topic to go into here, it is notable that its emergence helped set the stage for the later development of altmetrics. By emphasizing the power of the Internet as a tool for research, the benefits of rapid research discovery for purposes of innovation, and the positive consequences of openly sharing scholarly content with the public, OA helped encourage deeper connections between libraries, scholars, and makers of alternative platforms for scholarly publishing and networking. Evidence of this can be seen in the type of online scholarly venues that began to grow and thrive in the early 2000s following the articulation of open access, including the Public Library of Science (figure 1.4) and arXiv (figure 1.5), both of which endorse OA values while tracking interactions between objects and users online—that is, alternative impact metrics.

Figure 1.4. The Public Library of Science "Open Access" webpage (www.plos.org/open-access). PLOS is committed to open access and applies the Creative Commons Attribution (CC-BY) license to all the content it publishes.

Figure 1.5. The home page of arXiv.org (http://arxiv.org). ArXiv is an e-print service owned and operated by Cornell University. It specializes in publications from quantitative fields such as physics, mathematics, and computer science.

Perhaps it is for the combination of these reasons that the mid-2000s saw the first true flourishing of both Web 2.0 and "open values" across the spheres of academia and the general public. The year 2004, for instance, […] culture to make the idea of a set of web-based metrics for measuring impact a tempting proposition—not just for scholars, but for publishers, toolmakers, and librarians, too. However, the "alternative" positioning of altmetrics, specifically in relation to citation-based bibliometrics, created an immediate set of obstacles for the movement, obstacles that the field has had to work hard to overcome ever since. For this reason, we take a moment here to briefly examine the relationship between bibliometrics and altmetrics, including how each has been received by proponents of the other over time.

From Bibliometrics to Altmetrics

In contrast to altmetrics, which has emerged as a fully articulated idea only within the last five years, bibliometrics has been around as a formal concept since […]
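The manifesto's phrase "new metrics based on the Social Web" is abstract, but in practice most altmetrics indicators boil down to counting online events around a scholarly object and, in composite indicators, weighting and summing those counts. The sketch below illustrates only that arithmetic; the event types, weights, and counts are hypothetical examples for this report and do not reproduce the scoring scheme of any real altmetrics provider.

```python
# Illustrative sketch of a composite altmetrics indicator.
# All weights and counts are hypothetical, not any provider's real scheme.

WEIGHTS = {
    "tweets": 1,            # quick, low-effort shares
    "blog_posts": 5,        # longer-form engagement
    "news_stories": 8,      # mainstream media pickup
    "mendeley_readers": 3,  # scholarly readership signal
}

def altmetrics_score(events: dict) -> int:
    """Sum each event count multiplied by its weight.

    Event types without a defined weight contribute nothing.
    """
    return sum(WEIGHTS.get(kind, 0) * count for kind, count in events.items())

# Hypothetical attention profile for one article:
article_events = {"tweets": 40, "blog_posts": 2, "mendeley_readers": 10}
print(altmetrics_score(article_events))  # 40*1 + 2*5 + 10*3 = 80
```

The open questions raised above live in exactly these choices: which event types to count, how to weight them, and whether a single number is meaningful across different scholarly contexts.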