Webometrics and Altmetrics: Home Birth vs. Hospital Birth


Michael Thelwall

DOI 10.1515/9783110308464-019, © 2020 Michael Thelwall, published by De Gruyter. This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs 4.0 License.

1 Introduction

Almost two decades ago it became apparent that scientometric techniques that had been developed for citation databases could also be applied to the web. This triggered two reactions: optimism and pessimism. On the one hand, a number of optimistic articles were written by Ray Larson, Tomas Almind, Josep-Manuel Rodríguez-Gairín, Peter Ingwersen, Blaise Cronin, Judit Bar-Ilan, Isidro Aguillo, Lennart Björneborn, Alastair Smith, and Ronald Rousseau that described or developed new ways in which the web could be used for scientometrics. On the other hand, more critical articles were written by Anthony van Raan and Leo Egghe that focused on the problems that webometric methods were likely to face. This chapter argues that these two approaches represented different theoretical perspectives on scientometrics and that time has shown both to be valid. It concludes by contrasting the academic literature that gave birth to webometrics with the coherent set of goals that announced altmetrics on a public website.

2 The Birth of Webometrics

Webometrics is an information science research field concerned with quantitative analyses of web data for various purposes. It was born from scientometrics, the quantitative analysis of science, with the realization that, using commercial search engines, some standard scientometric techniques could, and perhaps should, be applied to the web. The basic idea for webometrics apparently dawned on several people at almost the same time (Aguillo, 1998; Almind & Ingwersen, 1997; Cronin, Snyder, Rosenbaum, Martinson, & Callahan, 1998; Larson, 1996; Rodríguez-Gairín, 1997), although the most developed concept was that of Almind and Ingwersen (1997), who explicitly compared the web to traditional citation indexes and coined the term webometrics.

The idea driving the birth of webometrics was that many scientometric studies had analyzed information about sets of scientific documents extracted from a publication database or citation index, and that the same methods could be adapted for the web. For example, the productivity of one or more departments could be assessed by counting the number of documents they produced, as recorded in an international scientific database, but on the web the number of web pages they produced could also be counted, either directly or using an appropriate search engine query. Perhaps more usefully, evaluative scientometric studies often assessed the impact of the documents produced by one or more researchers by counting citations to them. On the web, the impact of collections of documents could be assessed to some extent by counting the number of hyperlinks pointing to them (Ingwersen, 1998). In addition, new types of study of scientific impact were possible with the web because of the increasingly many activities that were recorded online, such as education, public engagement, and talks. Hence, for example, it might be possible to trace the wider impact of individual academics by counting and analysing how often they were mentioned online (Cronin et al., 1998).
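The counting logic just described (page counts as a productivity indicator, inlink counts as an impact indicator) can be illustrated with a toy example. The sketch below is not from the chapter; it simply tallies hyperlinks pointing at each target site in a small invented crawl, with all URLs made up for illustration.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical mini-crawl: (source page, list of outlink URLs). In a real
# webometric study these would come from a crawler or search engine queries.
crawl = [
    ("http://depta.example.ac.uk/index.html",
     ["http://journal.example.org/paper1", "http://deptb.example.ac.uk/"]),
    ("http://blog.example.com/post",
     ["http://depta.example.ac.uk/paper.pdf", "http://journal.example.org/paper1"]),
    ("http://deptb.example.ac.uk/links.html",
     ["http://depta.example.ac.uk/", "http://journal.example.org/paper2"]),
]

# Count inlinks per target site (hostname), the unit used in many link analyses.
inlinks = Counter()
for source, outlinks in crawl:
    source_site = urlparse(source).netloc
    for url in outlinks:
        target_site = urlparse(url).netloc
        if target_site != source_site:  # ignore self-links within a site
            inlinks[target_site] += 1

for site, count in inlinks.most_common():
    print(f"{site}: {count} inlink(s)")
```

Counting at the level of whole sites rather than individual pages mirrors the aggregation used in many early link analysis studies.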
There were two different reactions to the emergence of webometrics: enthusiasm and scepticism. The enthusiastic perspective described the range of studies that might be possible with webometrics and concentrated on its potential applications and strengths (Björneborn & Ingwersen, 2001; Borgman & Furner, 2002; Cronin, 2001; Leydesdorff & Curran, 2000); the critical perspective instead emphasized the problems inherent in web data and focused on the limitations that webometric methods would necessarily face (Björneborn & Ingwersen, 2001; Egghe, 2000; van Raan, 2001). Drawing from both strands, webometrics emerged as an empirical field that sought to identify limitations with web data, to develop methods to circumvent these limitations, and to evaluate webometric methods through comparisons with related offline sources of evidence (e.g., comparing counts of hyperlinks to journal websites with counts of citations to the associated journals), when possible (Bar-Ilan, 1999; Rousseau, 1999; Smith, 1999).
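A typical validation exercise of the kind just mentioned correlates inlink counts with citation counts for the same set of journals. The numbers below are invented purely to show the procedure; only the use of Pearson and rank-based Spearman correlations is the point.

```python
from scipy.stats import pearsonr, spearmanr

# Invented data: for ten journals, the number of hyperlinks pointing to each
# journal's website and the number of citations to its articles.
inlink_counts = [120, 45, 300, 80, 15, 210, 60, 95, 180, 30]
citation_counts = [950, 310, 2100, 720, 90, 1600, 400, 640, 1300, 150]

r, r_p = pearsonr(inlink_counts, citation_counts)
rho, rho_p = spearmanr(inlink_counts, citation_counts)

print(f"Pearson r = {r:.2f} (p = {r_p:.3f})")
print(f"Spearman rho = {rho:.2f} (p = {rho_p:.3f})")
# Web count data are usually highly skewed, so the rank-based Spearman
# correlation is often reported alongside (or instead of) Pearson's r.
```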
3 Webometric Theories

Like its mother field, scientometrics, webometrics has not succeeded in creating a single unifying theory for its main tasks, but it has produced a few theoretical contributions and one specific new theory. Within scientometrics, a detailed theory to deal with the practice of citation analysis has long been needed (Cronin, 1981), but has not been produced. Instead, practicing scientometricians draw upon a range of different theories (e.g., Merton, 1973; de Solla Price, 1976) and empirical evidence (Oppenheim & Renn, 1978) about why people cite and the biases in citation analysis (e.g., Cronin, 1984) in order to effectively process and interpret citation data (van Raan, 1998). Within webometrics and altmetrics there are speculations about why people cite in the web as well as some empirical evidence, but both are weaker than for citation analysis.

Although early webometric papers were mainly quite theoretical in terms of discussing the potential for webometrics, the first major clearly articulated theoretical contribution to the mature field was an article that gave it a formal definition as "the study of the quantitative aspects of the construction and use of information resources, structures and technologies on the Web drawing on bibliometric and informetric approaches" (Björneborn, 2004, p. vii), placed the name of the field within a research taxonomy, systematically named the main objects of study for link analysis, and introduced a standard diagram style for link networks (Björneborn & Ingwersen, 2004).

While no article has attempted to create a unifying theory of hyperlinks (or any other aspect of webometrics), a third article proposed a unifying "theoretical framework" for link analysis by specifying the stages that an effective link analysis would have to include, as follows:
1. Link interpretation is required in any link analysis exercise if conclusions are to be drawn about underlying reasons for link creation. An exception would be made for evaluative link analysis if it could be consistently demonstrated that inlink counts correlate with the phenomenon desired to be evaluated.
2. No single method for link interpretation is perfect. Method triangulation is required, ideally including a direct method and a correlation testing method.
3. Fundamental problems, including the "rich get richer" property of link creation and web dynamics, mean that definitive answers cannot be given to most research questions. As a result, research conclusions should always be expressed cautiously.
4. Extensive interpretation exercises are not appropriate because of the reasons given in point 3 above. (Thelwall, 2006)

The purposes of the above theoretical framework were to create a consensus about the minimum requirements for a published link analysis study and to ensure that the conclusions drawn from future studies would always be tied to the evidence provided in them. In this sense it copies the earlier work of van Raan (1998). Although no new link analysis theories have been proposed since the above, more powerful link analysis methods have been developed (e.g., Seeber, Lepori, Lomi, Aguillo, & Barberio, 2012) and the field has continued to progress.

Within webometrics, link analysis' younger sibling is search engine evaluation, the analysis of properties of commercial web search engines. This area of study was conceived from its parent fields, scientometrics and informetrics, when it became apparent that link analysis could not survive alone because the results that web search engines gave for queries were inconsistent and sometimes illogical. Search engine evaluation studies attempted to cure this childhood disease by identifying strategies for extracting information that was as reliable as possible (e.g., Bar-Ilan, 2001; Bar-Ilan, 2004), such as by conducting multiple searches and aggregating the results (Rousseau, 1999). In addition, by understanding and mapping the extent of the problem of search engine reliability (e.g., Vaughan & Thelwall, 2004), the results of link analysis studies could be interpreted with an appropriate degree of caution. From a theoretical perspective, this is a very odd type of information science research because the object of study is a computer system (Google, Bing, etc.) created by a small number of humans (mainly computer scientists) that is, in theory, perfectly understandable to researchers. Nevertheless, the exact details of the functioning of all the major commercial search engines are proprietary secrets, thus necessitating empirical analysis.

This area of research has generated no formal theories at all, perhaps because it seems illogical to theorize about a very small number of complex computer systems. To fill this gap, here are some new theoretical hypotheses about commercial search engines. These are consistent with the history of search engines over the past decade but are clearly not directly testable until one of them fails:
– Incompleteness hypothesis: No major commercial search engine will ever be designed to give an accurate and complete set of
Recommended publications
  • Altmetrics of “Altmetrics” Using Google Scholar, Twitter, Mendeley, Facebook, Google-plus, CiteULike, Blogs and Wiki
    Pre-print version. Saeed-Ul Hassan and Uzair Ahmed Gillani, Information Technology University, 346-B Ferozepur Road, Lahore (Pakistan). Abstract: We measure the impact of the “altmetrics” field by deploying altmetrics indicators using data from Google Scholar, Twitter, Mendeley, Facebook, Google-plus, CiteULike, blogs and wikis during 2010-2014. To capture the social impact of scientific publications, we propose an index called the alt-index, analogous to the h-index. Across the deployed indices, our results show a high correlation among the indicators that capture social impact. While we observe a medium Pearson correlation (ρ = .247) between the alt-index and the h-index, a relatively high correlation is observed between social citations and scholarly citations (ρ = .646). Interestingly, we find a high turnover of social citations in the field compared with traditional scholarly citations, i.e. social citations are 42.2% more numerous than traditional citations. Social media such as Twitter and Mendeley appear to be the most effective channels of social impact, followed by Facebook and Google-plus. Overall, altmetrics appear to be working well in the field of “altmetrics”. Keywords: Altmetrics, Social Media, Usage Indicators, Alt-index. Introduction: In the scholarly world, altmetrics are gaining popularity as a supplement and/or alternative to traditional citation-based evaluation metrics such as the impact factor, h-index, etc. (Priem et al., 2010). The concept of altmetrics was initially proposed in 2010 as a generalization of article-level metrics and has its roots in the #altmetrics hashtag (McIntyre et al., 2011).
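The alt-index mentioned in the abstract above is described as analogous to the h-index, so one plausible reading is: the largest number h such that h of a unit's outputs each have at least h social citations. The sketch below implements that reading on invented counts; it is an interpretation for illustration, not the authors' code.

```python
def h_style_index(counts):
    """Largest h such that at least h items each have a count of at least h."""
    ordered = sorted(counts, reverse=True)
    h = 0
    for rank, count in enumerate(ordered, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Invented per-paper counts for a single author.
scholarly_citations = [25, 14, 9, 6, 3, 2, 1, 0]   # traditional citations
social_citations = [40, 18, 11, 7, 5, 2, 1, 1]     # e.g. tweets + readers + shares

print("h-index  :", h_style_index(scholarly_citations))  # 4
print("alt-index:", h_style_index(social_citations))     # 5
```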
  • An Initiative to Track Sentiments in Altmetrics
    Halevi, G., & Schimming, L. (2018). An Initiative to Track Sentiments in Altmetrics. Journal of Altmetrics, 1(1): 2. DOI: https://doi.org/10.29024/joa.1. Gali Halevi and Laura Schimming, Icahn School of Medicine at Mount Sinai, US. A recent survey from the Pew Research Center (2018) found that over 44 million people receive science-related information from social media channels to which they subscribe. These include a variety of topics such as new discoveries in health sciences as well as “news you can use” information with practical tips (p. 3). Social and news media attention to scientific publications has been tracked for almost a decade by several platforms which aggregate all public mentions of and interactions with scientific publications. Since the number of comments, shares, and discussions of scientific publications can reach an audience of thousands, understanding the overall “sentiment” towards the published research is a mammoth task and typically involves merely reading as many posts, shares, and comments as possible. This paper describes an initiative to track and provide sentiment analysis for large volumes of social and news media mentions and to label them as “positive”, “negative” or “neutral”. Such labels will enable an overall understanding of the content that lies within these social and news media mentions of scientific publications. Keywords: sentiment analysis; opinion mining; scholarly output. Background: Traditionally, the impact of research has been measured through citations (Cronin 1984; Garfield 1964; Halevi & Moed 2015; Moed 2006, 2015; Seglen 1997). The number of citations a paper receives over time has been directly correlated to the perceived impact of the article.
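The paper above does not publish its pipeline, but the positive/negative/neutral labelling it describes can be approximated with an off-the-shelf lexicon-based scorer such as NLTK's VADER, which was designed for short social media text. The mentions and the ±0.05 threshold below are illustrative assumptions, not the initiative's actual method.

```python
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download
sia = SentimentIntensityAnalyzer()

# Invented social media mentions of a fictional paper.
mentions = [
    "Fantastic study, this could really change clinical practice!",
    "The sample size is tiny and the conclusions are overblown.",
    "New paper out today on altmetrics and sentiment.",
]

def label(text, threshold=0.05):
    score = sia.polarity_scores(text)["compound"]  # ranges from -1 to 1
    if score >= threshold:
        return "positive"
    if score <= -threshold:
        return "negative"
    return "neutral"

for mention in mentions:
    print(f"{label(mention):8s} | {mention}")
```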
  • Piracy of Scientific Papers in Latin America: an Analysis of Sci-Hub Usage Data
    Developing Latin America. Juan D. Machin-Mastromatteo, Alejandro Uribe-Tirado and Maria E. Romero-Ortiz. This article was originally published as: Machin-Mastromatteo, J.D., Uribe-Tirado, A., and Romero-Ortiz, M.E. (2016). Piracy of scientific papers in Latin America: An analysis of Sci-Hub usage data. Information Development, 32(5), 1806-1814. http://dx.doi.org/10.1177/0266666916671080. Abstract: Sci-Hub hosts pirated copies of 51 million scientific papers from commercial publishers. This article presents the site's characteristics, criticizes the perception of it as a de facto component of the Open Access movement, replicates an analysis published in Science using the available usage data but limiting it to Latin America, and presents the implications of the site for information professionals, universities and libraries. Keywords: Sci-Hub, piracy, open access, scientific articles, academic databases, serials crisis. Scientific articles are vital for students, professors and researchers in universities, research centers and other knowledge institutions worldwide. When academic publishing started, academies, institutions and professional associations gathered articles, assessed their quality, collected them in journals, and printed and distributed copies, with the added difficulty of not having digital technologies. Producing journals became unsustainable for some professional societies, so commercial scientific publishers appeared and assumed printing, sales and distribution on their behalf, while academics retained the intellectual tasks. Elsevier, among the first such publishers, emerged to cover operating costs and profit from sales; it is now part of an industry that grew from the process of scientific communication, a 10 billion US dollar business (Murphy, 2016).
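The replication described above is essentially a matter of filtering a download log by country and aggregating counts. The pandas sketch below shows that kind of aggregation on an invented log; the column names and the partial country list are assumptions, not the structure of the actual Sci-Hub data release.

```python
import pandas as pd

# Invented records in the spirit of a download log: one row per download event.
log = pd.DataFrame({
    "doi": ["10.1000/a", "10.1000/a", "10.1000/b", "10.1000/c", "10.1000/b"],
    "country": ["Mexico", "Brazil", "Colombia", "Germany", "Brazil"],
})

# Partial list of countries used to restrict the analysis to Latin America.
latin_america = {"Mexico", "Brazil", "Colombia", "Argentina", "Chile", "Peru"}

la_log = log[log["country"].isin(latin_america)]
downloads_by_country = la_log.groupby("country").size().sort_values(ascending=False)
most_downloaded = la_log["doi"].value_counts().head(10)

print(downloads_by_country)
print(most_downloaded)
```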
  • Research Data Explored: Citations Versus Altmetrics
    Isabella Peters (ZBW Leibniz Information Centre for Economics, Kiel, Germany & Kiel University, Germany), Peter Kraker (Know-Center, Graz, Austria), Elisabeth Lex (Graz University of Technology, Knowledge Technologies Institute, Austria), Christian Gumpenberger and Juan Gorraiz (University of Vienna, Vienna University Library, Dept of Bibliometrics, Austria). Abstract: The study explores the citedness of research data, its distribution over time and how it is related to the availability of a DOI (Digital Object Identifier) in Thomson Reuters' DCI (Data Citation Index). We investigate whether cited research data “impact” the (social) web, reflected by altmetrics scores, and whether there is any relationship between the number of citations and the sum of altmetrics scores from various social media platforms. Three tools are used to collect and compare altmetrics scores, i.e. PlumX, ImpactStory, and Altmetric.com. In terms of coverage, PlumX is the most helpful altmetrics tool. While research data remain mostly uncited (about 85%), there has been a growing trend in citing data sets published since 2007. Surprisingly, the percentage of cited research data with a DOI in DCI has decreased in recent years. Only nine repositories account for research data with DOIs and two or more citations. The number of cited research data with altmetrics scores is even lower (4 to 9%) but shows a higher coverage of research data from the last decade.
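Descriptive statistics like those reported above (share of uncited data sets, citedness by DOI availability, share with altmetric events) reduce to a few grouped aggregations. A minimal pandas sketch on invented records follows; the field names are assumptions for illustration only.

```python
import pandas as pd

# Invented sample of research data records.
data = pd.DataFrame({
    "has_doi": [True, True, False, True, False, False, True, False],
    "citations": [3, 0, 0, 1, 0, 0, 5, 0],
    "altmetric_events": [7, 0, 1, 0, 0, 0, 12, 0],  # summed across platforms
})

data["cited"] = data["citations"] > 0

print(f"Share uncited: {(~data['cited']).mean():.0%}")
print("Citedness by DOI availability:")
print(data.groupby("has_doi")["cited"].mean())
print(f"Share with at least one altmetric event: {(data['altmetric_events'] > 0).mean():.0%}")
```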
  • Overview of the Altmetrics Landscape
    Richard Cave, Director of IT and Computer Operations, Public Library of Science. Published as: Richard Cave, "Overview of the Altmetrics Landscape" (2012). Proceedings of the Charleston Library Conference. http://dx.doi.org/10.5703/1288284315124. Abstract: While the impact of article citations has been examined for decades, the “altmetrics” movement has exploded in the past year. Altmetrics tracks activity on the Social Web and looks at research outputs besides research articles. Publishers of scientific research have enabled altmetrics on their articles, open source applications are available for platforms to display altmetrics on scientific research, and subscription models have been created that provide altmetrics. In the future, altmetrics will be used to help identify the broader impact of research and to quickly identify high-impact research. Altmetrics and Article-Level Metrics: The Eigenfactor score was developed at the University of Washington as an academic research project to rank journals based on a vast network of citations (Eigenfactor.org, http://www.eigenfactor.org/whyeigenfactor.php). The term “altmetrics” was coined by Jason Priem, a PhD candidate at the School of Information and Library Science at the University of North Carolina at Chapel Hill.
  • The Open Access Advantage for Studies of Human Electrophysiology: Impact on Citations and Altmetrics
    Peter E. Clayson (Department of Psychology, University of South Florida, Tampa, FL), Scott A. Baldwin (Department of Psychology, Brigham Young University, Provo, UT), and Michael J. Larson (Department of Psychology and Neuroscience Center, Brigham Young University, Provo, UT). Corresponding author: Peter E. Clayson, Department of Psychology, University of South Florida, 4202 East Fowler Avenue, Tampa, FL, US, 33620-7200. Disclosure: Michael J. Larson, PhD, is the Editor-in-Chief of the International Journal of Psychophysiology. Editing of the manuscript was handled by a separate editor and Dr. Larson was blinded from viewing the reviews or comments as well as the identities of the reviewers. Abstract: Barriers to accessing scientific findings contribute to knowledge inequalities based on financial resources and decrease the transparency and rigor of scientific research. Recent initiatives aim to improve access to research as well as methodological rigor via transparency and openness. We sought to determine the impact of such initiatives on open access publishing in the sub-area of human electrophysiology and the impact of open access on the attention articles received in the scholarly literature and other outlets. Data for 35,144 articles across 967 journals from the last 20 years were examined. Approximately 35% of articles were open access, and the rate of publication of open-access articles increased over time. Open access articles showed 9 to 21% more PubMed and CrossRef citations and 39% more Altmetric mentions than closed access articles. Green open access articles (i.e., author archived) did not differ from non-green open access articles (i.e., publisher archived) with respect to citations and were related to higher Altmetric mentions.
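Percentage differences of the kind reported above are commonly estimated with a count-data regression in which the exponentiated coefficient on an open-access indicator gives a citation rate ratio. The sketch below fits such a model with statsmodels on simulated data; it is a generic illustration, not the authors' analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
open_access = rng.integers(0, 2, n)
# Simulate citation counts whose mean is roughly 15% higher for open access.
citations = rng.poisson(lam=np.exp(1.5 + 0.14 * open_access))
df = pd.DataFrame({"citations": citations, "open_access": open_access})

# A negative binomial family is often preferred for overdispersed citation
# counts; Poisson keeps this sketch short.
model = smf.glm("citations ~ open_access", data=df,
                family=sm.families.Poisson()).fit()

rate_ratio = np.exp(model.params["open_access"])
print(f"Estimated citation rate ratio (open vs closed access): {rate_ratio:.2f}")
```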
  • Open Access Advantage? an Altmetric Analysis of Finnish Research Publications
    Introduction: Altmetrics generally rely on openly available data about mentions of research products, such as scientific articles identified by DOIs. This would suggest that an open access advantage is inherent in altmetrics. An open access advantage has indeed been discovered in earlier research: Wang et al. (2015) found that open access articles in a specific hybrid journal received more social media attention, and Alhoori et al. (2015) came to similar conclusions, although the open access advantage they discovered was lower when factors such as journal, publication year, and citation counts were taken into account. This research in progress investigates the open access advantage in the altmetrics of Finnish research publications and examines disciplinary differences in that advantage. Data and methods: Data about Finnish research publications were retrieved from the Juuli database. The data in Juuli are collected annually from Finnish research organizations. Information from a total of 114,496 publications published from 2012-2014 by authors at 14 Finnish universities was retrieved from Juuli. CrossRef was queried through their API in an effort to add missing DOIs to the data, after which a total of 38,819 publications had a DOI attached to them. These DOIs were used to search the altmetric data provided by Altmetric.com and to retrieve Mendeley readership counts from the Mendeley API. A total of 12,438 Finnish research publications from 2012-2014 had at least one altmetric event. It was discovered that some publications were collaborations between Finnish universities; in our analysis these publications were counted for each participating university, resulting in a total of 13,301 Finnish research publications from 2012-2014 with at least one altmetric event captured by Altmetric.com or one Mendeley reader.
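One step described above is querying the Crossref API to fill in missing DOIs before altmetric data can be matched. The sketch below shows one way to do this against the public Crossref REST API; the simple "take the first hit" matching rule and the example query are simplifying assumptions, and a real study would verify matches more carefully.

```python
import requests

def find_doi(title, author_surname):
    """Ask the public Crossref REST API for a likely DOI for a publication."""
    response = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": f"{title} {author_surname}", "rows": 1},
        timeout=30,
    )
    response.raise_for_status()
    items = response.json()["message"]["items"]
    return items[0]["DOI"] if items else None

# Example lookup (an arbitrary query, not a record from the Finnish data).
print(find_doi("Altmetrics in the wild", "Priem"))
```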
  • The Most Widely Disseminated COVID-19-Related Scientific Publications
    Ji Yoon Moon, Dae Young Yoon, Ji Hyun Hong, Kyoung Ja Lim, Sora Baek, Young Lan Seo and Eun Joo Yun, Department of Radiology, Kangdong Sacred Heart Hospital, College of Medicine, Hallym University, 150, Seongan-ro, Gangdong-gu, Seoul 134-701, Korea. Abstract: The novel coronavirus disease 2019 (COVID-19) is a global pandemic. This study's aim was to identify and characterize the top 100 COVID-19-related scientific publications that had received the highest Altmetric Attention Scores (AASs). Hence, we searched Altmetric Explorer using search terms such as “COVID” or “COVID-19” or “Coronavirus” or “SARS-CoV-2” or “nCoV” and then selected the top 100 articles with the highest AASs. For each article identified, we extracted the following information: the overall AAS, publishing journal, journal impact factor (IF), date of publication, language, country of origin, document type, main topic, and accessibility. The top 100 articles most frequently were published in journals with a high (>10.0) IF (n = 67), were published between March and July 2020 (n = 67), were written in English (n = 100), originated in the United States (n = 45), were original articles (n = 59), and dealt with treatment and clinical manifestations
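The selection and characterization steps described above amount to ranking by Altmetric Attention Score and cross-tabulating article attributes. A minimal pandas sketch on invented records follows; the column names and values are assumptions, not the format exported by Altmetric Explorer.

```python
import pandas as pd

# Invented export: one row per article with its Altmetric Attention Score (AAS).
articles = pd.DataFrame({
    "title": [f"Article {i}" for i in range(500)],
    "aas": [(i * 37) % 5000 for i in range(500)],  # fake scores
    "journal_if_over_10": [i % 3 == 0 for i in range(500)],
    "country": ["US" if i % 2 else "UK" for i in range(500)],
})

top100 = articles.nlargest(100, "aas")

print("Top-100 articles in journals with IF > 10:", int(top100["journal_if_over_10"].sum()))
print(top100["country"].value_counts())
```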
  • Preprints and COVID-19: Analysis of 2020 Pandemic-Related Works from a Single Institution
    Medical Library, Memorial Sloan Kettering Cancer Center. Konstantina Matsoukas, MLIS; Jeanine McSweeney, MLIS; Donna Gibson, MLS; Robin O'Hanlon, MIS; Marina Chilov, MLS; Johanna Goldberg, MSLIS; Lindsay Boyce, MLIS. Objectives: The coronavirus pandemic accelerated the acceptance of preprints by the clinical research community, with the need to rapidly share new research findings even leading to the inclusion of COVID preprints in PubMed/PMC. This occurred even though preprints are not properly vetted, final published versions of works, but rather preliminary scientific research reports that have not yet been evaluated and certified by peer review. To enhance our understanding of health science preprint server adoption, we characterized/analyzed the 2020 research manuscripts related to COVID-19 posted to preprint servers by researchers at our institution. Methods: Thirty-three preprints meeting our inclusion criteria were identified via targeted searching. The preprint server, PubMed/PMC, and journal publisher records were consulted to verify manuscript posting, revision, and publication dates; author and affiliation information; and linkage to the final published versions on the preprint and PubMed/PMC records. [Fig 1. Preprint Server Choice: medRxiv 51.5%; bioRxiv 24.2%; Authorea 6.1%; SSRN 6.1%; arXiv 3.0%; Bull World Health Organ 3.0%; chemRxiv 3.0%; ResearchSquare 3.0%. Fig 2. Preprint Indexing, Publication & Linking: indexed in PubMed 18/33 (54.6% yes, 45.5% no); published as journal articles 19/31 (61.3% yes, 38.7% no). Fig 5. Links between Preprint & Journal Article: (1) preprint record links to journal article DOI; (2) PubMed preprint record links to both; (3) PMC preprint record links to both.]
  • Altmetrics Data Providers: A Meta-Analysis Review of the Coverage of Metrics and Publications (José-Luis Ortega)
    How to cite this article: Ortega, José-Luis (2020). “Altmetrics data providers: A meta-analysis review of the coverage of metrics and publications”. El profesional de la información, v. 29, n. 1, e290107. https://doi.org/10.3145/epi.2020.ene.07. Invited manuscript received on November 20th 2019; accepted on December 10th 2019. José-Luis Ortega (https://orcid.org/0000-0001-9857-1511), Institute for Advanced Social Studies (IESA-CSIC), Joint Research Unit (CSIC-University of Córdoba) Innovation, Knowledge and Transfer, Plaza Camposanto de los Mártires, 7, 14004 Córdoba, Spain. Abstract: The aim of this paper is to review the current and most relevant literature on the use of altmetric providers since 2012. This review is supported by a meta-analysis of the coverage and metric counts obtained by more than 100 publications that have used these bibliographic platforms for altmetric studies. The article is the most comprehensive analysis of altmetric data providers (Lagotto, Altmetric.com, ImpactStory, Mendeley, PlumX, Crossref Event Data) and explores the coverage of publications, social media and events from a longitudinal view. Disciplinary differences were also analysed. The results show that most of the studies are based on Altmetric.com data. This provider is the service that captures the most mentions from social media sites, blogs and news outlets. PlumX has better coverage, counting more Mendeley readers, but captures fewer events. CED has a special coverage of mentions from Wikipedia, while Lagotto and ImpactStory are becoming disused products because of their limited reach.
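Coverage comparisons like those summarized above usually ask, per provider, what share of a DOI sample has at least one recorded event, and how large the counts are where there is coverage. The sketch below computes both from an invented provider-by-DOI table; the provider names come from the abstract, but the data and layout are illustrative assumptions.

```python
import pandas as pd

# Invented event counts: one row per DOI, one column per altmetric data provider.
events = pd.DataFrame({
    "doi": ["10.1/a", "10.1/b", "10.1/c", "10.1/d", "10.1/e"],
    "Altmetric.com": [4, 0, 12, 0, 1],
    "PlumX": [9, 2, 30, 1, 0],
    "ImpactStory": [0, 0, 3, 0, 0],
}).set_index("doi")

coverage = (events > 0).mean().sort_values(ascending=False)  # share of DOIs with >= 1 event
mean_where_covered = events[events > 0].mean()               # average count where covered

print("Coverage (share of DOIs with at least one event):")
print(coverage)
print("Mean event count among covered DOIs:")
print(mean_where_covered)
```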
  • Bibliometrics to Altmetrics: a Changing Trend
    International Journal of Library and Information Studies, Vol. 7(4), Oct-Dec 2017, ISSN: 2231-4911. Ajay Kumar Chaubey, Junior Research Fellow, Department of Library and Information Science, Banaras Hindu University, Varanasi-221005. Abstract: Altmetrics include a much broader spectrum of measurements, such as citation counts, web-based references, article views/downloads, social media mentions, news media mentions, etc. Traditional means of measuring the quality of new knowledge, like the impact factor and h-index, are being made more effective and more meaningful through the addition of new, social media based altmetrics. Altmetrics could reflect an alternative dimension of research performance, close, perhaps, to science popularization and networking abilities, but far from citation impact. Altmetrics are especially encouraging for research fields in the humanities that currently are difficult to study using established bibliometric methods. This paper introduces altmetrics, their uses and their possible implications for libraries. The study is based on a review of published work in the fields of altmetrics, bibliometrics, and library and information science. Keywords: Altmetrics, Social Media, Bibliometrics, Scientometrics. Introduction: The statistical analysis of scientific literature began in 1917 with Cole and Eales' study “The History of Comparative Anatomy”, which examined contributions to the field of anatomy by counting the number of publications produced by different countries. In 1923, Hulme conducted a study entitled “Statistical Analysis of the History of Science”, whose analysis was based on the original entries in the seventeen sections of the English International Catalogue of Scientific Literature. In 1926, A.J.
  • Bibliometrics, Scientometrics, Webometrics/Cybermetrics, Informetrics and Altmetrics - An Emerging Field in Library and Information Science Research
    Dr. P. Chellappandi, Assistant Professor, Department of Library & Information Science, Madurai Kamaraj University, Madurai, Tamil Nadu, India, and C.S. Vijayakumar, Librarian, Sourashtra College, Madurai, and Ph.D. Scholar, Department of Library & Information Science, Madurai Kamaraj University, Madurai, Tamil Nadu, India. Citation: Chellappandi, P., and C.S. Vijayakumar. “Bibliometrics, Scientometrics, Webometrics/Cybermetrics, Informetrics and Altmetrics - An Emerging Field in Library and Information Science Research.” Shanlax International Journal of Education, vol. 7, no. 1, December 2018, pp. 5-8. ISSN: 2320-2653. Received: 11.11.2018; accepted: 29.12.2018; published: 31.12.2018. DOI: https://doi.org/10.5281/ Abstract: This article discusses the features and the use of metrics such as bibliometrics, scientometrics, webometrics, informetrics and altmetrics in the field of Library and Information Science research at present. Introduction: The contemporary research thrust areas of Library and Information Science are based on measurement processes. Present developments in bibliometrics, scientometrics, webometrics and informetrics in Library and Information Science have the capacity to draw on relevant ideas from different fields of knowledge. Scientometric studies of library and information science standardize techniques for understanding the productive patterns of authors. Scientometrics is related to, and has overlapping interests with, bibliometrics, informetrics and webometrics. Bibliometrics is the study of relationships of numbers and patterns in bibliographic data and use, i.e. the number of papers, the growth of literature and patterns of library and database usage.