A Comprehensive Analysis of the Journal Evaluation System in China


A Comprehensive Analysis of the Journal Evaluation System in China

Ying HUANG1,2, Ruinan LI1, Lin ZHANG1,2,*, Gunnar SIVERTSEN3

1 School of Information Management, Wuhan University, China
2 Centre for R&D Monitoring (ECOOM) and Department of MSI, KU Leuven, Belgium
3 Nordic Institute for Studies in Innovation, Research and Education, Tøyen, Oslo, Norway

Abstract

Journal evaluation systems are important in science because they focus on the quality of how new results are critically reviewed and published. They are also important because the prestige and scientific impact of journals are given attention in research assessment and in career and funding systems. Journal evaluation has become increasingly important in China with the expansion of the country's research and innovation system and its rise as a major contributor to global science. In this paper, we first describe the history of and background for journal evaluation in China. Then, we systematically introduce and compare the most influential journal lists and indexing services in China: the Chinese Science Citation Database (CSCD); the Journal Partition Table (JPT); the AMI Comprehensive Evaluation Report (AMI); the Chinese STM Citation Report (CJCR); "A Guide to the Core Journals of China" (GCJC); the Chinese Social Sciences Citation Index (CSSCI); and the World Academic Journal Clout Index (WAJCI). Some other lists published by government agencies, professional associations, and universities are briefly introduced as well. Thereby, the tradition and landscape of the journal evaluation system in China are comprehensively covered. Methods and practices are closely described and sometimes compared with how other countries assess and rank journals.

Keywords: scientific journal; journal evaluation; peer review; bibliometrics

1. Introduction

China is among the many countries where the career prospects of researchers depend, in part, on the journals in which they publish. Knowledge of which journals are considered prestigious and which are of dubious quality is critical to the scientific community for assessing the standing of a research institution, for tenure decisions, grant funding, performance evaluations, etc. The process of journal evaluation dates back to the 1930s, when the British mathematician, librarian, and documentalist Samuel C. Bradford published his study on publications in geophysics and lubrication. The paper presented an empirical law now known as Bradford's law of scattering, along with the concept of core journals (Bradford, 1934). Bradford influenced Eugene Garfield in the USA, who subsequently published a groundbreaking paper on citation indexing, "Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas." According to Garfield (1955), "the citation index ... may help a historian to measure the influence of an article — that is, its 'impact factor'". In the 1960s, Garfield conducted a large-scale statistical analysis of citations in the literature, reaching the conclusion that citations were concentrated in just a few journals, while the many remaining journals accounted for only a few citations each (Garfield, 1963; 1964). Garfield went on to create the Institute for Scientific Information (ISI) and then successively published the Science Citation Index (SCI), the Social Science Citation Index (SSCI), and the Arts and Humanities Citation Index. Assessing the quality of previously published research output is important in all contexts where research assessment takes place, for example, when evaluating the success of research projects and when distributing research funding (Pan Su et al., 2017).
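Bradford's law of scattering, mentioned above, can be stated compactly. As a sketch of the standard formulation: if the journals in a field are ranked by decreasing article yield and divided into a core zone and succeeding zones, each contributing roughly the same number of articles on the subject, then the numbers of journals in the zones grow geometrically:

```latex
% Bradford's law of scattering (zone formulation):
% N_i = number of journals in zone i, each zone yielding
% roughly equal numbers of relevant articles; n is the
% Bradford multiplier for the field.
N_1 : N_2 : N_3 \approx 1 : n : n^2
```

Thus a small "core" of journals supplies as many relevant articles as a much larger periphery, which is the empirical basis for core journal lists of the kind discussed throughout this paper.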
As part of such assessments, evaluating and ranking the quality of the journals in which the output was published has become increasingly important (Mingers and Yang, 2017). Journal evaluations and rankings are used by governments, organizations, universities, schools, and departments to evaluate the quality and quantity of faculty research productivity, for purposes ranging from promotion and tenure to monetary rewards (Black et al., 2017). Even though the merit of using such systems is not universally agreed upon (Dobson, 2014) and is even contested (Lin Zhang et al., 2017), it is widely believed that the rank or citation impact of a journal represents its prestige, influence, and, probably, the difficulty of having a paper accepted for publication in it (Pan Su et al., 2017). Over the past few years, the number of papers published in international journals by Chinese researchers has increased dramatically, to the point that, today, China is one of the world's leading science and technology producers. In tandem, government policies and guidance, especially the call to "publish your best work in your motherland to benefit the local society" proposed by President Xi in 2016¹, are leading more papers to be published in China's domestic journals. With these increases in the numbers of papers and journals, it is an important task to explore the strengths and weaknesses of various methods for evaluating journals and to discuss which journal evaluation systems might be suitable for China's ambitions to contribute both internationally and locally. The journal evaluation system in China was established gradually, beginning with the introduction of foreign journal evaluation theories around 60 years ago. Over the last 30 years, these foreign theories have been adopted, adapted, researched, and vigorously redeveloped (Lan Ma, 2016).
In the past, journal evaluation and selection results were mainly used to help librarians develop their collections and to help readers identify core journals. In recent years, however, journal evaluations and rankings have increasingly been applied to research evaluation and management, i.e., in tenure decisions, grant funding, performance evaluations, etc. (Shu et al., 2020). Many institutions rely increasingly on journal impact factors to evaluate papers and researchers. This is commonly referred to in China as "evaluating a paper based on the journal and ranking list". The higher the journal's rank and impact factor, the higher the expected outcome of the evaluation. In the ever-changing environment of scientific research evaluation, the research and practice of journal evaluation in China is also evolving to meet different needs. Many influential journal evaluation and indexing systems have been established since the 1990s, and their evaluation methods and standards have become increasingly mature. These activities have played a positive role in promoting the development of scientific research and have also helped to improve the quality of academic journals. The aim of this study is to review the progress of journal evaluation in China and present a comprehensive analysis of the current state of the art. Hence, the main body of this article is a comparative analysis of the most influential journal lists in China. The results not only offer a deeper understanding of journal evaluation methods and practices in China but also provide insights relevant to the journal evaluation activities of other countries. Overall, the aim is to make a valuable contribution to improving the theory and practice of journal evaluation and promoting the sustainable and healthy development of journal management and evaluation systems in China.

1 http://www.xinhuanet.com//politics/2016-05/31/c_1118965169.htm
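The journal impact factor invoked in these evaluation practices follows Garfield's two-year definition: citations received in a given year by a journal's items from the two preceding years, divided by the number of citable items it published in those two years. A minimal sketch of that calculation (the function name and the example figures are hypothetical, for illustration only):

```python
def two_year_impact_factor(citations, citable_items, year):
    """Illustrative two-year journal impact factor.

    citations: dict mapping a publication year to the number of
        citations received in `year` by items published that year.
    citable_items: dict mapping a publication year to the number of
        citable items (articles, reviews) published that year.
    """
    cited = citations.get(year - 1, 0) + citations.get(year - 2, 0)
    items = citable_items.get(year - 1, 0) + citable_items.get(year - 2, 0)
    if items == 0:
        raise ValueError("no citable items in the two-year window")
    return cited / items

# Hypothetical journal: in 2020 it drew 120 citations to its 2019
# papers and 80 to its 2018 papers, having published 50 citable
# items in each of those years.
jif = two_year_impact_factor({2019: 120, 2018: 80}, {2019: 50, 2018: 50}, 2020)
print(round(jif, 2))  # 2.0
```

The same window-and-ratio logic underlies several of the Chinese indicators compared later in the paper, which is why differences in journal coverage and citable-item counting translate directly into differences in rankings.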
2. Journal evaluation in China

2.1 A brief history

Journal evaluation in China dates back to the 1960s, with some fairly distinct stages over its development. Zhang Qiyu and Wang Enguang first introduced the Science Citation Index to Chinese readers in 1964 (Yaoming Zhang, 2015). In 1973, Wu Erzhong introduced a core journal list for chemistry; this was the first mention of the concept of a "core journal" (Erzhong Wu, 1973). In 1982, Liansheng Meng (1982) finished his Master's thesis, entitled "Chinese science citation analysis", and then, in 1995, he published the Chinese Science Citation Index (CSCI) with the support of the National Natural Science Foundation of China (NSFC). The main feature of this stage of development was translating international practice in journal evaluation and applying it wholesale to the Chinese context. At the same time, exploring the potential application of laws related to journal evaluation became an important topic for researchers in library and information science. In 1988, Jing Qinshu and Xian Jiaxiu used the "citation method" to identify a list of "Chinese Natural Science Core Journals", which included 104 core Chinese journals in the natural sciences and was recognized as the first Chinese journal list (Jing and Xian, 1988). Around that same time, a few institutions began to undertake journal evaluation activities. For example, the Institute of Scientific and Technical Information of China (ISTIC), commissioned by the Ministry of Science and Technology (formerly the National Scientific and Technological Commission), began in 1987 to analyze the international publications indexed by the Science Citation Index (SCI), the Index to Scientific Reviews (ISR), and the Index to Scientific & Technical Proceedings (ISTP), and from 1989 started selecting domestic scientific journals in order to analyze the publications and citations of scientific journals in China. During this process, 1,189 of the 3,025 scientific journals nationwide were selected as statistical source journals, a list that was adjusted annually thereafter (Qian, 2006). Hence, this second stage