Choosing the Right Journal for Your Research: A Comprehensive Guide for Researchers
Total pages: 16. File type: PDF, size: 1020 KB.
Recommended publications
-
Sci-Hub Provides Access to Nearly All Scholarly Literature
Sci-Hub provides access to nearly all scholarly literature. A DOI-citable version of this manuscript is available at https://doi.org/10.7287/peerj.preprints.3100. This manuscript was automatically generated from greenelab/scihub-manuscript@51678a7 on October 12, 2017. Submit feedback on the manuscript at git.io/v7feh or on the analyses at git.io/v7fvJ.

Authors: Daniel S. Himmelstein (ORCID 0000-0002-3012-7446), Department of Systems Pharmacology and Translational Therapeutics, University of Pennsylvania, funded by GBMF4552; Ariel Rodriguez Romero (ORCID 0000-0003-2290-4927), Bidwise, Inc.; Stephen Reid McLaughlin (ORCID 0000-0002-9888-3168), School of Information, University of Texas at Austin; Bastian Greshake Tzovaras (ORCID 0000-0002-9925-9623), Department of Applied Bioinformatics, Institute of Cell Biology and Neuroscience, Goethe University Frankfurt; Casey S. Greene (ORCID 0000-0001-8713-9213), Department of Systems Pharmacology and Translational Therapeutics, University of Pennsylvania, funded by GBMF4552.

PeerJ Preprints | https://doi.org/10.7287/peerj.preprints.3100v2 | CC BY 4.0 Open Access | received 12 Oct 2017, published 12 Oct 2017.

Abstract: The website Sci-Hub provides access to scholarly literature via full-text PDF downloads. The site enables users to access articles that would otherwise be paywalled. Since its creation in 2011, Sci-Hub has grown rapidly in popularity. However, until now, the extent of Sci-Hub's coverage was unclear. As of March 2017, we find that Sci-Hub's database contains 68.9% of all 81.6 million scholarly articles, a figure that rises to 85.2% for articles published in toll-access journals.
Gold Open Access Publishing in Mega-Journals: Developing Countries Pay the Price of Western Premium Academic Output
VU Research Portal record. Citation for published version (APA): Ellers, J., Crowther, T. W., & Harvey, J. A. (2017). Gold Open Access Publishing in Mega-Journals: Developing Countries Pay the Price of Western Premium Academic Output. Journal of Scholarly Publishing, 49(1), 89–102. https://doi.org/10.3138/jsp.49.1.89
Academic Communities: the Role of Journals and Open-Access Mega-Journals in Scholarly Communication
This is a repository copy of "Academic communities: the role of journals and open-access mega-journals in scholarly communication," available from White Rose Research Online: http://eprints.whiterose.ac.uk/134132/ (Published Version). The article is distributed under the terms of the Creative Commons Attribution (CC BY) licence.

Wakeling, S., Spezi, V., Fry, J., Creaser, C., Pinfield, S., & Willett, P. (2019). Academic communities: the role of journals and open-access mega-journals in scholarly communication. Journal of Documentation, 75(1), 120–139. ISSN 0022-0418. https://doi.org/10.1108/JD-05-2018-0067
Motivations, Understandings, and Experiences of Open-Access Mega-Journal Authors: Results of a Large-Scale Survey
Simon Wakeling, Stephen Pinfield, Peter Willett, and Monica Paramita: Information School, University of Sheffield, Regent Court, 211 Portobello, Sheffield, S1 4DP, UK. Claire Creaser and Valérie Spezi: LISU, Centre for Information Management, School of Business and Economics, Loughborough University, Loughborough, LE11 3TU, UK. Jenny Fry: School of the Arts, English and Drama, Loughborough University, Loughborough, LE11 3TU, UK.

Abstract: Open-access mega-journals (OAMJs) are characterized by their large scale, wide scope, open-access (OA) business model, and "soundness-only" peer review. The last of these controversially discounts the novelty, significance, and relevance of submitted articles and assesses only their "soundness." This article reports the results of an international survey of authors (n = 11,883), comparing the responses of OAMJ authors with those of authors publishing in other OA and subscription journals.

Received January 9, 2018; revised June 21, 2018; accepted September 25, 2018. © 2019 The Authors.
QUALIS: The Journal Ranking System Undermining the Impact of Brazilian Science (Rodolfo Jaffé, Instituto Tecnológico Vale, Belém-PA, Brazil)
bioRxiv preprint, doi: https://doi.org/10.1101/2020.07.05.188425; this version posted July 6, 2020. The copyright holder for this preprint (which was not certified by peer review) is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.

QUALIS: The journal ranking system undermining the impact of Brazilian science. Rodolfo Jaffé, Instituto Tecnológico Vale, Belém-PA, Brazil. Email: [email protected]

Abstract: A journal ranking system called QUALIS was implemented in Brazil in 2009, intended to rank graduate programs from different subject areas and promote selected national journals. Since this system uses a complicated suite of criteria (differing among subject areas) to group journals into discrete categories, it could potentially create incentives to publish in low-impact journals ranked highly by QUALIS. Here I assess the influence of the QUALIS journal ranking system on the global impact of Brazilian science. Results reveal a steeper decrease in the number of citations per document since the implementation of QUALIS, compared with the top Latin American countries publishing more scientific articles. All the subject areas making up the QUALIS system showed some degree of bias, with the social sciences usually more biased than the natural sciences. Lastly, the decrease in the number of citations over time proved steeper in a more biased area, suggesting a faster shift towards low-impact journals ranked highly by QUALIS.
Conference Accreditation and Need of a Bibliometric Measure to Distinguish Predatory Conferences
Publications, Viewpoint. Pooyan Makvandi (Centre for Materials Interfaces, Istituto Italiano di Tecnologia, Viale Rinaldo Piaggio 34, 56025 Pontedera, Italy; correspondence: [email protected] or [email protected]), Anahita Nodehi (Department of Statistics, Computer Science, Applications (DiSIA), Florence University, Viale Morgagni 59, 50134 Florence, Italy; [email protected]), and Franklin R. Tay (The Graduate School, Augusta University, Augusta, GA 30912, USA; [email protected]).

Abstract: Academic conferences offer scientists the opportunity to share their findings and knowledge with other researchers. However, the number of conferences is increasing rapidly worldwide, and researchers receive many unsolicited e-mails from conference organizers; these e-mails take time to read and to ascertain their legitimacy. Because not every conference is of high quality, young researchers and scholars need to be able to recognize the so-called "predatory conferences," which make a profit from unsuspecting researchers without the core purpose of advancing science or collaboration. Unlike journals, which possess accreditation indices, there is no appropriate accreditation for international conferences. Here, a bibliometric measure is proposed that enables scholars to evaluate conference quality before attending.

Keywords: conference indicator; conference impact factor; conference accreditation; bibliometric measure

Citation: Makvandi, P.; Nodehi, A.; Tay, F.R. Conference Accreditation and Need of a Bibliometric Measure to Distinguish Predatory Conferences. Publications 2021, 9, 16. https://doi.org/10.3390/publications9020016
Citation Analysis: Web of Science, Scopus
Citation analysis: Web of Science, Scopus. Masoud Mohammadi, Golestan University of Medical Sciences, Information Management and Research Network.

Citation analysis
Citation analysis is the study of the impact, and assumed quality, of an article, an author, or an institution, based on the number of times works and/or authors have been cited by others. It examines the frequency, patterns, and graphs of citations in documents, using the links from one document to another to reveal properties of the documents; a typical aim is to identify the most important documents in a collection. The classic example is citations between academic articles and books. The judgements produced by judges of law to support their decisions refer back to judgements made in earlier cases, so citation analysis is also important in a legal context. Patents provide another example: they cite prior art, that is, earlier patents relevant to the current claim.

Citation databases
Citation databases are databases developed for evaluating publications. They enable you to count citations and check, for example, which articles or journals are the most cited. In a citation database you can see who has cited an article and how many times an author has been cited, and you can list all articles citing the same source. The most important citation databases are Web of Science, Scopus, and Google Scholar.

Web of Science
Web of Science is owned and produced by Thomson Reuters. WoS is composed of three databases containing citations from international scientific journals: the Arts & Humanities Citation Index (AHCI), the Social Sciences Citation Index (SSCI), and the Science Citation Index (SCI). Its journal coverage aims to include the best journals of all fields.
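The citation counting these databases perform can be illustrated in a few lines of code. A minimal sketch, assuming a hypothetical export of (citing, cited) document pairs; the document IDs are invented for illustration:

```python
from collections import Counter

def citation_counts(citations):
    """Count how often each document is cited, given an iterable
    of (citing, cited) pairs such as a database export."""
    return Counter(cited for _citing, cited in citations)

# Hypothetical citation links (citing document -> cited document)
links = [
    ("A", "C"), ("B", "C"), ("D", "C"),
    ("A", "B"), ("D", "B"),
    ("C", "E"),
]

counts = citation_counts(links)
most_cited = counts.most_common(2)
print(most_cited)  # [('C', 3), ('B', 2)]
```

Real citation databases apply the same idea at a vastly larger scale; the hard part in practice is matching and deduplicating the reference strings, not the counting itself.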
Have the “Mega-Journals” Reached the Limits to Growth?
Have the “mega-journals” reached the limits to growth? Bo-Christer Björk, Hanken School of Economics, Department of Management and Organisation, Helsinki, Finland. Submitted 7 April 2015; accepted 7 May 2015; published 26 May 2015.

Abstract: A "mega-journal" is a new type of scientific journal that publishes freely accessible articles which have been peer reviewed for scientific trustworthiness, but leaves it to the readers to decide which articles are of interest and importance to them. In the wake of the phenomenal success of PLOS ONE, several other publishers have recently started mega-journals. This article presents the evolution of mega-journals since 2010 in terms of article publication rates. The fastest growth seems to have ebbed out at around 35,000 annual articles for the 14 journals combined. Acceptance rates are in the range of 50–70%, and speed of publication is around 3–5 months. Common features of mega-journals are alternative impact metrics, easy reusability of figures and data, post-publication discussions, and portable reviews from other journals.

Subjects: Science and Medical Education, Science Policy. Keywords: Open access, Open access journal.

Introduction: The scientific scholarly journal emerged in the 17th century and evolved as an institution into its current form in the mid-20th century. A traditional scientific journal appears as a number of regular issues; the format of the articles is more or less standardized depending on the subject field; and articles are accepted and edited following peer-review routines that most academics become intimately familiar with during their careers as authors, reviewers, and editors.
Introduction to Bibliometrics
Introduction to Bibliometrics: a new service of the university library to support academic research. Carolin Ahnert and Martin Bauschmann, Chemnitz, 16.05.17. This work is licensed under a Creative Commons Attribution 4.0 International License.

What is bibliometrics? A short definition: "Bibliometrics is the statistical analysis of bibliographic data, commonly focusing on citation analysis of research outputs and publications, i.e. how many times research outputs and publications are being cited" (University of Leeds, 2014). Bibliometrics is thus the quantitative analysis and visualisation of scientific research output based on publication and citation data.

A general classification distinguishes descriptive from evaluative bibliometrics:

- Descriptive bibliometrics. Cognition interests: identification of relevant research topics or thematic trends, identification of key actors, and exploration of cooperation patterns and communication structures. Examined constructs: interdisciplinarity, internationality, topical clusters, research fronts/knowledge bases. Methods/indicators: social network analysis (co-author, co-citation, and co-word networks), etc.
- Evaluative bibliometrics. Cognition interests: evaluation of research performance (groups of researchers, institutes, universities, countries) and assessment of publication venues (especially journals). Examined constructs: productivity, visibility, impact, and (arguably) quality. Methods/indicators: number of articles, citation rate, h-index.
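The evaluative indicators listed above are easy to compute once citation counts are in hand. As one example, the h-index (the largest h such that an author has h papers with at least h citations each) can be sketched in a few lines; the citation counts below are invented for illustration:

```python
def h_index(citations):
    """h-index: the largest h such that the author has h papers
    with at least h citations each."""
    h = 0
    # Walk the counts from most- to least-cited; the h-index is the
    # last rank at which the paper's citations still reach its rank.
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for two authors' papers
print(h_index([10, 8, 5, 4, 3]))  # 4
print(h_index([25, 8, 5, 3, 3]))  # 3
```

The second example shows a known property of the metric: one very highly cited paper (25 citations) does not raise the h-index, which is why it is usually read alongside total citation counts.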
Understanding Research Metrics

Introduction. Discover how to monitor your journal's performance through a range of research metrics. This page will take you through the "basket of metrics", from Impact Factor to usage, and help you identify the right research metrics for your journal.
What are research metrics?
Research metrics are the fundamental tools used across the publishing industry to measure performance at both journal and author level. For a long time, the only tool for assessing journal performance was the Impact Factor (more on that in a moment). Now there is a range of different research metrics available, and this "basket of metrics" is growing every day, from the traditional Impact Factor to Altmetrics, h-index, and beyond. But what do they all mean? How is each metric calculated? Which research metrics are the most relevant to your journal? And how can you use these tools to monitor your journal's performance? For a quick overview, download our simple guide to research metrics, or keep reading for a more in-depth look at what's in the "basket of metrics" and how to interpret it.

Citation-based metrics: the Impact Factor
The Impact Factor is probably the most well-known metric for assessing journal performance. Designed to help librarians with collection management in the 1960s, it has since become a common proxy for journal quality. The Impact Factor is a simple research metric: it is the average number of citations received by articles in a journal within a two-year window.
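That two-year average can be written down directly: citations received in year Y to items published in the two preceding years, divided by the number of citable items published in those years. A minimal sketch, with invented numbers for a hypothetical journal:

```python
def impact_factor(cites_to_y1, cites_to_y2, items_y1, items_y2):
    """Impact Factor for year Y: citations received in Y to items
    published in years Y-1 and Y-2, divided by the number of
    citable items published in Y-1 and Y-2."""
    return (cites_to_y1 + cites_to_y2) / (items_y1 + items_y2)

# Hypothetical journal: 640 citations in Y to the 150 items from Y-1,
# 480 citations in Y to the 130 items from Y-2
print(impact_factor(640, 480, 150, 130))  # 4.0
```

Note that the official calculation also depends on what counts as a "citable item" (articles and reviews, but not editorials), which is one reason published Impact Factors can be hard to reproduce from raw citation data.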
Scientometric Indicators and Their Exploitation by Journal Publishers
Scientometric indicators and their exploitation by journal publishers. Pranay Parsuram (student number 2240564). Master's Thesis, Book and Digital Media Studies. Supervisor: Prof. Fleur Praal. Second reader: Dr. Adriaan van der Weel. Date of completion: 12 July 2019. Word count: 18,177 words.

Contents:
1. Introduction
2. Scientometric Indicators
   2.1. Journal Impact Factor
   2.2. h-Index
   2.3. Eigenfactor™
   2.4. SCImago Journal Rank
   2.5. Source Normalized Impact per Paper
   2.6. CiteScore
   2.7. General Limitations of Citation Count
3. Conceptual Framework
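Several of the indicators in this outline (SNIP, SCImago Journal Rank, and the Field Citation Ratio discussed later in this listing) normalize raw citation counts against a field baseline, since citation habits differ sharply between disciplines. The sketch below shows only that normalization idea, not any metric's actual definition, which involves citation windows and database-coverage corrections; all numbers are invented:

```python
def citations_per_paper(citations, papers):
    """Raw citations-per-paper ratio."""
    return citations / papers

def normalized_impact(journal_cites, journal_papers, field_cites, field_papers):
    """Toy field-normalized indicator: the journal's citations-per-paper
    divided by the field's citations-per-paper. A value above 1.0 means
    the journal is cited more than its field's average."""
    return (citations_per_paper(journal_cites, journal_papers)
            / citations_per_paper(field_cites, field_papers))

# Hypothetical journal: 900 citations to 300 papers, in a field whose
# 5,000 papers drew 10,000 citations (2.0 citations per paper)
print(normalized_impact(900, 300, 10_000, 5_000))  # 1.5
```

The design point this illustrates is why normalized metrics exist at all: a ratio of 3.0 citations per paper is unremarkable in cell biology but exceptional in mathematics, and dividing by the field baseline makes the two comparable.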
TABLE 46.4: Comparison of the Top 10 Journals by JIF, CiteScore, SNIP, SJR, FCR and Google h5-index
TABLE 46.4: COMPARISON OF TOP 10 JOURNAL JIF, Citescore, SNIP, SIR, FCR and Google This table compares current rankings of journals across six sources, regardless of their subject categories. JCR is subscription only, uses journals from SCI and SSCI and calculates a ratio of citations comparing citations to citable documents.- articles and reviews. The methodology is size independent and having more documents and citations does not lead to a higher ranking. CiteScore uses a similar methodology and includes book chapters, conference papers and data papers. It uses a different time period than JIF. SNIP and SJR use the same dataset as Citescore. SNIP uses a similar ratio but normalizes based on subject. SJR uses the same size independent approach and also considers the prestige of the citing journal. It has its own H-Index. FCR, Field Citation Ratio, is normalized for subject and age of the article and requires a subscription. Google Metrics compiled by a computer program. RANKS SOURCE TITLE JIF CiteScore SNIP SJR FCR h5-index TOP TOP JCR Scopus CWTS Scimago Dimensions Google 10 20 CA:A Cancer Journal for Clinicians (1) 1 1 1 1 6 5 5 New England Journal of Medicine (2) 2 13 8 19 1 2 4 6 Nature Reviews Materials (1) 3 3 5 3 4 16 in subc 5 5 Nature Reviews Drug Discovery 4 53 22 156 5 1 in subc 2 2 The Lancet 5 7 4 29 44 4 4 4 Nature Reviews Molecular Cell Biology 6 8 20 5 15 67 3 5 Nature Reviews Clinical Oncology (3) 7 44 71 72 249 13 in subc 1 1 Nature Reviews Cancer 8 10 18 10 9 90 4 5 Chemical Reviews (2) 9 4 9 12 11 9 4 6 Nature Energy