Definitions of Predatory Publishing (e.g., Clark and Smith, 2015; Grudniewicz et al., 2019)


The concept of predatory publishing is complex and often controversial. There are numerous, often similar, definitions of predatory publishing (e.g., Clark and Smith, 2015; Grudniewicz et al., 2019). However, these definitions are broad and rely on subjective, normative judgments about best practices and professional ethics in academic publishing. In part due to the subjectivity involved in many definitions of predatory publishing, debates about it are often highly contentious. Different scholars and academic stakeholders may hold different philosophies and beliefs regarding the appropriateness and legitimacy of various publishing practices. Our research provides empirical data on non-indexed publishers to enable scholars and academic stakeholders to make informed decisions about the legitimacy (or lack thereof) of journals and publishers, regardless of professional preferences and philosophies.

The opacity of peer review in most academic journals makes it difficult to observe the legitimacy and quality of peer review directly. Although manuscript development and gatekeeping processes vary between journals, it is often difficult to know what happens in the "black box" of peer review. This challenge is compounded with predatory or questionable academic journals, as such publishers often operate in covert and/or deceptive ways. Further, predatory journals and publishers are rarely indexed by institutions like the Web of Science, which makes surveilling and analyzing such journals difficult. However, document data and metadata can provide empirical evidence about the professionalism and operating procedures of questionable academic journals. Using a variety of web scraping techniques, we developed a database of non-indexed academic publishers, which includes numerous publishers on the Cabells Predatory Reports list.
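The paper does not describe its scraping pipeline. As a minimal sketch of one common approach, many journal article pages expose Highwire-style `citation_*` meta tags (the same tags Google Scholar indexes), which can be harvested with Python's standard-library HTML parser. The page content below is a toy example; a real pipeline would fetch each article page over HTTP.

```python
from html.parser import HTMLParser

class CitationMetaParser(HTMLParser):
    """Collects Highwire-style <meta name="citation_*"> tags,
    which many journal article pages expose for indexing."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        name = attrs.get("name", "")
        if name.startswith("citation_"):
            self.meta[name] = attrs.get("content", "")

# A toy article page; real metadata would be fetched from a journal site.
page = """
<html><head>
<meta name="citation_title" content="An Example Article">
<meta name="citation_journal_title" content="Journal of Examples">
<meta name="citation_date" content="2020/01/15">
</head><body></body></html>
"""

parser = CitationMetaParser()
parser.feed(page)
print(parser.meta["citation_title"])  # An Example Article
print(parser.meta["citation_date"])   # 2020/01/15
```

Parsed this way, each article page yields a small metadata record (title, journal, dates) that can be accumulated into a database table.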
This database, named the Lacuna Database, currently comprises over 1,000,000 articles and chronicles the scholarly work published in non-indexed journals. Recently, scholars have used computational analyses of large-scale data to reveal fraud, misinformation and questionable research practices in academic publishing (Bik, Casadevall and Fang, 2016; Petersen, 2019; Bishop, 2020). Our research follows this work. Using the Lacuna Database, we conducted empirical analyses of metadata from the most prominent publishers included on the Cabells Predatory Reports list. For additional comparison, we also included three prominent, but often controversial, Open Access-only publishers that are not on the Cabells list, yet have historically faced charges of predation or illegitimacy, fairly or not: Frontiers, Hindawi and MDPI. Siler (2020) labelled these three publishers "grey" publishers of uncertain and/or contested status. The metadata included in the Lacuna Database can be used to empirically identify characteristics of publishers widely seen as predatory, or merely borderline.

Figure 1 – Publications and Peer Review Durations of Various Open Access Publishers

Figure 1 shows annual growth levels of publishers of varying degrees of status and legitimacy. Notably, MDPI and Frontiers exhibited exponential growth in the 2010s, while Hindawi has kept publication volumes relatively constant since 2013. The left side of the figure shows publisher strategies regarding growth. The right side shows contrasts in average time from submission to publication across publishers. There is considerable variation in the time publishers take to develop, gatekeep and publish articles. Three publishers (Hindawi, Frontiers, Academic Journals) cluster around 110-130 days, while the remainder cluster around 30-60 days.
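The Lacuna Database's schema is not public; assuming each article record carries submission and publication dates (the record layout and publisher names below are hypothetical), per-publisher average submission-to-publication durations of the kind plotted in Figure 1 could be computed as:

```python
from datetime import date
from collections import defaultdict

# Hypothetical records; the actual Lacuna Database schema is not described.
articles = [
    {"publisher": "PubA", "submitted": date(2019, 1, 1), "published": date(2019, 2, 10)},
    {"publisher": "PubA", "submitted": date(2019, 3, 5), "published": date(2019, 4, 4)},
    {"publisher": "PubB", "submitted": date(2019, 1, 1), "published": date(2019, 5, 1)},
]

# Group submission-to-publication durations (in days) by publisher.
durations = defaultdict(list)
for a in articles:
    durations[a["publisher"]].append((a["published"] - a["submitted"]).days)

# Average duration per publisher.
avg_days = {pub: sum(ds) / len(ds) for pub, ds in durations.items()}
print(avg_days)  # {'PubA': 35.0, 'PubB': 120.0}
```

Aggregating such per-article durations by publisher and year is enough to reproduce the kind of speed comparison the figure describes.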
As a comparison point to the publishers visualized in Figure 1, submission-to-publication times are 162-178 days at PLOS ONE (PLOS, 2021). Publication speed does not necessarily determine the quality of a journal or its peer review processes. However, both excessively fast and excessively slow publication times can be problematic, albeit for different reasons. For example, concerns have emerged regarding expedited peer review in COVID-19 research (Else, 2020). Many questionable or predatory publishers compete on speed; very rapid peer review attracts submissions for predatory publishers. Professional standards regarding optimal or acceptable peer review times are normative. Our research provides data to inform those normative decisions. For example, is MDPI's publication speed, which is similar to that of clearly predatory publishers such as OMICS, a well-earned competitive advantage achieved via technical and logistical innovations, or is it indicative of haphazard peer review? Is the gradual slowing of Frontiers manuscripts over time indicative of increasingly arduous peer review, or a negative consequence of the publisher's exponential growth? The Lacuna Database will provide data to allow scholars and academic stakeholders to make more informed decisions about different academic publishers.

These analyses represent the tip of the proverbial iceberg for the Lacuna Database. In the future, we will analyze the demographics of publishing scholars (country, institutional status) in varying questionable journals. Textual analyses can also provide empirical evidence of publishing quality and legitimacy. Misspellings and typos in affiliations (e.g., countries, universities), as well as topical and textual properties of manuscripts, can provide empirical indices of publishing quality and legitimacy.
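The paper does not specify how affiliation misspellings would be detected. One simple sketch, using fuzzy string matching from Python's standard library against a reference list of country names (the short list and the cutoff value here are illustrative assumptions; a real analysis would use a full gazetteer and tuned thresholds):

```python
import difflib

# Hypothetical reference list; a real analysis would use a full gazetteer.
known_countries = ["Nigeria", "India", "United States", "Germany", "Pakistan"]

def flag_affiliation(country, cutoff=0.8):
    """Return the closest known country if `country` looks like a
    near-miss misspelling; None if it is spelled correctly or
    resembles nothing in the reference list."""
    if country in known_countries:
        return None  # exact match: spelled correctly
    matches = difflib.get_close_matches(country, known_countries,
                                        n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(flag_affiliation("Nigria"))    # Nigeria  (likely typo)
print(flag_affiliation("India"))     # None     (correct)
print(flag_affiliation("Atlantis"))  # None     (no close match)
```

Counting such near-miss affiliations per journal would yield one crude but empirical index of editorial and production quality.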
A philosophy underpinning the Lacuna Database is that "sunshine is the best disinfectant." Empirical analyses of questionable publishers can expose poor professional practices, but can also 'exonerate' publishers and journals unfairly under suspicion. The first version of the Lacuna Database is expected to be made public in late 2020, to enable researchers and academic stakeholders to conduct large-scale research on questionable publishers.

WORKS CITED

Bik, Elisabeth M., Arturo Casadevall, and Ferric C. Fang. 2016. "The Prevalence of Inappropriate Image Duplication in Biomedical Research Publications." mBio, 7(3): e00809-16.

Bishop, Dorothy. 2020. "Percent by most prolific author score: a red flag for possible editorial bias." http://deevybee.blogspot.com/2020/07/percent-by-most-prolific-author-score.html

Clark, Jocalyn and Richard Smith. 2015. "Firm action needed on predatory journals." The BMJ, 350: h210.

Else, Holly. 2020. "How a torrent of COVID science changed research publishing – in seven charts." https://www.nature.com/articles/d41586-020-03564-y

Grudniewicz, Agnes et al. 2019. "Predatory journals: no definition, no defence." Nature, 576: 210-212.

Petersen, Alexander M. 2019. "Megajournal mismanagement: Manuscript decision bias and anomalous editor activity at PLOS ONE." Journal of Informetrics, 13(4): 1-22.

PLOS. 2021. "Journal Information." https://journals.plos.org/plosone/s/journal-information

Sciencemag. 2021. "Journal Metrics Overview." https://www.sciencemag.org/journal-metrics

Siler, Kyle. 2020. "Demarcating Spectrums of Predatory Publishing: Economic and Institutional Sources of Academic Legitimacy." JASIST, 71(11): 1386-1401.