SCOPUS vs. Web of Science


Introduction

Once you have retrieved articles through a global search, you will need to consult one of the two major search engines devoted to academic and scientific articles. Both SCOPUS and Web of Science are large, subscription-based databases that allow users to search their indexed content. Each database supports extensive searches across various domains of science and academic literature, and both include impact, productivity, and citation algorithms that measure an article's or author's impact on its field. The two databases calculate their metrics differently, which should be taken into consideration (Table 1).

Both systems rely on quartiles (Q): bands of serial titles that have been grouped together because they occupy a similar position within their subject categories. The four quartiles are denominated Q1, Q2, Q3, and Q4. Percentiles indicate the relative standing of a serial title in its subject field: each subject field is divided into 100 equal-sized percentiles based on the number of serial titles, and a serial title is assigned to a percentile based on its CiteScore.

Dating back to 1900, Web of Science (abbreviated as WoS), formerly the Web of Knowledge, is an online citation index service produced by the Institute for Scientific Information (ISI) and now maintained by Clarivate Analytics. In 2014, it was reported that Web of Science's databases host over 50,000 scholarly books, 12,000 journals, and 160,000 conference proceedings. Web of Science specializes in the sciences, social sciences, arts, and humanities. Annually, Clarivate publishes the Journal Citation Reports, which provide analytic information on the academic performance of its journals through the Journal Impact Factor (JIF), the Eigenfactor, and citation indexes. The JIF is an important measure of a journal's performance.
It is calculated via the equation below:

JIF = (total citations in a given year received for the items published in the past two years) / (total number of citable items produced in the past two years)

For example, if a journal reports a JIF of 5 in 2019, this means that, on average, each article the journal published in 2017 and 2018 received five citations during 2019.

SCOPUS, on the other hand, is a database of peer-reviewed literature launched in 2004 by Elsevier. SCOPUS partners with over 5,000 publishers, hosts over 22,000 serial titles and 150,000 books, and holds 1.4 billion cited references dating back to 1966. A simple comparison shows that the database of SCOPUS is much more extensive than that of WoS. All SCOPUS journals are reviewed annually to measure quality standards. There are three quality indexes: the CiteScore, the SCImago Journal Rank (SJR), and the Source Normalized Impact per Paper (SNIP). SCOPUS also provides citations and indexes for conference proceedings.

CiteScore plays the same role as the JIF, with one difference: the former evaluates a journal's performance over the past three years, while the latter measures the impact of a journal over the past two years. For that reason, these two evaluation parameters are not directly comparable. Nonetheless, both show the importance and impact of a journal in its field.

CiteScore = (total citations in a given year received for the items published in the past three years) / (total number of documents produced in the past three years)

Searching Through Web of Science

To get the most out of searching through the Web of Science database, consider creating an account so that you can save searches and citations for later consultation. The Tecnologico de Monterrey provides faculty and students with access to the Web of Science through the institutional login. The next step in accessing the Web of Science is through the library's website.
As you take this step, consider using Firefox or Chrome as your web browser, because the library site is designed for these browsers.

Go to Biblioteca Tec by typing https://biblioteca.tec.mx/inicio into your browser's address bar. This will lead you to the BiblioXplora main page (Figure 1).

Figure 1: Biblioteca search bar

Click the "Buscar" drop-down menu (Figure 2) and then the "Bases de Datos" tab (Figure 3).

Figure 2: Buscar tab
Figure 3: Web of Science through Biblioteca Tec

In the search box, type "Web of Science." Keep in mind that this page is typically not accessible from outside an institution that subscribes to the Web of Science. Click "Ir" ("Go") next to the search box (Figure 4).

Figure 4: Accessing Web of Science through Biblioteca Tec

This will lead you to your Tec de Monterrey login page (Figure 5). On this page, enter your Tec credentials, such as your password and PIN, and click "Enviar."

Figure 5: Biblioteca login

You will then be guided directly to the Web of Science website (Figure 6).

Figure 6: Web of Science search

Click the "Journal Citation Reports" tab at the top of the webpage (Figure 7). Type the journal name into the Web of Science search bar and click the "Search" button. A new tab will open with a list of matching journals.

Figure 7: Journal Citation Reports, Web of Science

However, if you do not find the journal on your first attempt, do not be disappointed. Sometimes journal titles differ slightly between databases. For example, Biosensors and Bioelectronics is one of the most influential journals in the field of biosensing, yet searching "Biosensors and Bioelectronics" returns no results on the Web of Science platform.
Searching "Biosensors & Bioelectronics," however, does find the journal.

Worked Example – Web of Science

Consider the example of searching for the journal titled Computers in Human Behavior. Follow the previous steps: log in through the Tec login at Biblioteca and reach the Web of Science search page. In the search term box, type in the name of the journal (Figure 8).

Figure 8: "Computers in Human Behavior" entered into the Web of Science search bar

The results should take you to the Web of Science metrics page for the journal, as can be seen in Figure 9. This page provides a range of information, including the ISSN, the publishing country, the publisher, and the year the journal was established.

Figure 9: Results for Computers in Human Behavior from the Web of Science metrics page

As you scroll down, the first graphic shows the JIF as an absolute value and the percentile score over the last five years. Computers in Human Behavior had a JIF of 3.536 in 2017 (Figure 10).

Figure 10: Journal Impact Factor for Computers in Human Behavior, Web of Science

Scrolling down further shows how the JIF is calculated: the number of citations made in a given year (2017 in this example) to articles the journal published in the two years prior (2015 and 2016) is divided by the number of citable documents the journal published in those two years (Figure 11).

Figure 11: Method for JIF calculation, Web of Science

The JIF has great significance in Web of Science indexing. The absolute values of the JIF are reported as percentiles to show how a particular journal performs in comparison to other journals in the same academic field. Computers in Human Behavior is ranked in the 92nd percentile for Experimental Psychology and in the 89th percentile for Multidisciplinary Psychology.
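The JIF arithmetic described above can be sketched in a few lines of Python. The citation and document counts below are hypothetical (they are not the journal's real figures); they were chosen only so the result lands near the 3.536 reported for 2017.

```python
def journal_impact_factor(citations, citable_items):
    """JIF for year Y: citations received in Y to items published in
    years Y-1 and Y-2, divided by the citable items from those years."""
    return citations / citable_items

# Hypothetical counts for illustration only:
citations_2017 = 14_160   # 2017 citations to articles from 2015-2016
items_2015_2016 = 4_005   # citable items published in 2015 and 2016

print(round(journal_impact_factor(citations_2017, items_2015_2016), 3))
# prints 3.536
```

The same function computes a CiteScore if you instead pass citation and document counts aggregated over a three-year window.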
Interpreting Data

The JIF is the Web of Science method for organizing journal quartiles, or ranks. Quartiles divide the entire spectrum of percentile ranks into four parts: Q1 covers the 100th–76th percentiles, Q2 the 75th–51st, Q3 the 50th–26th, and Q4 the 25th–1st.

Algorithm indexes show the most cited articles from the journals. You can see the most cited articles and the distribution of citations by selecting the "All Years" tab (Figure 12). This provides a detailed table of various indicators, including Total Cites, the Journal Impact Factor, the 5-Year Impact Factor, and the Eigenfactor Score.

Figure 12: Detailed indicators, Computers in Human Behavior

When researching emerging fields, it is crucial to find articles from journals that are growing in influence. The most important data here are Total Cites, the Journal Impact Factor, the 5-Year Impact Factor, and the Eigenfactor Score. Total Cites shows the number of citations the journal received in a given year. These data show that Computers in Human Behavior has had a positive trend and has been consistently improving, as also shown in Figure 10.

Check whether your chosen article is from a Q1 or Q2 journal. If so, keep the article for further consideration. If the journal is Q3 or Q4, consider seriously whether the citation has enough relevance and timeliness to be included in your final literature review. Use this quartile methodology for all the article citations you gathered from the Global Search section.
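The quartile bands described above amount to a simple percentile-to-quartile mapping, sketched below (the function name and boundaries are an illustration of the scheme, not an official API):

```python
def quartile(percentile):
    """Map a journal's percentile rank (1-100) to its quartile band:
    Q1 = 100th-76th, Q2 = 75th-51st, Q3 = 50th-26th, Q4 = 25th-1st."""
    if not 1 <= percentile <= 100:
        raise ValueError("percentile must be between 1 and 100")
    if percentile >= 76:
        return "Q1"
    if percentile >= 51:
        return "Q2"
    if percentile >= 26:
        return "Q3"
    return "Q4"

# Computers in Human Behavior from the worked example above:
print(quartile(92))  # Experimental Psychology -> Q1
print(quartile(89))  # Multidisciplinary Psychology -> Q1
```

Both of the journal's percentile ranks fall in Q1, so by the rule above its articles would be kept for further consideration.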