2017 Metrics Tools - Current Strengths and Issues


Author evaluation

h-index
Strengths:
● Combines a measure of quantity (publications) and impact (citations)
● Provides high objectivity
● Easily obtained
● Measures the impact of the researcher rather than of any particular publication
Known Issues:
● Differs between fields of research – may be more useful for Science/Engineering, less suitable for Education and the Social Sciences
● Depends on the duration of the researcher's career
● Does not highlight highly cited papers
● Difficult to obtain for complete output including all material types
● Different sources may give different values, e.g. Google Scholar vs Web of Science
● Authors need to check that the publications counted belong to them

i10-index
Strengths:
● Simple to calculate
Known Issues:
● Used only in Google Scholar
● Authors need to check that the publications counted belong to them

Publication count
Strengths:
● Easily obtained – most databases and Google Scholar give publication counts
Known Issues:
● Does not distinguish between different document types or the duration of the publication process
● Dependent on the material types indexed by particular databases
● Authors need to check that the publications counted belong to them

Citation count
Strengths:
● Easily obtained – most databases and Google Scholar give citation counts
Known Issues:
● Will vary between databases
● Depends on the duration of the researcher's career
● Depends on the field of research
● Doesn't benefit emerging and specialised fields of research
● Numbers can be skewed by self-citations
● Authors need to check that the publications counted belong to them

Journal evaluation

Journal Impact Factor (JIF)/CiteScore
Strengths:
● Standard, widely accepted measures
● Easy-to-understand ranking system
● Journal Impact Factors/CiteScore are becoming more important to the PBRF process
Known Issues:
● Journal Impact Factor/CiteScore values should not be used to compare journals across disciplines
● Discipline specific
● Open to manipulation

Percentile/Quartile Rankings
Strengths:
● Simple to understand
● Easy to identify the top-ranked journals in a discipline
Known Issues:
● Used with Journal Impact Factor-ranked journals included in the Journal Citation Reports database
● Used mostly for Science

SJR (SCImago Journal & Country Rank)
Strengths:
● Provides an alternative to the Journal Impact Factor/CiteScore; similar to Eigenfactor but based on the Scopus database
● Giving different weight to citations from different journals reflects the hierarchy of journals
Known Issues:
● Citations from more established journals are valued more highly than those from lesser-known journals
● Conservative ranking system; favours the status quo
● Discipline specific
● Only peer-reviewed articles are counted

SNIP (Source Normalized Impact per Paper)
Strengths:
● Elsevier's impact factor, based on Scopus
● Corrects for differences between disciplines
Known Issues:
● Dependent on the content of Scopus
● Doesn't differentiate between the prestige of citing journals
● Only peer-reviewed articles are counted

Quality Ranking
Strengths:
● Easy to understand and use
● Often adopted by organizations as an official research evaluation tool
Known Issues:
● Subject specific; not easy to compare journals from different disciplines
● Lists may reflect organizational, geographic, cultural or compiler bias

Eigenfactor
Strengths:
● More robust than Journal Impact Factor/CiteScore metrics
● Based on the Web of Science database, which indexes the most academic journals
● Easy to find and use
● Citations may include more than articles, e.g. editorials
Known Issues:
● Not widely used
● Global rankings are tending to move away from Web of Science
● Not easy to compare journals from different disciplines
● Large journals will have high Eigenfactor scores based on size alone

Altmetrics

General Strengths:
● Provide a broader measurement of research impact in both the social and scholarly realms
● Can be used to augment traditional metrics
● Gauge the impact of research before it enters the citation cycle
● Accumulate faster than traditional citation counts
● Measure the impact of different types of scholarly or creative outputs (e.g. datasets, visual arts, software, presentations)
General Known Issues:
● Lack of consistent adoption across disciplines and institutions
● Lack of agreement about which metrics, data sources or tools are most valuable
● Can be easily distorted or misinterpreted
● Measurement reflects all attention paid to an output, whether positive or negative – may not reflect scholarly quality or impact

UoW Research Commons
Strengths:
● Items are publicly available and accessible via internet search engines
● Numbers of file downloads for items, authors and collections, and top views by city and country, are readily accessible
Known Issues:
● Doesn't contain all institutional output
● May not be as widely searched/used by researchers as other databases and services

ImpactStory
Strengths:
● Tracks a wide range of research output types
● Aggregates impact data from many different sources
● Calculates the impact of research outputs using DOI, PMID or arXiv ID as tracking IDs
Known Issues:
● Interpreting reports can be challenging owing to the high number of metrics displayed and the lack of context and comparison
● Experience with social media is helpful for interpreting the report
● Relies on resource identification standards for compiling the reports – a lack of standards is a core problem

Figshare
Strengths:
● Collaborates with Symplectic and all of the altmetric solutions, and also with Facebook, Google+, Twitter, and Vimeo
● Partners with publishers such as Nature, PLOS and F1000 (life sciences, clinical research)
● Uploaded research is citable and trackable via a DOI
● Detailed reporting metrics are available for the institution
Known Issues:
● Uploading a dataset to figshare can result in numerous DOIs, since both the dataset and the individual files are assigned DOIs

ResearchGate
Strengths:
● Can be used in conjunction with other sites such as Academia.edu, Mendeley, Google Scholar or figshare
● Uploaded full-text publications are indexed by Google
● Copies of papers (either pre- or post-review) and associated raw data can be uploaded; all are searchable
● Collates useful information about journals, such as impact factors, metrics and some details of open access policy
● Provides an RG Score based on how other researchers interact with your content, how often, and who they are
Known Issues:
● Care needs to be taken with copyright, particularly in relation to published versions of articles; check first on SHERPA/RoMEO
● Many of the publications available through ResearchGate are uploaded illegally in terms of publisher open access policy
● A high percentage of ResearchGate members are postgraduate and other students
● There have been complaints about unwanted email spamming

Academia.edu
Strengths:
● Social networking platform which provides …
Known Issues:
● Enhanced service to researcher's …
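Both author-level indices above can be computed directly from a list of per-publication citation counts: the h-index is the largest h such that h publications each have at least h citations, and the i10-index (Google Scholar) simply counts publications with at least 10 citations. A minimal sketch, with illustrative numbers and function names of our own choosing:

```python
def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank          # rank publications so far each have >= rank citations
        else:
            break
    return h

def i10_index(citations):
    """Number of publications with at least 10 citations (Google Scholar's i10)."""
    return sum(1 for cites in citations if cites >= 10)

# Hypothetical citation counts for one author's eight publications:
counts = [33, 30, 20, 15, 7, 6, 5, 4]
print(h_index(counts))    # 6 (six publications with at least 6 citations each)
print(i10_index(counts))  # 4 (four publications with 10+ citations)
```

The sketch also makes one of the table's "known issues" concrete: because h only grows as more papers accumulate citations, it inevitably depends on the length of the researcher's career.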
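For the journal-level measures, the Journal Impact Factor is at its core a two-year ratio: citations received in a given year to a journal's content from the previous two years, divided by the number of citable items published in those two years (CiteScore uses a similar ratio over a different window). A sketch of the two-year calculation, using made-up figures:

```python
def two_year_impact_ratio(cites_by_pub_year, items_by_pub_year, year):
    """Impact-factor-style ratio for `year`: citations received in `year`
    to items published in the two preceding years, divided by the number
    of citable items published in those same years."""
    cites = cites_by_pub_year[year - 1] + cites_by_pub_year[year - 2]
    items = items_by_pub_year[year - 1] + items_by_pub_year[year - 2]
    return cites / items

# Illustrative numbers only, for a hypothetical journal:
cites = {2015: 150, 2016: 210}  # citations received in 2017, by publication year
items = {2015: 60, 2016: 60}    # citable items published in each year
print(two_year_impact_ratio(cites, items, 2017))  # 3.0
```

Because both the numerator and the denominator are discipline-dependent (citation habits and citable-item counts vary widely between fields), the ratio cannot meaningfully compare journals across disciplines, as the table notes.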