2017 Metrics Tools - Current Strengths and Issues

Author evaluation

h-index

Strengths:
● Combines a measure of quantity (publications) and impact (citations)
● Provides high objectivity
● Easily obtained
● Measures the impact of the researcher rather than of any particular publication

Known Issues:
● Differs between fields of research - may be more useful for Science/Engineering, less suitable for Education and the Social Sciences
● Depends on the duration of the researcher's career
● Does not highlight highly cited papers
● Difficult to obtain for complete output including all material types
● Different sources (e.g. Google Scholar, Scopus, Web of Science) may give different values
● Authors need to check that the publications counted belong to them
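The calculation itself is straightforward: the h-index is the largest h such that the author has h publications with at least h citations each. Below is a minimal Python sketch over a hypothetical list of per-paper citation counts; it also computes the i10-index (papers with at least 10 citations) covered in the next entry.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank      # the top `rank` papers all have >= rank citations
        else:
            break
    return h

def i10_index(citations):
    """Number of papers with at least 10 citations (Google Scholar's i10)."""
    return sum(1 for c in citations if c >= 10)

# Hypothetical per-paper citation counts for one author
papers = [44, 30, 22, 15, 12, 9, 6, 3, 1, 0]
print(h_index(papers))    # 6 - six papers have at least 6 citations each
print(i10_index(papers))  # 5 - five papers have at least 10 citations
```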

i10-index

Strengths:
● Simple to calculate (see the sketch following the h-index entry above)

Known Issues:
● Used only in Google Scholar
● Authors need to check that the publications counted belong to them

Publication count

Strengths:
● Easily obtained - most databases and Google Scholar give publication counts

Known Issues:
● Does not distinguish between different document types, nor account for the duration of the publication process
● Dependent on the material types indexed by particular databases
● Authors need to check that the publications counted belong to them

Citation count

Strengths:
● Easily obtained - most databases and Google Scholar give citation counts

Known Issues:
● Will vary between databases
● Depends on the duration of the researcher's career
● Depends on the field of research
● Doesn't benefit emerging and specialized fields of research
● Numbers can be skewed by self-citations
● Authors need to check that the publications counted belong to them

Journal evaluation

Journal Impact Factor (JIF)/CiteScore

Strengths:
● Standard, widely accepted measure
● Easy-to-understand ranking system
● Journal Impact Factors/CiteScore values are becoming more important to the PBRF process
(A worked sketch of the two-year JIF formula follows the SNIP entry below.)

Known Issues:
● Journal Impact Factor/CiteScore values should not be used to compare journals across disciplines
● Discipline specific
● Open to manipulation

Percentile/Quartile Rankings

Strengths:
● Simple to understand
● Easy to identify the top-ranked journals in a discipline

Known Issues:
● Only applies to Journal Impact Factor-ranked journals included in the Journal Citation Reports database
● Used mostly for Science

SJR (SCImago Journal & Country Rank)

Strengths:
● Provides an alternative to the Journal Impact Factor/CiteScore - similar in concept, but based on the Scopus database
● Giving different weight to citations from different journals reflects the hierarchy of journals

Known Issues:
● Citations from more established journals are treated as more valuable than those from lesser-known journals
● Conservative ranking system; favours the status quo
● Discipline specific
● Only peer-reviewed articles are counted

SNIP - Source Normalized Impact per Paper

Strengths:
● Based on the Scopus database
● Corrects for differences between disciplines

Known Issues:
● Dependent on the content of Scopus
● Doesn't differentiate between the prestige of citing journals
● Only peer-reviewed articles are counted
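Returning to the Journal Impact Factor entry above: the standard two-year JIF for year Y is the number of citations received in Y by items the journal published in Y-1 and Y-2, divided by the number of citable items it published in those two years (CiteScore applies the same ratio pattern to Scopus data over a different window and document set). A minimal sketch with hypothetical numbers:

```python
def two_year_impact_factor(cites_to_prev_two_years, citable_items_prev_two_years):
    """JIF for year Y: citations received in Y to items published in
    Y-1 and Y-2, divided by citable items published in Y-1 and Y-2."""
    return cites_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 210 citations in 2017 to its 2015-2016 output
# of 120 citable articles and reviews.
print(two_year_impact_factor(210, 120))  # 1.75
```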

Journal Quality Rankings

Strengths:
● Easy to understand and use
● Often adopted by organizations as an official research evaluation tool

Known Issues:
● Subject specific, and not easy to compare journals from different disciplines
● Lists may reflect organizational, geographic, cultural or compiler bias

Eigenfactor

Strengths:
● More robust than Journal Impact Factor/CiteScore metrics
● Based on the Web of Science database, which indexes a very large number of academic journals
● Easy to find and use
● Citations may include more than articles, e.g. editorials

Known Issues:
● Not widely used
● Global rankings are tending to move away from Web of Science
● Not easy to compare journals from different disciplines
● Large journals will have high Eigenfactor scores based on size alone
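To make the "more robust" claim concrete: Eigenfactor scores derive from an eigenvector (PageRank-style) calculation over the journal cross-citation network, so citations from influential journals count for more. The sketch below is a deliberately simplified illustration of that idea using a hypothetical three-journal citation matrix; the real metric also uses a five-year window and additional per-article normalization not shown here.

```python
import numpy as np

# Hypothetical cross-citation matrix: C[i, j] = citations from journal j's
# articles to journal i. The diagonal is zero because Eigenfactor excludes
# self-citations.
C = np.array([[0, 8, 2],
              [5, 0, 6],
              [1, 3, 0]], dtype=float)

# Column-normalize so each citing journal distributes one unit of influence.
M = C / C.sum(axis=0)

# Damped power iteration, PageRank-style.
alpha = 0.85
v = np.full(3, 1 / 3)
for _ in range(100):
    v = alpha * M @ v + (1 - alpha) / 3
v /= v.sum()

print(np.round(v, 3))  # steady-state influence share of each journal
```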

Altmetrics

General Strengths:
● Provide a broader measurement of research impact in both the social and scholarly realms
● Can be used to augment traditional metrics
● Gauge the impact of research before it enters the citation cycle
● Accumulate faster than traditional citation counts
● Measure the impact of different types of scholarly or creative outputs (e.g. datasets, visual arts, software, presentations)

General Known Issues:
● Lack of consistent adoption across disciplines and institutions
● Lack of agreement about which metrics, data sources or tools are most valuable
● Can be easily distorted or misinterpreted
● Measurement reflects all attention paid to an output, whether positive or negative - it may not reflect scholarly quality or impact

UoW Research Commons

Strengths:
● Items are publicly available and accessible via internet search engines
● Numbers of file downloads for items, authors and collections, and top views by city and country, are readily accessible

Known Issues:
● Doesn't contain all institutional output
● May not be as widely searched/used by researchers as other databases and services

ImpactStory

Strengths:
● Tracks a wide range of research output types
● Aggregates impact data from many different sources
● Calculates the impact of research outputs using DOIs, PMIDs or arXiv IDs as tracking identifiers

Known Issues:
● Interpreting reports can be challenging owing to the high number of metrics displayed and the lack of context and comparison
● Experience with social media is helpful for interpreting the report
● Relies on resource identification standards for compiling the reports - a lack of such standards is a core problem
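To illustrate the kind of identifier-based tracking ImpactStory performs, the sketch below queries the public Crossref REST API for a DOI's metadata and its Crossref-indexed citation count. The endpoint is Crossref's documented /works route; the DOI shown is only a placeholder.

```python
import json
import urllib.request

def crossref_citations(doi):
    """Fetch Crossref metadata for a DOI; return the work's title and its
    Crossref-indexed citation count ('is-referenced-by-count')."""
    url = f"https://api.crossref.org/works/{doi}"
    with urllib.request.urlopen(url) as resp:
        record = json.load(resp)["message"]
    title = (record.get("title") or ["(untitled)"])[0]
    return title, record.get("is-referenced-by-count", 0)

# Placeholder DOI - substitute a real one for your own output.
print(crossref_citations("10.1234/example-doi"))
```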

Figshare

Strengths:
● Integrates with Symplectic and with the major altmetrics services, as well as Facebook, Google+, Twitter and Vimeo
● Partners with publishers such as Nature, PLOS and F1000 (life sciences, clinical research)
● Uploaded research is citable and trackable via a DOI
● Detailed reporting metrics are available for the institution

Known Issues:
● Uploading a dataset to figshare can result in numerous DOIs, since both the dataset and the individual files are assigned DOIs (a hedged API sketch follows the ResearchGate entry below)

ResearchGate

Strengths:
● Can be used in conjunction with other sites such as Academia.edu, Mendeley, Google Scholar or figshare
● Uploaded full-text publications are indexed by Google
● Copies of papers (either pre- or post-review) and associated raw data can be uploaded; all are searchable
● Collates useful information about journals, such as impact factors, metrics and some details of open access policy
● Provides an RG Score based on how other researchers interact with your content, how often, and who they are

Known Issues:
● Care needs to be taken with copyright, particularly in relation to published versions of articles; check first on SHERPA/RoMEO
● Many of the publications available through ResearchGate have actually been uploaded illegally in terms of publisher policy
● A high percentage of ResearchGate members are postgraduate and other students
● There have been complaints about unwanted email spamming
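As a hedged illustration of the Figshare DOI behaviour flagged above, the sketch below uses figshare's public v2 REST API to fetch a public article record and list its DOI alongside its file names. The article ID is hypothetical, and the exact field names ('doi', 'files') are assumptions to verify against the current API documentation.

```python
import json
import urllib.request

API = "https://api.figshare.com/v2"

def article_summary(article_id):
    """Fetch a public figshare article; report its DOI and file names."""
    with urllib.request.urlopen(f"{API}/articles/{article_id}") as resp:
        article = json.load(resp)
    # 'doi' and 'files' are assumed field names from the public v2 API.
    file_names = [f["name"] for f in article.get("files", [])]
    return article.get("doi"), file_names

# Hypothetical public article ID - substitute a real one.
print(article_summary(123456))
```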

Academia.edu

Strengths:
● Social networking platform which provides analytics as well as exposure
● Tracks the number of users reading and downloading your articles, and also identifies who is searching for you online

Known Issues:
● An enhanced service for a researcher's personal web page requires payment
● Care needs to be taken with copyright, especially for published versions of articles; check first on SHERPA/RoMEO or seek prior permission from the publisher

Altmetric bookmarklet (Altmetric.com)

Strengths:
● Measures mentions of articles in mainstream media, social media, policy documents, and a number of other places
● The Altmetric Attention Score and donut identify the amount and type of attention a research output has received; the colors of the Altmetric donut each represent a different source of attention
● Integrated within a number of databases and repositories, e.g. Wiley, Scopus, Research Commons, Springer, Nature
● Free for individual researchers to install

Known Issues:
● Twitter mentions are only available for articles published since July 2011
● The bookmarklet only works on PubMed, arXiv, or pages containing a DOI, including publisher sites such as Nature, Springer and selected Elsevier journal titles
● Subscription-only access to the full range of altmetrics reports
● Data are not normalized
● Time-dependent - the lifespan of measured attention is unknown
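The bookmarklet's data can also be retrieved programmatically: Altmetric.com exposes a free, rate-limited public endpoint keyed by DOI, returning JSON whose 'score' field is the Altmetric Attention Score. A minimal sketch (the DOI is a placeholder):

```python
import json
import urllib.error
import urllib.request

def attention_score(doi):
    """Query Altmetric's public DOI endpoint; return the Attention Score,
    or None if the output has no recorded attention (the API returns 404)."""
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    try:
        with urllib.request.urlopen(url) as resp:
            return json.load(resp).get("score")
    except urllib.error.HTTPError:
        return None

# Placeholder DOI - substitute a real one.
print(attention_score("10.1234/example-doi"))
```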