EUROPE’S RESEARCH LIBRARY NETWORK

Scholarly Metrics Recommendations for Research Libraries: Deciphering the Trees in the Forest

TABLE OF CONTENTS

INTRODUCTION
ABOUT LIBER

1. DISCOVERY & DISCOVERABILITY
1A. Provide Contextual Information To Allow the Discovery of Related Work & Users
1B. Exploit Rich Network Structures & Implement Bibliometric Methods To Enable Discovery
1C. Encourage Sharing Of Library Collections Under Open Licenses

2. SHOWCASING ACHIEVEMENTS
2A. Incentivize Researchers To Share Scholarly Works, Promote Achievements Online & Engage With Audiences
2B. Encourage Researchers To Showcase Scientific Contributions & Monitor Impact

3. SERVICE DEVELOPMENT
3A. Join Forces With Stakeholders To Provide Services Reusing Existing Resources, Tools, Methods & Data
3B. Value Various Levels of Engagement; Favour Standardized, Well-Established Practices & Easy-To-Use Tools
3C. Make Full Use Of Open Data Sources; Sustain Not-For-Profit Enterprises With Open Business Models; Openly Share Data, Tools & Services
3D. Learn About Platforms Before Implementing Them In Services & Homogenize Different Sources
3E. Work With Researchers To Build Awareness Of Benefits But Educate About Weaknesses of Scholarly Metrics
3F. Expand Your Perspective When Developing Services; Avoid Single-Purpose Approaches
3G. Teach Colleagues New Skills & Library Goals & Services

4. RESEARCH ASSESSMENT
4A. Establish Appropriate Goals for Assessment Exercises Before Selecting Databases & Metrics; Be Transparent About Use & Interpretation
4B. Use Different Data Sources to Include Various Scholarly Works & Disciplinary Communication & Publication Cultures
4C. Rely on Objective, Independent & Commonly-Accepted Data Sources To Provide Sound & Transparent Scholarly Metrics
4D. Avoid Using Composite Indicators & Conflating Different Aspects Of Scholarly Works & Impact; Don't Lose The Multifaceted Nature of Metrics & Distort Interpretation

CONCLUSION & CALL FOR ACTION
REFERENCES
CREDITS


INTRODUCTION

The digital era is bringing new and exciting changes to scholarly communication. Modern scientific libraries and information infrastructures are obliged to face these challenges in a professional way, and sooner rather than later. The monitoring and execution of policies, the facilitation of Open Access, and support for research data management are but a few examples of adaptation to the digital era. The use of scholarly metrics1 is also an emerging field for academic libraries, brought on by digital change.

To foster this vision, LIBER's Innovative Metrics Working Group2 has set out recommendations on how academic libraries and information infrastructures can deal with scholarly metrics, and how to get started with the development of services to support this.

OPEN SCHOLARLY METRICS

Open scholarly metrics are output and impact indicators whose data are open. We use 'open' in the sense of the Open Definition3 and we apply it to data, metrics, indicators, methods, software, and services. In addition, open data should follow the FAIR Principles.4

When talking about data we mean (bibliographic) metadata, data related to scholarly outputs (e.g., the number of published articles), data related to the impact of scholarly works (e.g., the number of citations), and qualitative information about engagement with scholarly works and other stakeholders of the research enterprise (e.g., demographic information about the users tweeting a scientific article and the tweet containing the DOI of the article).

Open should be the default way of providing scholarly metrics and of developing services and tools. Where openness is not possible (e.g., because of third-party restrictions), transparency should be aimed for.

The recommendations in this report are organised into three levels:

• Initial Steps (circular bullet)
■ Intermediate Steps (square bullet)
▲ Advanced Steps (triangular bullet)

Which recommendations a library adopts will depend on its current level of engagement with scholarly metrics.

The recommendations are grouped into four sections:

1. Discovery and Discoverability
2. Showcasing Achievements
3. Service Development
4. Research Assessment

The order in which the recommendations appear reflects the potential importance they can have for an institution. These two indications of use were developed during the Working Group's workshop at LIBER's 2017 Annual Conference. They are there to assist in prioritizing, but are not mandatory to follow and are not dependent on each other.

Each section covers a set of activities and makes suggestions for libraries which want to promote the transparent, standardized and responsible use of scholarly metrics. As part of LIBER's focus on Open Science, the Working Group has placed a special emphasis on recommendations addressing open scholarly metrics.

1 Following Haustein (2016, p. 416) we define scholarly metrics as "indicators based on recorded events of acts (e.g., viewing, reading, saving, diffusing, mentioning, citing, reusing, modifying) related to scholarly documents (e.g., papers, books, blog posts, datasets, code) or scholarly agents (e.g., researchers, universities, funders, journals)". Concepts such as bibliometrics or altmetrics are subsumed under this definition.
2 https://libereurope.eu/strategy/innovative-scholarly-communication/metrics
3 https://opendefinition.org
4 https://www.force11.org/group/fairgroup/fairprinciples

ABOUT LIBER

LIBER (Ligue des Bibliothèques Européennes de Recherche – Association of European Research Libraries) represents 430 university, national and special libraries in 40 countries, making us Europe's largest research library network.

Our 2018-2022 Strategy, Powering Sustainable Knowledge in the Digital Age, outlines how libraries can prepare themselves for coming changes in the research landscape. It is based on three key focus areas: Innovative Scholarly Publishing, Digital Skills and Services, and Research Infrastructures.

By 2022, we envision a world where:

• Open Access is the predominant form of publishing;
• Research Data is Findable, Accessible, Interoperable and Reusable (FAIR);
• Digital Skills underpin a more open and transparent research life cycle;
• Research Infrastructure is participatory, tailored and scaled to the needs of the diverse disciplines;
• The cultural heritage of tomorrow is built on today's digital information.

OUR NETWORK

Libraries are at the heart of our network. Staff from these libraries directly influence LIBER's work by serving on our Executive Board, Steering Committees and Working Groups.

The Innovative Metrics group which produced this document is one of nine current LIBER Working Groups. The activities undertaken by each group are varied but they share one common aim: to advance Open Science.

Anyone working in a LIBER library is invited both to follow the progress of these groups and to get involved. Participating in a Working Group is a wonderful way to exchange experiences and challenges with your professional peers, while at the same time making a valuable contribution to the wider research library community.

To learn more about participating, simply contact the chair of the group which you would like to join. Contact details are available on LIBER's website.

www.libereurope.eu

1. DISCOVERY & DISCOVERABILITY

Scholarly metrics can be used to discover scholarly works and researchers, and to highlight ways of increasing their discoverability. Context, such as reader demographics, helps when evaluating scholarly works. Context can also aid stakeholders of the scientific enterprise, for example, to address the right target group in articles or to find the right publication outlet for the topic. Metadata and scholarly metrics are rich sources for the provision of context on works, publication venues, institutions and individuals. Moreover, scholarly metrics support knowledge discovery. They can lead to the finding of information not explicitly mentioned in the dataset, such as influential authors in a scientific field. Re-using methods from studies of scholarly metrics can open new search opportunities and help library patrons — researchers, students and university administrators — to become acquainted with these methods.

1A. PROVIDE CONTEXTUAL INFORMATION TO ALLOW THE DISCOVERY OF RELATED WORK & USERS

▲ Track appearances of scholarly works and people from your institution. Link to or provide information on the context. To do this, a full list of scholarly works including bibliographic metadata and persistent identifiers is needed. Depending on whether you are searching for a scholarly document (work), an agent (person or institution) or an event, you may use CrossRef Event Data,4 Web of Science,5 Scopus,6 Google Scholar,7 Google Analytics,8 ORCID,9 Altmetric,10 Plum Analytics,11 a Twitter search query or other social networks (e.g., ResearchGate12 or Mendeley13) to gather such information.

• Add available metadata to the scholarly works and people to provide rich contextual information. There are numerous resources to mine for valid information, such as Sherpa/ROMEO14 for journals or ORCID or Google Citation Profiles15 for authors.

■ Establish links between CRISs and digital repositories and provide interfaces to institutional research information services with shared infrastructures in the field of scholarly communication, such as ORCID. Demonstrate researchers' or institutions' compliance with open science principles by checking for open access versions of their articles via databases such as 1findr16 or Unpaywall17 (see the code sketch below).

■ Process context data and implement it accordingly with tools such as Publish or Perish,18 rOpenSci,19 or Open Knowledge Maps.20 The list is expanding as you read.

• Reveal the relations between scholarly works by using ranking functionalities (by year, times cited, h-index, journal impact factor, usage, or other metrics or formal metadata).

▲ Visualize knowledge flows via citation networks (where does knowledge come from, i.e. the reference list; where does it go, i.e. citations). Use tools like VOSViewer,21 Gephi22 or CitNetExplorer23 to create maps of science through citation networks and enhance them with information from scholarly metrics, just like Open Knowledge Maps does.

■ Reveal collaboration patterns (e.g., on the institutional level) through co-authorship analyses. Take advantage of tools like VOSViewer or Pajek.24

▲ Reflect readership via usage information and show routes to other scholarly works and people also concerned with the topic (often outside of academia). Social media platforms often provide demographic and professional information about their users that adds to the understanding of who is concerned with research results. Depending on the user base, usage information can reflect the societal impact of scholarly works, institutions, disciplines, and researchers. Readership and attention metrics can be found in services like Altmetric, PlumX Metrics and Kudos.25

■ Use platforms that track mentions on social media to show where attention focuses after publication and where to engage in discussions. This context also provides authors of scholarly works with a full view of the scholarly debate around works or people.

■ Use scholarly metrics to aid discovery of research trends and upcoming topics, e.g., via output and/or impact indicators. Take a look at tools such as AIDA – a tool co-developed by TU Delft Library and Leiden University's Centre for Science and Technology Studies – that visualizes and analyses research areas and research trends.26

• Support journal selection for articles via additional information about journals via impact indicators, e.g., readership demographics, impact factor, etc. Tools about rankings, such as JCR27 and SCImago,28 should work together with services that provide journal information or impact indications, such as the Impact Factor, SNIP, SJR, etc.

• Demonstrate the impact of research works outside of academia by linking them with patents and policy documents that include bibliographic information. Google Patents can be a useful resource to start with, but also Dimensions.29

4 CrossRef Event Data, https://www.crossref.org/services/event-data
5 Web of Science, https://clarivate.com/products/web-of-science
6 Scopus, https://www.scopus.com
7 Google Scholar, https://scholar.google.com
8 Google Analytics, https://analytics.google.com
9 ORCID, https://orcid.org
10 Altmetric, https://altmetric.com
11 Plum Analytics, https://plumanalytics.com
12 ResearchGate, https://www.researchgate.net
13 Mendeley, https://www.mendeley.com
14 Sherpa/ROMEO, https://www.sherpa.ac.uk/romeo
15 Google Citations, https://www.scholar.google.com/citations
16 1findr, https://1findr.1science.com
17 Unpaywall, https://unpaywall.org
18 Publish or Perish, https://harzing.com/resources/publish-or-perish
19 rOpenSci, https://ropensci.org
20 Open Knowledge Maps, https://openknowledgemaps.org
21 VOSViewer, http://www.vosviewer.com
22 Gephi, https://gephi.org
23 CitNetExplorer, http://citnetexplorer.nl
24 Pajek, http://mrvar.fdv.uni-lj.si/pajek
25 Kudos, https://www.growkudos.com
26 AIDA Toolbox, http://aida.tudelft.nl/toolbox
27 Journal Citation Reports, https://jcr.incites.thomsonreuters.com
28 SCImago Journal & Country Rank, http://www.scimagojr.com
29 Dimensions, https://dimensions.ai
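To make the open access check mentioned above concrete, the minimal sketch below asks Unpaywall whether an open copy exists for each DOI. It assumes Unpaywall's public REST endpoint (api.unpaywall.org/v2/{doi}); the DOI and contact e-mail address are placeholders to replace with your own.

```python
# Minimal sketch: query the Unpaywall API for open access copies of a DOI list.
import requests

CONTACT_EMAIL = "library@example.org"        # placeholder; Unpaywall asks callers to identify themselves
DOIS = ["10.1371/journal.pbio.1001535"]      # placeholder DOI (Bik & Goldstein 2013, from the reference list)

for doi in DOIS:
    resp = requests.get(f"https://api.unpaywall.org/v2/{doi}",
                        params={"email": CONTACT_EMAIL}, timeout=30)
    resp.raise_for_status()
    record = resp.json()
    best = record.get("best_oa_location") or {}
    status = "open access copy found" if record.get("is_oa") else "no open access copy found"
    print(f"{doi}: {status} {best.get('url', '')}".strip())
```

A workflow like this can run over the institution's full publication list and flag works whose open access status should be checked with the authors.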

1B. EXPLOIT RICH NETWORK STRUCTURES & IMPLEMENT BIBLIOMETRIC METHODS TO ENABLE DISCOVERY OF EXPERTS, POTENTIAL COLLABORATORS & NON-ACADEMIC AUDIENCES

■ Build a database that contains network data (e.g., citations), or use ready-to-go products. The current market provides several options including Web of Science, Scopus, Microsoft Academic Search,30 Google Scholar, and the Initiative for Open Citations.31

• Use your own system to track impact indicators and engagement with library services and your own databases. Use your data to analyse and optimise library-owned services.

■ Analyze network data with bibliographic coupling or co-citation analysis to create clusters of related scholarly works, topics or people. VOSviewer or Sci232 are powerful free tools to do so.

▲ Illustrate the topology of researchers' networks and use them to detect key scholarly works, key people, and communities; key people serve as bridges between communities. You may build these networks easily with NodeXL33 or, if you are more experienced with networks, you may try Gephi and Pajek (a code sketch follows after subsection 1C).

1C. ENCOURAGE SHARING OF LIBRARY COLLECTIONS UNDER OPEN LICENSES TO STRENGTHEN REUSABILITY & INCREASE DISCOVERABILITY OF SCHOLARLY WORKS & RESEARCHERS

• Publish data produced in your library in a standard format on a persistent platform. If possible, publish your data as linked open data. If you do not own a platform to do so, then try Zenodo34 or FigShare.35

• Choose a license complying with the Open Definition when publishing data and allow for accessing these datasets (e.g., provide persistent identifiers and support various types of APIs). Learn more about Open Data36 and engage in discussions with the community.

• Increase the usability of library data. Provide thorough documentation on how the data was collected, what it contains and what not, definitions of fields, possible use cases, etc.

• Make your dataset referable by adding a persistent identifier from the plethora of options you have, from DOIs and Handles to URNs and DataCite.37

■ Increase the visibility of the dataset by linking it with existing resources, sharing it via Wikidata or Wikimedia, etc. The GLAM-Wiki project of Wikimedia38 supports GLAMs and other institutions in producing open-access, freely-reusable content for the public.

▲ Demonstrate the reusability and value of data via training sessions or open hack events. Include the local community and promote success stories of reusability.

• Open your library data to enable service development which can later be reused across libraries.

30 Microsoft Academic Search, https://www.microsoft.com/en-us/research/project/microsoft-academic-graph
31 Initiative for Open Citations, https://i4oc.org
32 Sci2, https://sci2.cns.iu.edu
33 NodeXL, https://www.smrfoundation.org/nodexl
34 Zenodo, https://zenodo.org
35 FigShare, https://figshare.com
36 As described in the Open Data Handbook
37 DataCite, https://datacite.org
38 GLAM-Wiki, https://outreach.wikimedia.org/wiki/GLAM
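Returning to recommendation 1B, the sketch below shows the kind of network analysis the tools named there automate: it builds a co-authorship graph from author lists, detects communities, and uses betweenness centrality to surface people who bridge them. The author lists are invented for illustration and the networkx package is assumed; real input would come from your CRIS, repository or a bibliographic database.

```python
# Sketch: co-authorship network, communities, and bridging authors (networkx assumed).
import itertools
import networkx as nx
from networkx.algorithms import community

papers = [                     # invented author lists for illustration
    ["Alice", "Bob"],
    ["Alice", "Carol"],
    ["Bob", "Carol", "Dave"],
    ["Dave", "Erin"],
    ["Erin", "Frank"],
    ["Frank", "Grace"],
]

G = nx.Graph()
for authors in papers:
    for a, b in itertools.combinations(authors, 2):
        # edge weight counts how often two people co-author
        weight = G.get_edge_data(a, b, {}).get("weight", 0) + 1
        G.add_edge(a, b, weight=weight)

# Communities of closely collaborating authors
groups = community.greedy_modularity_communities(G)
print("communities:", [sorted(g) for g in groups])

# Authors with high betweenness centrality act as bridges between communities
bridges = nx.betweenness_centrality(G)
for name, score in sorted(bridges.items(), key=lambda item: -item[1]):
    print(f"{name}: {score:.2f}")
```

The same pattern applies to citation data: replace co-authorship edges with bibliographic coupling or co-citation links to cluster related works rather than people.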

2. SHOWCASING ACHIEVEMENTS

These recommendations focus on how to share scholarly works and how to track and highlight the attention they receive. Sharing, access, online presence, and the visibility of scholarly works and researchers are key in this endeavour. They also positively influence each other: sharing increases visibility, visibility increases access, access increases impact indicators, and increased impact indicators enhance visibility. However, to fully exploit the benefits of this positive feedback loop, information on scholarly works and researchers must be complete and up-to-date. Libraries have two roles in these regards: they support knowledge building and provide resources that aid showcasing.

2A. INCENTIVIZE RESEARCHERS TO SHARE SCHOLARLY WORKS, PROMOTE ACHIEVEMENTS ONLINE & ENGAGE WITH AUDIENCES VIA OTHER MEANS THAN TRADITIONAL SCHOLARLY MEDIA

■ Know about the online platforms and audiences that are of importance to your researchers (e.g., Academia39 or Research Blogging40).

■ Provide your customers with the resources to build an online presence. If you have a CRIS, help them set up their webpage or advise them to create a blog page exploiting the CRIS information and infrastructure.

▲ Inform your customers about the various options, benefits (e.g., visibility), and pitfalls (e.g., copyright) of online social networking platforms for scholarly works and researchers, and present success stories (from your institution or the particular disciplines).

■ Train your customers to use social media and online platforms and provide online training material which discusses the issues and benefits of being present online. Collaborate with the communications/marketing department.

■ Provide an institutional repository or recommend trustworthy (inter)disciplinary repositories from third parties and provide information about Open Access journals.41 Arxiv,42 bioRxiv43 and other related services are very important as they hold a central position in the scholarly communication ecosystem (see the code sketch at the end of this subsection).

▲ Establish sharing and citation guidelines just like Open Access NL44 has done.

• Create a workflow to receive green Open Access versions of your researchers' works immediately after publication. Do not waste time: it is difficult to find the right version two years later and to remind researchers that any embargo has ended. Sherpa/ROMEO is always a trustworthy service to consult.

■ Upload posters and presentations to public and/or university repositories and link to them in profile pages. From FigShare and F1000 Posters45 to SlideShare and SpeakerDeck, there are numerous options to record researchers' talks and engage in public discussion.

39 Academia.edu, https://www.academia.edu
40 Research Blogging, http://researchblogging.org
41 Directory of Open Access Journals, https://doaj.org
42 arXiv, https://arxiv.org
43 bioRxiv, https://www.biorxiv.org
44 OpenAccess NL, http://openaccess.nl/en
45 F1000 Research – Posters, https://f1000research.com/browse/posters
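As a small aid for building researchers' online presence, the sketch below lists a researcher's arXiv preprints via arXiv's public Atom API so they can be linked from a profile page or repository record. The author query string is a placeholder and its exact syntax should be checked against arXiv's API documentation; the feedparser package is assumed.

```python
# Sketch: retrieve a researcher's preprints from the public arXiv API (Atom feed).
import urllib.parse
import feedparser  # pip install feedparser

author_query = 'au:"Peters, I"'            # placeholder author query
url = ("http://export.arxiv.org/api/query?search_query="
       + urllib.parse.quote(author_query)
       + "&start=0&max_results=20")

feed = feedparser.parse(url)
for entry in feed.entries:
    # entry.id is the stable abstract URL, e.g. http://arxiv.org/abs/XXXX.XXXXX
    print(entry.published[:10], entry.title.replace("\n", " "), entry.id)
```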

2B. ENCOURAGE RESEARCHERS TO SHOWCASE SCIENTIFIC CONTRIBUTIONS & MONITOR IMPACT TO BE AWARE OF WHO ENGAGES WITH WORK AND HOW

▲ Support and educate your researchers in how to build an online identity and discuss the pros and cons of such an approach.

■ Demonstrate how researchers showcase their achievements and what tools they use. Discuss options for self-assessment, for example in relation to the academic age, i.e. years since the PhD degree or the first academic publication. A good example for self-assessment and showcasing is the portfolio template presented by the Acumen project.46

■ Support your researchers in registering an ORCID or other author identifiers like ResearcherID,47 show them how to use them, and explain their benefits. Similarly, explain the benefits of author badges and where to get them, e.g. Mozilla Authorship Contribution Badges.48

■ Make your researchers aware of the benefits of persistent identifiers for their works and show them where to get these. There is an abundance of services for any aspect of their works, from Handles in institutional repositories to DataCite for datasets.

■ Provide your customers with complete information about their scholarly works. This requires keeping the institutional CRIS or university bibliography up-to-date (see the code sketch at the end of this subsection).

• Track online attention around scholarly works and researchers. It is easy to set up citation alerts in any of your preferred services.

• Increase the visibility (i.e. findability) of scholarly works and stakeholders of the scientific enterprise by optimizing for academic search engines from your end.49 Public databases containing information about scholarly works, researchers, and institutions should be machine-readable and findable (from websites to repositories).

46 Acumen Portfolio, http://research-acumen.eu/wp-content/uploads/MS5-ACUMEN-portfolio.pdf
47 ResearcherID, http://www.researcherid.com
48 Mozilla Authorship Contribution Badges, https://badges.mozillascience.org
49 Academic search engine optimization, http://guides.library.ucla.edu/seo
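To help keep the institutional CRIS or bibliography complete, a library can pull a researcher's public works from ORCID. The sketch below assumes ORCID's public v3.0 API and its documented JSON layout; the iD shown is a placeholder.

```python
# Sketch: list a researcher's public works from the ORCID public API (v3.0 assumed).
import requests

ORCID_ID = "0000-0002-1825-0097"  # placeholder iD
resp = requests.get(f"https://pub.orcid.org/v3.0/{ORCID_ID}/works",
                    headers={"Accept": "application/json"}, timeout=30)
resp.raise_for_status()

for group in resp.json().get("group", []):
    summary = group["work-summary"][0]                      # first summary per work group
    title = summary["title"]["title"]["value"]
    year = (summary.get("publication-date") or {}).get("year") or {}
    print(year.get("value", "----"), title)
```

The output can be compared against CRIS records to spot missing works, which keeps the positive feedback loop described above running on complete information.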

3. SERVICE DEVELOPMENT

These recommendations focus on service development by reusing existing services and tools and returning them to the community of academic libraries. Reuse and mash-ups of tools, data, methods, and services, as well as standardization and interoperability, are critical for the exhaustive provision of responsible services around scholarly metrics. This enables tailoring services to the needs of academic libraries and increases the reusability of services and tools. This is especially important for small and medium-sized libraries that lack manpower and are therefore the main beneficiaries of collaborative and open service development.

3A. JOIN FORCES WITH STAKEHOLDERS TO PROVIDE SERVICES REUSING EXISTING RESOURCES, TOOLS, METHODS & DATA

• Start with enriching existing sources and databases at your library (e.g., add scholarly metrics to the CRIS and/or institutional repository).50 A code sketch follows after subsection 3B.

• Learn about scholarly metrics via any possible means, including libguides, summer schools and product presentations.51

• Engage with the scholarly metrics community. Attend conferences, workshops and summer schools, and join mailing lists. Draw experience from existing services and tools. Tailor these services to your specific needs and capabilities. Collaborate when developing new services.52

■ Learn about the usability and functionalities of these services and tools by running pilots and sharing the findings (e.g., via blogs).53

3B. VALUE VARIOUS LEVELS OF ENGAGEMENT; FAVOUR STANDARDIZED, WELL-ESTABLISHED PRACTICES & EASY-TO-USE TOOLS

■ Find and define tasks that researchers can do by themselves, such as updating user profiles on social media platforms (the Do It Yourself level of engagement). Provide information about the goals of the tasks and how to perform them (e.g., via websites, wikis, or trainings). Provide self-explanatory tools for execution.

■ Find and define tasks that researchers or university administration and libraries should conduct together, such as planning the quantitative assessment of an individual (the We Do It Together level of engagement).

■ Find and define tasks that academic libraries should conduct by themselves, such as the assessment of individuals and the interpretation of results.

■ Standardize workflows that suit the needs of your researchers if you follow a We Do It For You mode of engagement. Reuse existing processes, methods, and data where possible.

50 Academic search engine optimization, http://guides.library.ucla.edu/seo
51 For example, http://www.metrics-toolkit.org and https://www.openuphub.eu/assess
52 For example, on the establishment of standardized bibliometrics services and workflows in libraries: Gorraiz, Wieland, & Gumpenberger (2016): Individual Bibliometric Assessment @ University of Vienna: From Numbers to Multidimensional Profiles. DOI: 10.5281/zenodo.45402
53 For example, https://thebibliomagician.wordpress.com
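As one way to start enriching repository or CRIS records (recommendation 3A), the sketch below pulls a citation count from the public Crossref REST API for each DOI. The DOIs and the contact address are placeholders; Crossref's is-referenced-by-count field is used here as a simple, openly available metric.

```python
# Sketch: enrich repository records with Crossref citation counts.
import requests

CONTACT = "library@example.org"                     # polite-pool contact, placeholder
repository_records = ["10.1108/DLP-01-2017-0004"]   # placeholder DOI (Coombs & Peters 2017)

for doi in repository_records:
    resp = requests.get(f"https://api.crossref.org/works/{doi}",
                        params={"mailto": CONTACT}, timeout=30)
    resp.raise_for_status()
    work = resp.json()["message"]
    print(doi, "cited", work.get("is-referenced-by-count", 0), "times (Crossref)")
```

Because the counts are openly retrievable, the enrichment step can be rerun on a schedule and the workflow shared with other libraries, in line with 3A's emphasis on reuse.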

3C. MAKE FULL USE OF OPEN DATA SOURCES; SUSTAIN NOT-FOR-PROFIT ENTERPRISES WITH OPEN BUSINESS MODELS; OPENLY SHARE DATA, TOOLS & SERVICES

• Find, use and provide open data and exercises including scholarly metrics54 if they prove to be consistent, reliable, sustainable, and transparent.

• Identify the license for data reuse by third parties (CC BY or CC0) and check whether the services developed in the libraries can comply with the requirements the license entails.

• Share your data and tools (e.g., library-/institution-owned data sets such as loan statistics) with an open license. Open means open to the internal community of library employees, researchers and university administrators as well as to the public. Consult the Open Definition and FAIR Data for thorough and clear interpretations of 'open'. If this is not possible, aim for maximum transparency about data collection, processing, tools used, results and interpretation.

▲ Support alternative funding models (i.e. models not based on licensing data and tools) when distributing your services. This also allows third parties to scrutinize data and methods and will act as a quality check for your data.

• Make your spending on proprietary data sources and tools publicly available to increase transparency and build awareness of the costs of quantitative analyses based on scholarly metrics. Also, be transparent about how you collect data and use tools, and for what purposes. Important institutions have prepared codes of conduct, like the NISO Data Quality Code of Conduct for Altmetrics.55

3D. LEARN ABOUT PLATFORMS BEFORE IMPLEMENTING THEM IN SERVICES & HOMOGENIZE DIFFERENT SOURCES

■ Know about rankings that might be of major importance for your researchers, such as discipline-specific journal rankings or university rankings.

▲ Demonstrate the value of different types of impact indicators in comparison to citations.

• Look for open data sets in the required disciplines and use them in addition to proprietary databases (e.g., Web of Science). RefSeer,56 Paperscape,57 PubMed,58 arXiv, RePEc59, SciELO60 and numerous others provide datasets to specialized communities.

▲ Be prepared for heterogeneous impact indicators that reflect engagement with scientific work (e.g., likes, shares, retweets, forks) and that may require separate analysis and interpretation. The same holds for the platforms that record scholarly works.

3E. WORK WITH RESEARCHERS TO BUILD AWARENESS OF BENEFITS BUT EDUCATE ABOUT WEAKNESSES OF SCHOLARLY METRICS

• Address those that might be interested in scholarly metrics services (e.g., as strategic tools for departments of the academic library and/or the university, faculties, and researchers). Learn about their goals so that you can tailor services to their needs. Consult case studies by universities such as Loughborough61 and Vienna62 for ideas.

■ Find out which level of engagement is really needed from libraries to meet the goals of their customers.

• To increase acceptance of and trust in the data underlying scholarly metrics services, engage customers in data collection and make them an integral part of the system. For example, the system should be interactive and allow for correction or improvement of the content. Reuse existing data and workflows as much as possible to keep the user burden low.

• Educate your customers about the strengths and weaknesses of output and impact indicators as a proxy for quality (e.g., Journal Impact Factor, h-index), and of data sources (e.g., citation databases such as Web of Science or Google Scholar) and altmetrics (e.g., the altmetrics bookmarklet).63 Make your explanations accessible and convincing by using in-house examples or disciplinary case studies (a worked example follows at the end of this subsection).

54 Such as the Initiative for Open Citations
55 https://www.niso.org/standards-committees/altmetrics
56 RefSeer, http://refseer.ist.psu.edu
57 Paperscape, http://paperscape.org
58 PubMed, https://www.ncbi.nlm.nih.gov/pubmed
59 RePEc, http://repec.org
60 SciELO, http://www.scielo.org
61 Loughborough University – Assessment, http://www.lboro.ac.uk/research/scholcomms/assessment/respmetrics
62 Bibliometric Services for the Administration at the University of Vienna, http://bibliothek.univie.ac.at/bibliometrie/en/fur_die_administration.html
63 Altmetric.com Bookmarklet, https://www.altmetric.com/products/free-tools/bookmarklet
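As an in-house teaching example of the kind suggested in 3E, the sketch below computes an h-index from two lists of citation counts for the same set of papers. The numbers are invented, but comparing the two results makes the indicator's dependence on the underlying data source visible, which is exactly the weakness worth explaining to researchers.

```python
# Worked example: the h-index and its sensitivity to the data source (invented numbers).
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citation_counts, reverse=True)
    return sum(1 for rank, cites in enumerate(ranked, start=1) if cites >= rank)

counts_database_a = [24, 18, 11, 7, 5, 3, 1, 0]   # citation counts from one database
counts_database_b = [31, 25, 14, 9, 8, 7, 2, 1]   # the same papers, another source

print("h-index (database A):", h_index(counts_database_a))   # -> 5
print("h-index (database B):", h_index(counts_database_b))   # -> 6
```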

3F. EXPAND YOUR PERSPECTIVE WHEN DEVELOPING SERVICES; AVOID SINGLE-PURPOSE APPROACHES IN FAVOR OF ADDRESSING THE WIDER COMMUNITY

• Start on the level of the individual to allow for easy maintenance of scholarly metrics.

■ Build modular services that can serve individual as well as customer needs (e.g., including interchangeable denominators for the calculation of impact) and that allow for different levels of complexity (a code sketch follows after subsection 3G).

3G. TEACH COLLEAGUES NEW SKILLS & LIBRARY GOALS & SERVICES; LIBRARIANS ARE MULTIPLIERS WHEN IT COMES TO PROMOTING METRICS SERVICES

▲ Provide material for self-education and provide resources for further information (e.g., mailing lists64 or blogs65).

▲ Provide local trainings for staff, provide webinars, or encourage staff to attend trainings by established institutions, such as the Scientometrics Summer School by CWTS66 or the European Summer School for Scientometrics.67

▲ Help by supporting each other (e.g., by arranging discussion groups on site).

64 We indicate two mailing lists to start with: JISC Bibliometrics, https://www.jiscmail.ac.uk/cgi-bin/webadmin?A0=LIS-BIBLIOMETRICS and ASIST SIGMETRICS, http://mail.asis.org/mailman/listinfo/sigmetrics
65 See for instance https://thebibliomagician.wordpress.com
66 CWTS Scientometrics Summer School (CS3), https://www.cwts.nl/training-education/cwts-scientometrics-summer-school
67 European Summer School for Scientometrics, http://www.scientometrics-school.eu
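To illustrate the modular, interchangeable-denominator services described in 3F, here is a minimal sketch. The data records and function names are invented for illustration; the point is that the same service can report citations per paper, per author or per year simply by swapping the denominator.

```python
# Sketch: one impact function, interchangeable denominators (invented data).
from typing import Callable, Dict, List

Publication = Dict[str, int]   # {"citations": ..., "authors": ..., "year": ...}

def impact(pubs: List[Publication],
           denominator: Callable[[List[Publication]], float]) -> float:
    """Total citations divided by an interchangeable, caller-supplied denominator."""
    return sum(p["citations"] for p in pubs) / denominator(pubs)

per_paper  = lambda pubs: len(pubs)
per_author = lambda pubs: sum(p["authors"] for p in pubs)
per_year   = lambda pubs: max(p["year"] for p in pubs) - min(p["year"] for p in pubs) + 1

sample = [  # invented records for illustration
    {"citations": 12, "authors": 3, "year": 2015},
    {"citations": 4,  "authors": 2, "year": 2016},
    {"citations": 9,  "authors": 5, "year": 2017},
]

for label, denom in [("per paper", per_paper),
                     ("per author", per_author),
                     ("per publication year", per_year)]:
    print(f"citations {label}: {impact(sample, denom):.2f}")
```

Keeping the denominator as a parameter rather than hard-coding it is one way a single service can address the wider community while still answering individual questions.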

4. RESEARCH ASSESSMENT

The following recommendations focus on the use of scholarly metrics in the context of research assessment. They apply and can be used on various levels, from the individual researcher and librarian to larger conglomerations, which have not been sufficiently elaborated on in the Leiden Manifesto.68 It is important to keep in mind that quantitative evaluation (i.e., bibliometrics) should always be accompanied by qualitative assessments such as peer review and expert recommendations. When assessing individual researchers, it is especially important that guidelines and recommendations from documents such as the Leiden Manifesto, DORA,69 the Metric Tide Report (responsible use of metrics),70 the NISO Guidelines for Altmetrics,71 and the Next-Generation Metrics Report from the EU Expert Group on Altmetrics72 be followed.

4A. ESTABLISH GOALS FOR ASSESSMENT EXERCISES BEFORE SELECTING DATABASES & METRICS; BE TRANSPARENT ABOUT USE & INTERPRETATION

▲ The level of assessment (individuals, institutions, countries, or disciplines) determines the use of databases, scholarly metrics, and indicators.

• When selecting databases, scholarly metrics and indicators, make sure to follow procedures that are standardized and grounded in research community (e.g., disciplinary) needs and perceptions.

• Conduct structured interviews with principals to receive serious information about the goals of the research assessment. Ensure the researchers' perspective is included in the process. Always ask: What is it you want to measure? Reputation (what kind?), publication activity, visibility, impact (what type of impact?), etc. Pay attention to the work of the European Commission's Expert Group on Altmetrics and to initiatives like Responsible Metrics.

• To ensure transparency and increase the credibility of the results, provide context and explain the procedure of the research assessment (e.g., goals, methods, databases, selection process, and scholarly metrics) in the final report. Also provide the raw and aggregated data used for the assessment to allow for scrutiny. Always discuss the results of the assessment with the researcher or other representatives (e.g., of the assessed department) to learn more about the context of their research and publication activities. This will prevent incorrect interpretation of raw data.

■ The scholarly metrics, indicators, and data used should correspond with what is to be measured.

4B. USE DIFFERENT DATA SOURCES TO INCLUDE VARIOUS SCHOLARLY WORKS & DISCIPLINARY COMMUNICATION & PUBLICATION CULTURES

▲ Use other data sources (e.g., patent databases) to include publication types and scholarly works that are not appropriately represented in generic citation databases.

▲ Alongside generic citation indexes and bibliographic databases (e.g., WoS Core Collection or Scopus), try to include subject-specific databases which may better cover the discipline under study.

■ Use other sources, aside from citation databases, that reflect the attention that scholarly works and researchers have received in order to achieve a more complete picture of their impact. For instance, you may record any appearance in syllabi as 'essential reading' and underline the additional impact it has had (see the code sketch at the end of this subsection).

• Add to the basket of metrics by providing scholarly metrics that can be produced and shared by your library. Get a sense of these by querying catalogues like WorldCat73 to see where works are part of library collections, or check your catalogue for circulation figures.

68 Leiden Manifesto, http://www.leidenmanifesto.org. The role of the Leiden Manifesto for libraries has extensively been discussed in Coombs, S.K., & Peters, I. (2017). The Leiden Manifesto under review: what libraries can learn from it. Digital Library Perspectives, 33(4), 324-338. DOI: 10.1108/DLP-01-2017-0004
69 San Francisco Declaration on Research Assessment, https://sfdora.org
70 Responsible Metrics, https://responsiblemetrics.org
71 NISO Guidelines for Altmetrics, https://www.niso.org/standards-committees/altmetrics
72 Expert Group on Indicators, https://ec.europa.eu/research/openscience/index.cfm?pg=altmetrics_eg
73 OCLC WorldCat, https://www.worldcat.org
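For recommendation 4B, attention data can complement citation counts. The sketch below queries Crossref Event Data for events about one DOI and tallies them per source. The DOI and contact address are placeholders, and the parameter names (obj-id, mailto, rows) and response layout follow the public Event Data query API as we understand it, so verify them against the current documentation before relying on the results.

```python
# Sketch: tally attention events per source for one DOI via Crossref Event Data (API details assumed).
import collections
import requests

DOI = "10.1371/journal.pbio.1001535"          # placeholder DOI
resp = requests.get("https://api.eventdata.crossref.org/v1/events",
                    params={"obj-id": f"https://doi.org/{DOI}",
                            "mailto": "library@example.org",
                            "rows": 1000},
                    timeout=60)
resp.raise_for_status()
events = resp.json()["message"]["events"]

# Count events per source (e.g. twitter, wikipedia, newsfeed)
by_source = collections.Counter(event["source_id"] for event in events)
for source, count in by_source.most_common():
    print(source, count)
```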

4C. RELY ON OBJECTIVE, INDEPENDENT AND COMMONLY-ACCEPTED DATA SOURCES TO PROVIDE SOUND & TRANSPARENT SCHOLARLY METRICS

• Increase the objectivity of assessments based on scholarly metrics by analysing data from various platforms. You may need to harmonize before combining data. This is because each database has a different way of counting units (e.g., what is considered a research article?).

• Remember: all databases that can be used for analyses or scholarly metrics are subjective, since they use normative selection procedures for their content. Even CRISs are subjective, since it is the researchers themselves who provide the information about their scholarly works. Hence, data from these databases need to be edited, controlled, and harmonized. Elements of subjectivity should be made transparent and considered in the analyses with scholarly metrics.

■ Use standards when building your own databases for assessments with scholarly metrics. See as an example Snowball Metrics,74 which uses source- and system-agnostic services to facilitate reliable and commonly agreed benchmarking methods.

4D. AVOID USING COMPOSITE INDICATORS & CONFLATING DIFFERENT ASPECTS OF SCHOLARLY WORKS & IMPACT; DON'T LOSE THE MULTIFACETED NATURE OF METRICS & DISTORT INTERPRETATION

▲ Collect as much attention data as possible and use (i.e. display) a variety of simple output and impact indicators to allow for a multidimensional interpretation of the results.

• Support understanding of those indicators by aiding interpretation (e.g., via discipline-based comparisons); a worked example follows below.

74 Snowball Metrics, https://www.snowballmetrics.com
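To support the discipline-based comparisons recommended in 4D, a small worked example: instead of folding everything into one composite score, each paper's citation count (invented here) is put into context by computing its percentile within its own discipline. The pandas package is assumed.

```python
# Sketch: discipline-based citation percentiles instead of a composite indicator (invented data).
import pandas as pd

papers = pd.DataFrame({
    "doi":        ["10.1/a", "10.1/b", "10.1/c", "10.1/d", "10.1/e", "10.1/f"],
    "discipline": ["history", "history", "history", "physics", "physics", "physics"],
    "citations":  [2, 5, 9, 40, 120, 15],
})

# Percentile rank of each paper among papers from the same discipline
papers["citation_percentile"] = (papers.groupby("discipline")["citations"]
                                       .rank(pct=True) * 100).round(1)
print(papers.sort_values(["discipline", "citation_percentile"]))
```

Reporting the simple count alongside its disciplinary percentile keeps the indicator multifaceted and interpretable, which is the intent of 4D.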

CONCLUSION & CALL FOR ACTION

The 16 sets of recommendations in this document were formulated by LIBER's Innovative Metrics Working Group. Our goal was to help research libraries worldwide to foster the understanding of available scholarly metrics, and to provide guidelines with respect to the increasing amounts of information regarding scholarly evaluation.

The recommendations aim to prepare librarians to take action: to enable and advance the use of scholarly metrics regardless of the level of expertise available at the institutional library.

However, these recommendations serve only as starting points for librarians and service providers. They are a first step in what the Working Group hopes will become a dialogue among LIBER members, encouraging the formation of an international community of academic librarians, metrics service providers and research support staff.

We hope that this community will share best practices, introduce and develop new tools, build and expand skills and expertise, and inspire the joint development of services for the use and application of scholarly metrics.

The Working Group encourages research libraries to jointly follow the road of responsible use and development of scholarly metrics for discovery and discoverability, showcasing scholarly achievements, service development, and research assessment.

References

Bik, H.M. & Goldstein, M.C., 2013. An Introduction to Social Media for Scientists. PLoS Biology, 11(4), p.e1001535. Available at: http://dx.plos.org/10.1371/journal.pbio.1001535 [Accessed April 15, 2018].

Coombs, S.K. & Peters, I., 2017. The Leiden Manifesto under review: what libraries can learn from it. Digital Library Perspectives, 33(4), pp.324–338. Available at: http://www.emeraldinsight.com/doi/10.1108/DLP-01-2017-0004 [Accessed April 15, 2018].

Gingras, Y., 2014. Criteria for evaluating indicators. In B. Cronin & C. R. Sugimoto, eds. Beyond Bibliometrics: Harnessing Multidimensional Indicators of Scholarly Impact. Cambridge, MA: MIT Press, pp. 109–125.

Gorraiz, J., Wieland, M. & Gumpenberger, C., 2016. Individual Bibliometric Assessment @ University of Vienna: From Numbers to Multidimensional Profiles. Available at: http://arxiv.org/abs/1601.08049 [Accessed April 15, 2018].

Gumpenberger, C., Wieland, M. & Gorraiz, J., 2012. Bibliometric practices and activities at the University of Vienna. Library Management, 33(3), pp.174–183. Available at: http://www.emeraldinsight.com/doi/10.1108/01435121211217199 [Accessed April 15, 2018].

Haustein, S., 2016. Grand challenges in altmetrics: heterogeneity, data quality and dependencies. Scientometrics, 108(1), pp.413–423. Available at: http://link.springer.com/10.1007/s11192-016-1910-9 [Accessed April 15, 2018].

Haustein, S., Bowman, T.D. & Costas, R., 2016. Interpreting "altmetrics": viewing acts on social media through the lens of citation and social theories. In C. R. Sugimoto, ed. Theories of Informetrics and Scholarly Communication. De Gruyter, pp. 372–406. Available at: http://arxiv.org/abs/1502.05701.

National Information Standards Organization, 2016. Outputs of the NISO Alternative Assessment Metrics Project. Available at: http://www.niso.org/apps/group_public/download.php/17091/NISO RP-25-2016 Outputs of the NISO Alternative Assessment Project.pdf.

Peters, I. et al., 2014. Altmetrics for large, multidisciplinary research groups: Comparison of current tools. Bibliometrie - Praxis und Forschung, 3(0). Available at: http://www.bibliometrie-pf.de/article/view/205 [Accessed April 15, 2018].

Wilsdon, J. et al., 2017. Next-generation metrics: Responsible metrics and evaluation for open science. Brussels. Available at: https://publications.europa.eu/en/publication-detail/-/publication/b858d952-0a19-11e7-8a35-01aa75ed71a1/language-en [Accessed April 15, 2018].

You may download the citations of this reference list at: https://www.zotero.org/groups/2197598/liber_metrics

Credits

This report and list of recommendations has been compiled by LIBER's Innovative Metrics Working Group.

• Isabella Peters, ZBW Leibniz Information Centre for Economics, Germany
• Sarah Coombs, Saxion Library, The Netherlands
• Dr Birgit Schmidt, Göttingen State and University Library, Germany
• Isidro F. Aguillo, Cybermetrics Lab, Spain
• Alenka Princic, Delft University of Technology Library, The Netherlands
• Marc Martinez, Université Jean Moulin - Lyon 3, France
• Dr Peter Kraker, Open Knowledge Maps, Wien, Austria
• Najko Jahn, Göttingen State and University Library, Germany
• Dr Stefanie Haustein, School of Information Studies, University of Ottawa, Canada
• Nathalie Cornée, London School of Economics and Political Science, UK
• Dr Kasper Abcouwer, University of Amsterdam, The Netherlands
• Kim Holmberg, University of Turku, Finland
• Ania López, Duisburg-Essen University Library, Germany
• Juan Gorraiz, Vienna University Library, Austria
• Ellen Fest, Wageningen UR Library, The Netherlands
• Dr Giannis Tsakonas, Acting Director, Library & Information Center, University of Patras, Greece

PHOTOS
Cover Image - Stefan Steinbauer, CC0
Table of Contents - Patricia Serna, CC0
P. 8 - Cantonal and University Library of Lausanne, CC BY
P. 10 - Tracy Le Blanc, CC0
P. 12 - Kristians Luhaers, CC BY
P. 14 - Stefam Stefanick, CC0
P. 18 - Kasper Rasmusse, CC0
P. 22 - Kaitlyn Baker, CC0
P. 24 - Max van den Oetelaar, CC0
P. 25 - Pixabay, CC0
P. 27 - Göttingen State and University Library, CC BY

The products and services mentioned herein are indicative and should not be conceived as the only ones suitable. LIBER's Innovative Metrics Working Group is not promoting any product or service and has no conflicts of interest. To the knowledge of its members, the mentioned platforms reflect easy, accessible and affordable tools to support and generate scholarly metrics services.

This publication is distributed under the terms of the Creative Commons Attribution 4.0 International Public License, which permits any use, distribution and reproduction, as long as the original author(s) and source are credited.
