Best Practices for Google Analytics in Digital Libraries
Authored by the Digital Library Federation Assessment Interest Group, Analytics working group

Molly Bragg, Duke University Libraries
Joyce Chapman, Duke University Libraries
Jody DeRidder, University of Alabama Libraries
Rita Johnston, University of North Carolina at Charlotte
Ranti Junus, Michigan State University
Martha Kyrillidou, Association of Research Libraries
Eric Stedfeld, New York University

September 2015

The purpose of this white paper is to provide digital libraries with guidelines that maximize the effectiveness and relevance of data collected through the Google Analytics service for assessment purposes. The document recommends tracking 14 specific metrics within Google Analytics and provides library-centric examples of how to employ the resulting data in making decisions and setting institutional goals and priorities. The guidelines open with a literature review, and also include theoretical and structural methods for approaching analytics data gathering, examples of platform-specific implementation considerations, Google Analytics set-up tips and terminology, as well as recommended resources for learning more about web analytics. The DLF Assessment Interest Group Analytics working group, which produced this white paper, looks forward to receiving feedback and additional examples of using the recommended metrics for digital library assessment activities.

Table of contents

Section I: Introduction and Literature Review
  A. Introduction
  B. Literature Review
Section II: Google Analytics Prerequisites
  A. Learn About Google Analytics Practices and Policies
  B. Understand Local Digital Library Infrastructure
  C. Goal Setting
Section III: Recommended Metrics to Gather
  A. Content Use and Access Counts
    1. Content Use and Access Counts Defined
    2. Site Content Reports
    3. Bounce Rate
    4. Download Counts
    5. Time
    6. Pageviews
    7. Sessions
  B. Audience Metrics
    1. Location
    2. Mode of Access
    3. Network Domain
    4. Users
  C. Navigational Metrics
    1. Path Through the Site
    2. Referral Traffic
    3. Search Terms
Section IV: Additional Metrics and Custom Approaches
  A. Dashboards and Custom Reports
  B. Event Tracking
  C. Goals and Conversions
Section V: Examples of Platform-specific Considerations
  A. CONTENTdm
  B. DLXS
Section VI: Tips for Google Analytics Account Setup
  A. Terminology
    1. Properties
    2. Views
Section VII: Further Resources on Google Analytics
Section VIII: Conclusions and Next Steps
Appendix: Other Methods for Collecting Analytics Data
Bibliography

Section I: Introduction and Literature Review

A. Introduction

Libraries invest significant resources in providing online access to scholarly resources, whether through digitization, detailed cataloging and metadata production, or other methods. Once digital materials are published online, resource managers can take a number of approaches to assessing a digital library program. Qualitative methods include web usability studies, focus groups, surveys, and anecdotal information gathering. Quantitative methods to assess digital libraries often center on tracking information about website visitors and their actions through some means of web traffic monitoring, also known as web analytics.
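For readers unfamiliar with how this kind of traffic monitoring is wired into a site, the sketch below illustrates the page-level tracking calls a digital library page typically makes with Google's Universal Analytics (analytics.js) library. It is a minimal illustration under stated assumptions, not a prescribed implementation: it assumes Google's standard analytics.js loader snippet has already been added to the page, and the property ID, event category, action, label, and file path shown are placeholders.

```typescript
// Minimal sketch of Universal Analytics (analytics.js) page tagging.
// Assumes Google's standard loader snippet has already run and defined `ga`;
// the property ID, event category/action/label, and path are placeholders.
declare function ga(...args: unknown[]): void;

// Create a tracker bound to the library's Google Analytics web property.
ga('create', 'UA-XXXXXXX-1', 'auto');

// Record a pageview for the current collection or item page.
ga('send', 'pageview');

// Record a file download as an event (direct file links do not trigger
// pageviews), e.g. when a user clicks through to a PDF derivative of an item.
ga('send', 'event', 'Downloads', 'PDF download', '/items/example-item.pdf');
```

Once calls like these are in place on every page, the reports discussed in Section III (pageviews, sessions, bounce rate, and so on) are populated automatically in the Google Analytics interface; custom events of the kind shown in the last call are covered in Section IV.B.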
This paper will focus on best practices for collecting and using web analytics data in digital libraries, specifically data gathered through Google Analytics.1 While this paper focuses on web analytics, collecting any type of assessment data and using it to inform decision-making can:

● increase understanding of the return on the investment of digital libraries;
● provide more information about users, use, cost, impact, and value;
● help guide improvement of digital library services and the user experience;
● assist in decision-making and strategic focus.

No single type of data or technique can assess every aspect of a digital library; analytics are a single piece of a larger assessment puzzle. This document is intended for digital library managers and curators who want to use analytics to understand more about users of, access to, and use of digital library materials. The Digital Library Federation Assessment Interest Group (DLF AIG) uses Matusiak’s definition of a digital library as “the collections of digitized or digitally born items that are stored, managed, serviced, and preserved by libraries or cultural heritage institutions, excluding the digital content purchased from publishers.”2 The authors hope to pave the way for cross-institutional resource managers to share benchmarkable and comparable analytics, and they intend for the information in this paper to evolve over time as more institutions use and enhance these guidelines.

We chose to limit our scope to Google Analytics because many libraries use this tool and because the task needed a defined scope to be attainable.3 It is important to be aware, however, that changes in technology may affect the usefulness of any tool -- including Google Analytics -- in the future. If there is enough community interest and enough volunteers, other web analytics services could be considered for inclusion after the Digital Library Federation 2015 Forum. An overview of other methods for collecting web analytics data can be found in the appendix of this document.

1. “Google Analytics,” accessed August 4, 2015, http://www.google.com/analytics/.
2. Matusiak, K. (2012). Perceptions of usability and usefulness of digital libraries. International Journal of Humanities and Arts Computing, 6(1-2), 133-147. DOI: http://dx.doi.org/10.3366/ijhac.2012.0044.
3. Over 60% of all websites use Google Analytics: see “Piwik, Privacy,” accessed September 16, 2015, http://piwik.org/privacy/.

This document was authored by the Analytics working group of the DLF AIG. The DLF AIG was formed in the spring of 2014.4 The group arose from two working sessions that took place at the 2013 DLF Forum: “Determining Assessment Strategies for Digital Libraries and Institutional Repositories Using Usage Statistics and Altmetrics”5 and “Hunting for Best Practices in Digital Library Assessment.”6 The first session focused on determining how to measure the impact of digital collections; developing areas of commonality and benchmarks in how the community measures collections across various platforms; understanding the cost and benefit of digital collections; and exploring how such information can best be collected, analyzed, communicated, and shared with various stakeholders. The second session set out to gauge interest in a collaborative effort to build community guidelines for best practices in digital library assessment.
The two working sessions were well attended, and the group leaders formed the DLF AIG to foster ongoing conversation. In the fall of 2014, volunteers from the DLF AIG formed four working groups around citations, analytics, cost assessment, and user studies. The primary purpose of each working group is to develop best practices and guidelines that anyone can use to assess digital libraries in that group’s particular area; the white papers and other products of these working groups can be found on the DLF wiki.7 The Analytics working group is composed of library staff from around the United States working in the fields of digital programs, assessment, and electronic resources. The authors of this document are:

● Molly Bragg (Co-coordinator of the working group, Digital Collections Program Manager, Duke University Libraries)
● Joyce Chapman (Co-coordinator of the working group, Assessment Coordinator, Duke University Libraries)
● Jody DeRidder (Head of Metadata and Digital Services, University of Alabama Libraries)
● Rita Johnston (Digitization Project Librarian, University of North Carolina at Charlotte)
● Ranti Junus (Electronic Resources Librarian, Michigan State University)
● Martha Kyrillidou (Senior Director, Statistics and Service Quality Programs, ARL)
● Eric Stedfeld (Project Manager/Systems Analyst, New York University)

4. See Joyce Chapman’s blog post “Introducing the New DLF Assessment Interest Group,” posted May 12, 2014, http://www.diglib.org/archives/5901/.
5. “Determining Assessment Strategies for Digital Libraries and Institutional Repositories Using Usage Statistics and Altmetrics,” accessed August 4, 2015, http://www.diglib.org/forums/2013forum/schedule/21-2/.
6. “Hunting for Best Practices in Digital Library Assessment,” accessed August 4, 2015, http://www.diglib.org/forums/2013forum/schedule/30-2/.
7. The DLF Assessment Interest Group wiki can be found at http://wiki.diglib.org/Assessment. As of the DLF 2015 annual meeting, the citations, users and user studies, and analytics working groups have each produced a white paper, and the cost assessment working group has defined digitization processes for data collection and created a digitization cost calculator using data contributed by the community. See each group’s individual wiki page for links and details.

The group began its work by performing a literature review and defining the types of audiences, content, and metrics pertinent to digital library assessment. The group then refined a list of core metrics to recommend for baseline collection in a digital library program. In this paper, each metric includes a definition and an explanation of its importance, as well as library-centric examples of how to work with the metric in Google Analytics. This document was distributed to the larger DLF AIG for feedback and comments in two drafts in July and August 2015. The white paper