Building the Assessment Librarian Guildhall: Criteria and Skills for Quality Assessment
METRICS

The Journal of Academic Librarianship xxx (2013) xxx–xxx

Megan Oakleaf
326 Hinds Hall, Syracuse University, Syracuse, NY 13244, USA
E-mail address: [email protected]

0099-1333/$ – see front matter © 2013 Elsevier Inc. All rights reserved. http://dx.doi.org/10.1016/j.acalib.2013.02.004

Keeping up on the "latest and greatest" in library assessment isn't easy. Basic strategies for being in-the-know include scanning the professional literature or attending conferences. Professional journals like the Journal of Academic Librarianship, College & Research Libraries, and portal: Libraries and the Academy are great venues for library assessment publications, and the Library Assessment Conference (LAC) is arguably the best professional meeting for learning about current assessment projects taking place in academic libraries nationwide. A few months ago, as an attendee at my fourth LAC, I was excited to note the increase in attendance at this conference, both in the variety of librarian roles represented and in the total number of participants. This growth indicates that a community of "assessment librarians" (or librarians who engage in assessment) is fast developing in academic libraries.

THE LIBRARY ASSESSMENT GUILD

In his LAC keynote titled "Living in the Cloud: Who Owns It, Who Pays for It, Who Keeps it Safe, and Will My Kids Inherit the Wind," John Lombardi (former president of the University of Florida and the Louisiana State University system as well as past chancellor of the University of Massachusetts—Amherst) described higher education as a "guild" structure in which faculty communities form the core of the institution. There are history guilds, biology guilds, and art guilds, with each guild "defin[ing] itself in terms of the intellectual methodology that its members apply to their field of study…. The guild's definition of standards based on these methods and the evaluation of quality based on the standards are what define the guild's responsibility. Members of the guild must meet these academic and methodological standards, or the guild will not recognize the validity of their work. As has been the case for all guilds since medieval times, the methodological standards guarantee that the members' products meet guild criteria" (Lombardi, Craig, Capaldi, Gater, & Mendonça, 2001, p. 5).

At LAC, Lombardi's comments were limited to disciplinary teaching and research faculty; he did not mention a "librarian guild" or an "assessment guild" in his keynote. Even so, his talk inspired me to think about potential quality criteria for our burgeoning library assessment guild. His remarks framed my thinking about the rest of the conference; I set out to capture a sense of the current quality criteria developing in the library assessment community. By the end of the conference, I noted three overarching quality criteria that may guide library assessment guild practices: 1) an emphasis on value, 2) the use of the "right" tools and data, and 3) the generation of decisions, actions, and communications based on assessment results. Certainly, this is a short and simple list, and I do not intend it to be exhaustive or prescriptive. I offer it only as a means to begin a discussion amongst our guild members about what indicators, themes, or standards we may use to guide our pursuit of quality library assessment.

EMPHASIZE VALUE

Increasingly, compelling library assessment focuses on the existing and potential value of academic libraries. Academic library value, in this context, is derived from the degree of alignment between the library's services, expertise, and resources and an institution's focus areas (Oakleaf, 2012a). Quality library assessments that fit this criterion investigate the ways in which library services, expertise, and resources (e.g., instruction, reference, data curation, collections, facilities) impact, contribute to, affect, influence, relate to, cause, determine, or help an institution achieve its focus areas (e.g., strategic priorities, missions, financial resources or commitments, and stakeholder needs) (Oakleaf, 2012b). Institutional focus areas might include:

• Student recruitment
• Student retention, completion, graduation
• Student career success
• Student grades, test achievement
• Student learning outcomes
• Student experience, engagement
• Student–faculty academic rapport
• Alumni lifelong learning
• Faculty recruitment
• Faculty tenure, promotion
• Faculty teaching
• Faculty service
• Faculty research productivity
• Faculty grant seeking
• Faculty patents, technology transfer
• Faculty innovation, entrepreneurship
• Institutional prestige
• Institutional affordability
• Institutional efficiencies
• Institutional accreditation, program review
• Institutional brand
• Institutional athletics
• Institutional development, funding, endowments
• Local, global workforce development
• Local, global economic growth
• Local, global engagement, community building, social inclusion
• And so on (Oakleaf, 2012a).

According to this "library value" criterion, quality library assessment 1) focuses on measures and methods, 2) uses appropriate assessment designs (i.e., longitudinal, cohort, control, sampling, etc.), 3) employs or generates evidence and data, 4) involves partners, collaborators, and stakeholders, 5) seeks funding, and 6) disseminates results—all to provide actionable information about the connections between libraries and areas of institutional focus in an effort to demonstrate library value. A great example of this criterion shared at LAC is the University of Minnesota Libraries' investigation of connections between library use and academic achievement and retention. At LAC, this work won recognition in the form of a poster titled "Making Use of What You are Already Collecting"; it is also due to be published in article format soon (Soria, Fransen, & Nackerud, 2013).

USE THE "RIGHT" TOOLS & DATA

For the second library assessment guild criterion, the focus is three-fold. First, using the "right" tools and data means ensuring a good match between 1) the assessment strategy or evidence and 2) a given assessment purpose, need, or question. For example, librarians may seek to investigate potential connections between user interactions with reference services and the acquisition of information literacy skills. In this case, user satisfaction or service quality surveys would not be the "right" tools to ascertain what users have or have not learned from their reference interactions, because users' degree of satisfaction or perception of reference service quality are not valid, accurate, or direct measures of information literacy learning. Whenever possible, assessment tools and data should be fully in-

Anticipated limitations may be appropriate to include in documentation completed to ensure research ethics are followed (i.e., human subjects review), and final limitations should be clearly described to facilitate peer review of resulting publications and presentations.

Third, library assessment is most powerful when it embraces the use of specific, exact, individualized data. Academic library value and impact exist on an individual level. In other words, the difference that academic libraries make in the lives of users typically happens one user at a time. Consequently, the group-level data collected by many libraries is less useful than data that focuses on individual experiences and behaviors as well as the impact of those experiences and behaviors on the lives of individual users. Furthermore, while individual-level data can be aggregated, group data is generally very difficult to disaggregate. Thus, the collection of specific, exact, individualized data is critical to establish the quality of many library assessment studies.

GENERATE DECISIONS, ACTIONS, AND COMMUNICATIONS

Library assessments that do not lead to decisions, actions, and communications with stakeholders are not worth doing; therefore, the basic utility of library assessment deserves its own quality criterion. Assessment is, by nature, iterative and cyclical (Oakleaf, 2009, 2011). In each assessment cycle, some result is attained. Hopefully, the results are revealing, interesting, or even inspiring. Other times, the results may be confusing, faulty, redundant, or otherwise frustrating. Either way, assessment results should be used; results may help librarians decide to act in a variety of ways (e.g., make changes, generate improvements, maintain successes, increase or decrease services or resources, recommend actions to be taken by others, or improve subsequent assessments). They also should be clearly conveyed to all relevant stakeholders (e.g., students, parents, faculty, administrators, or other librarians) using shared language that is understandable to all. The importance of communicating assessment results was the focus of a concurrent session at LAC led by librarians at the University of Washington, the University of Virginia, and the Orbis Cascade Alliance as well as an "affinity lunch discussion" session. Ideally, the decisions,