
planning the future on the horizon

EIFL General Assembly 2017 Tbilisi, Georgia

Arnold Hirshon
Associate Provost & University Librarian, Case Western Reserve University

last year: the library ecosystem

[ecosystem diagram: educational factors — institutional functions, learning & resources, budget, staffing; library roles — Learning Spaces (facilities), Research Services, Content, Technology Services]

Learning:
• student engagement • outcomes-based learning & assessment • critical thinking • digital literacy • knowledge contexts • active learning

Spaces (facilities):
• collaborative & active learning spaces • quiet study • inviting spaces • leisure & relaxation • innovation hubs & makerspaces

Research Services:
• digital scholarship services • digitization stations • new media • visualizations & infographics • data integration/management • self- […] • compliance • GIS & location services • data analytics

Content:
• info discovery systems • e-books & e-journals • open content • digital repositories (selective) • purchase on demand • special collections

last year's takeaways: what has changed?
(items in red are ones that I discussed in more detail last year)

Research:
• Growth of data policies and data management plans
• Changing relationships with faculty

Content & information discovery:
• Increasing streaming and on-demand services
• Better accessibility of research content & more competition among discovery system options
• Users becoming content providers, and libraries becoming publishers
• Need clear collection strategies to drive decision-making about format, delivery, and access

Digital scholarship & strategies:
• Digital scholarship centers are expanding
• More data analysis, visualization, use of geographic information systems and mapping
• Libraries as makerspaces

Technology:
• […] is having increased influence
• Uncertain futures for institutional repositories

Learning:
• Expanded use of open educational resources
• Libraries as learning environments
• Radical redesign of library information literacy programs: adaptive, micro, and mobile

Planning, Organizational Design, & Assessment:
• Greater focus on the user experience
• Developing cultures of innovation, experimentation (e.g., design thinking), and assessment
• Increased use of strategic metrics and altmetrics
• Rethinking the library's organizational design and librarian roles

Library Spaces:
• Redesigning library spaces

imagining the future isn't always easy

what would we have imagined the iPhone would look like before it came out 10 years ago?

but creating the future is possible with the right approaches and tools

where should you look to create the future?

in the beautiful clouds? at the horizon?

on the ground?

the imperative today is to reorient the focus of library planning

it is not about us; it should be about what our clients need to achieve. Students and faculty don't focus on changes in the library world; they care about what is changing in their world of research and learning.

[diagram: What are the changes in libraries (content? space? technology?) and in the university and society in general? How should the library respond, for Research and for Learning, through its content, facilities, and services?]

how do we define global success in higher education?
• Quality of research output • […] per faculty • Academic reputation • Quality of faculty • Quality of teaching • Industry income • International outlook & faculty & student ratios • Employer reputation • Faculty/student ratio

university success:
• Recruitment of the best students • Student retention rate • Student graduation rates • Strong job placement

so what is the library's role? library success:
• Operational cost efficiency (e.g., savings) • Number of transactions (e.g., # of items circulated or downloaded)

are these really the best library success metrics? are they well-aligned with institutional success?
• operational cost efficiency (e.g., savings)
• number of transactions (e.g., # of items circulated or downloaded, gate counts)

what library success metrics would better demonstrate the benefits and results the university realizes from its investment in the library? ???

traditional measures of value in academe

• research productivity & impact
• publications: number and impact
• grant funding
• results of funded grants

• rankings: national and international
• institutional reputation & prestige
• surveys: by students and external reviewers
• teaching quality

how do you measure "global success" internationally in higher education? what does global success mean today?

ARWU's World Class Universities (WCU) Conference Questions

• What are a WCU's ultimate goals, missions, roles and purposes?
• What are the global common goods of WCUs, and what should WCUs create to identify, regulate, measure and finance global common goods in education and knowledge?
• How can WCUs balance their roles in developing global common goods, national contributions, and local engagement to achieve global sustainability?
• How can WCUs contribute to enhancement and innovation in lifelong learning?

where can or should libraries integrate themselves into this new ecosystem?

international indexing services: ranking criteria and weights

Category           | QS                                 | THE                          | ARWU
REPUTATION         | Academic Reputation 40%            | Reputation 20%               | Nobel Prize & Fields Medal winners 20%; Highly cited researchers 20%; Per Capita Acad. Perf. 10%
RESEARCH           | Citations per faculty (Scopus) 20% | Citations 30%; Research 30%  | Papers in Nature and Science 20%; Papers indexed in WoS (SCI and SSCI) 20%
TEACHING RELATED   | Faculty/Student Ratio 20%          | Teaching 30%                 | Quality of Education 10%
INDUSTRY RELATED   | Employer Reputation 10%            | Industry income 2.5%         |
INTERNAT'L RELATED | International Faculty Ratio 5%; International Student Ratio 5% | International Outlook 7.5% |

charting the changing library metrics for being invaluable

focus of attention: internal vs. external

Internal (operational effectiveness; organizational efficiency):
• improve work timeliness and quality
• save time
• decrease costs
• improve productivity

External (make a difference: meaningful results; quality, image and reputation):
• improve research quality
• improve teaching quality
• improve student retention and the employment potential of graduates
• improve stakeholder & sponsor relations

How do we change? Harness strategic intelligence to shift from the "what" of measuring input and activity, and basic assessments, to the "why" we do it, by using data analytics to assess outcomes and value.
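Reading the ranking-weights table another way: each service's composite score is simply a weighted sum of normalized indicator scores. A minimal sketch using the QS weights from the table; the indicator scores and the 0-100 scale are illustrative assumptions, not taken from any actual ranking data.

```python
# Hedged sketch: composite ranking score as a weighted sum of indicators.
# Weights are the QS weights from the table above; the example scores
# are hypothetical (real rankings also normalize each indicator first).

QS_WEIGHTS = {
    "academic_reputation": 0.40,
    "employer_reputation": 0.10,
    "faculty_student_ratio": 0.20,
    "citations_per_faculty": 0.20,
    "intl_faculty_ratio": 0.05,
    "intl_student_ratio": 0.05,
}

def composite_score(indicators, weights):
    """Weighted sum of indicator scores (each on a 0-100 scale)."""
    return sum(weights[name] * indicators[name] for name in weights)

# A hypothetical institution scoring 80/100 on every indicator:
example = {name: 80.0 for name in QS_WEIGHTS}
print(round(composite_score(example, QS_WEIGHTS), 6))  # -> 80.0
```

The point for library planning is that only a few of these indicators (citations, research output) are ones the library can directly influence.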

new metrics should link library value and performance to:
• faculty teaching performance
• student learning performance
• faculty research performance

strategic intelligence and traditional assessment: what is the difference?

Strategic Intelligence: a comprehensive process to transform an organization by collecting and analyzing data to determine strategic plans, and to measure progress against those plans.
• Creates benchmarks to determine how responsive and predictive the organization is of the needs of stakeholders, sponsors, and funders
• Articulates expected value
• Allocates resources to achieve its plans

Assessment: end-of-process measurements of an organization's performance against a few pre-established criteria.
• Measures: activity, performance, efficiency, operational improvement
• May inform some decisions (e.g., what to stop doing or to de-emphasize)

the library role on the new roadmap of institutional success: fostering institutional innovation and transformation

• To develop strategy and improve intelligence
  o Analyze key predictors of success
  o Conduct benchmark testing
  o Visually display information (infographics)
• To improve institutional reputation
  o Document institutional name variants
  o […] analysis and improvement
  o Use alt-metrics
  o Support faculty ORCID iD registration
  o Educate faculty on the importance of journal impact, and provide assistance to ready their articles to improve the likelihood of publication

• To ensure the library has good working partnerships with other university offices

• To develop and foster a productive organizational culture within the library

Arnold Hirshon, ORCID iD: orcid.org/0000-0002-1321-6041

ORCID iD: a critical factor for individual faculty and institutional success

Publication history is one of the few consistently important factors in the hiring of faculty

Nathan L. Vanderford and Charles B. Wright, Nature Biotechnology. www.nature.com/nbt/index.html?foxtrotcallback=true#-and-comment

how to help faculty, and the university, succeed?

help faculty play and win the tenure and promotion (T&P) game

teaching faculty the T&P rules of the game in six short sessions:

• Introduction to scholarly publishing success. Increase your scholarly reputation by understanding the established academic quantitative measures, e.g., author h-index (impact factor) numbers, alt-metrics (alternative metrics), citation counting, and tracking tools (e.g., Web of Knowledge/InCites).

• Essential research services, tools, and resources. Make your research process easier and more productive, including research consultation and interlibrary loan options.

• Digital scholarship: what is it, how can you use it? Hardware and software for data creation, production, organization, visualization, presentation, management, and preservation.

• Leverage your rights as an author. Publisher contracts and restrictions, options to retain your copyright, embargoes, open licensing, agreements.
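As an aside on the quantitative measures named in the first session: the author h-index is the largest h such that the author has h papers each cited at least h times. A minimal sketch; the citation counts below are made-up example data, not from any real author profile.

```python
# Hedged sketch: computing an author's h-index from per-paper citation
# counts. The example counts are invented for illustration.

def h_index(citations):
    """Return the largest h such that h papers have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

papers = [25, 8, 5, 3, 3, 1, 0]  # hypothetical citation counts
print(h_index(papers))  # -> 3 (three papers with at least 3 citations each)
```

Tools such as Web of Knowledge/InCites report this figure automatically; the sketch just shows what the number means.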

• Where to (or not to) publish. Comparison of commercial, university-based, trade and […] publishers, impact factors, and predatory and unscrupulous publishers.

• FINISH! Marketing your scholarship and yourself. Author IDs, profiles and registries (e.g., ORCID), research network portals and sharing hubs (e.g., ResearchGate, FigShare).

what role can the library play in gathering evidence to generate new insights and provide the institution with strategic intelligence?
• improve institutional and individual faculty research citation and impact analysis
• document institutional name variants
• support faculty ORCID iD registration
• create faculty profiles and expertise databases

a few updates on some trends from last year

Horizon Report 2017: Library Edition*
1. Find new ways to organize and disseminate research to make it easier to discover, digest, and track information
2. Use new media (video, visualization, augmented reality) to store and publish data, scholarly records, & publications
3. Pursue open access & educational resources to address financial constraints, combat rising costs, and expand access
4. Create flexible spaces that balance independent/quiet study versus collaboration/active learning
5. Employ user-centric design and continuously collect user data, to make the library the learning destination
6. Lead efforts to develop digital fluency and citizenship
7. Uphold information privacy and intellectual freedom, and advocate against policies that undermine public interests
8. Reimagine organizational structures to be agile and matrixed to advance innovative services and operations
9. Employ digital scholarship methods and tools to preserve and mine content, and to enable collaboration
10. Personalize the library using artificial intelligence & Internet of Things technologies

* note: this report focuses primarily on technologies for learning, and not on research. http://cdn.nmc.org/media/2017-nmc-horizon-report-library-EN.pdf

Digital Fluency (NMC Horizon Library Report 2017): moving beyond "information literacy" and "digital literacy" to understanding research methods in an age of digital scholarship

"Simply understanding how to use a device or certain software is not enough; people must be able to make connections between the tools and the intended outcomes, leveraging technology in creative ways that allow them to more intuitively adapt from one context to another."

• Ensure the mastery of responsible and creative technology use, including online identity, communication etiquette, and rights and responsibilities.
• As hubs of information literacy and discovery, libraries are integral in advancing this mission, working with campus leaders, faculty, and staff to embed digital fluency more deeply in teaching and learning.

http://cdn.nmc.org/media/2017-nmc-horizon-report-library-EN.pdf

commercial publishing: Elsevier as a data company

• As scientists increasingly embrace open access, publishers must replace old revenues and create new product differentiators
• Elsevier has collected a lot of information on how scientists use data and their search preferences
• Elsevier's acquisition of BePress, Hivebench, Pure, and SSRN means it can now offer a complete range of data-related research workflow services, starting at initial study design and ending with post-publication impact assessments
• Elsevier has the money, experience, access, and financial patience to get and retain researcher interest
• Open access advocates see an urgent need to create alternatives to Elsevier, but funding is thin, and funders tend to move slowly

"Elsevier appears well ahead of the pack in creating a network of products that scientists can use to record, share, store, and measure the value to others of the surging amounts of data they produce." – Paul Basken

Elsevier is creating a "suite of services" that could "lock in scientists to a research workflow no less powerful than the strength of the lock-in libraries have felt to 'big deal' bundles." – Roger C. Schonfeld

Paul Basken. http://www.chronicle.com/article/Elsevier-Is-Becoming-a-Data/240876

2.5% open access: David Lewis' 2.5% Proposal to create an Open Scholarly Commons (OSC)

• Proposal distributed shortly after Elsevier announced its acquisition of BePress
• The 2.5% formula: total contribution to the commons ÷ total library budget
• Types of organizations included for collective funding under this proposal:
  o software projects that support the OSC, e.g., DSpace, Fedora, Hyku, the Open Journal System, ArchivesSpace, Islandora
  o disciplinary repositories, e.g., ArXiv, bioRxiv, Humanities Commons
  o large open content repositories, e.g., HathiTrust
  o tools, e.g., Wikipedia, VIVO, Open Access Button, Unpaywall
  o open educational resources, e.g., OpenStax
  o other related organizations, e.g., DuraSpace, Center for […], […] Project, Open Textbook Network, Impactstory, ORCID, Creative Commons
  o preservation organizations, e.g., Digital Preservation Network, Academic Preservation Trust
  o advocacy organizations, e.g., SPARC
• Implementation: create peer pressure among libraries, and convince university administrators of the importance of supporting this for the common good.
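The 2.5% formula above can be sketched in code: a library's target contribution to the commons is 2.5% of its total budget. The budget figure and the pro-rata split across organization types below are illustrative assumptions of mine, not part of Lewis' proposal.

```python
# Hedged sketch of the 2.5% formula: target contribution = 2.5% of the
# total library budget. The split across organization types is a
# hypothetical pro-rata scheme, not specified in the proposal.

COMMONS_SHARE = 0.025  # contribution / total library budget

def target_contribution(total_library_budget):
    """Dollar amount a library would commit under the 2.5% formula."""
    return round(total_library_budget * COMMONS_SHARE, 2)

def allocate(contribution, weights):
    """Split one library's contribution across commons organizations
    in proportion to hypothetical priority weights."""
    total = sum(weights.values())
    return {org: contribution * w / total for org, w in weights.items()}

budget = 10_000_000  # example: a $10M total library budget
contribution = target_contribution(budget)
print(contribution)  # -> 250000.0
print(allocate(contribution, {"software": 2, "repositories": 1, "advocacy": 1}))
```

As the questions on the next slide show, how such an allocation would actually be weighted and administered is exactly what the proposal leaves open.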

David Lewis. https://scholarworks.iupui.edu/handle/1805/14063

2.5%: the proposal spawned a lot of online chatter, but the devil will be in the details

• Strategy: What are the strategic goals of this proposal? Which organizations have the best success record? Will there be a weighting for priority funding? Are all organizations ongoing? Where are the opportunities to pilot the system and demonstrate quick and early success? What are the measures of success for the programs and the OSC as a whole? Is this startup or sustainability funding? What if an organization needs to expand or should contract?
• Administration: Will there be a host organization, or will a separate NGO be formed to manage the funds? How, and by whom, will the funding priorities be defined, e.g., to determine to which projects or organizations we would allocate the funds? How will expenses be monitored, and progress reported? Are all of the organizations able to scale up their services if a very large number of libraries suddenly contributed to the fund?
• Funding: What is the 2.5% based upon? Might it need to be 1.0% or 10%? Does the 2.5% replace, or is it on top of, current institutional commitments to the individual organizations? What is the formula and method for divvying up and distributing funds from the central fund to individual organizations? What do libraries receive for their contributions (e.g., currently some, like HathiTrust and DPN, have a pay-to-play model and provide specific services in return, while others, like SPARC, advocate for the common good)? Will libraries that contribute <2.5% receive fewer benefits? Can non-U.S./non-N.A. libraries contribute and participate? How much time will a program or organization be given to prove its value and viability before it would no longer be eligible for funding?

Asking these questions is not to imply that the proposal should not be pursued, but only that there is much to be considered before the idea is ready for launching.

2.5%: next steps

• There is now an open document in which people have been adding to and debating what ought to count as an investment in "open" at http://bit.ly/scholarly_commons
• A tool asking libraries to indicate their current investment levels in each organization will be launched on 6 October, and libraries can sign up now to participate at https://docs.google.com/forms/d/e/1FAIpQLScQrJgqP8T8chMhzKQNA-9WUNNl3gk2WqOftIaJ75x6iuJ9iw/viewform
• A discussion session to consider the proposal will occur at the December meeting of the Coalition for Networked Information

open access: 'predatory' and 'unscrupulous' journals

‘Predatory’ journals prɛdət(ə)ri ˈdʒəːn(ə)ls (noun):

Journals that typically charge authors a fee while offering little in the way of editing and research quality control. Predatory publishers sometimes create a title for a journal that is very similar to an established publication to try to trick researchers.

'predatory' journals: the downside
• Jeffrey Beall, of the University of Colorado at Denver, had been publishing a list of more than 1,000 predatory or dubious scientific journals
• In 2013, OMICS Publishing Group threatened Beall with a $1B suit because its journals were on his list
• The Chronicle of Higher Education: reasons why the Beall list disappeared:

o Librarians over-promoted open-access journals, which often employ author fees that Beall believes create dangerous incentives for corner-cutting and abuse
o Other publishers also threaten to sue
o The University of Colorado, fatigued by complaints from publishers, librarians, or others, pressured Beall to stop
o The academic community (universities, research funders, publishers, researchers) understood the value of the list, but did little to help Beall
o Beall didn't recognize that online shaming wouldn't stop many scientists from publishing in journals that don't ask too many questions

Beall: http://www.chronicle.com/article/Why-Beall-s-List-Died-/241171?key=yIPgvmN1fEDAy2jyscrBIysMeSZe6taOvRW4gsIuqGiHuvEMHRD0970b-QAtytQDaDJudGZRYjZBaXRBYVFGY3dYanZtclVoWlNjbC1OTVZNV0lzdjBkMkxmRQ

'predatory' journals: the "upside"?

• Derek Pyne (associate professor, Thompson Rivers University, British Columbia) tracked the publications of his business school's 38 faculty
• Took into account faculty compensation and research excellence awards
• Findings:
  o 100% of full professors had published in a predatory journal, as had more than 66% of associate professors and 10% of assistant professors
  o publication in such journals positively correlated with income and the number of research awards received
  o being published in highly ranked journals was negatively correlated with income

http://www.chronicle.com/article/Does-It-Pay-to-Be-Published-in/240202

'predatory' journals: how they thrive

University of Ottawa study in Nature of 2,000 biomedical articles from >200 predatory journals
• Many low-quality journals are from developing nations, but >50% of authors are "from high- and upper-middle-income countries as defined by the World Bank."
• Most funders are not warning researchers to stay away from predatory journals
• Some characteristics of predatory journals:
  o low article-processing fees (less than US$150), with promises of rapid publication
  o spelling and grammar errors on the journal's website, with language aimed at authors more than readers
  o overly broad journal publication scope
  o lack of information about retraction policies, handling, or digital preservation
  o distorted images in publications
  o no information to enable assessment or reproducibility of the author's findings

https://www.nature.com/news/stop-this-waste-of-people-animals-and-money-1.22554

'predatory' journals: some solutions

"Before approving a study, committees should ask researchers to declare in writing their willingness to be careful before agreeing to publish in a journal that makes unsolicited phone or email invitations, and to ensure that the journal has a true peer-review system in place."

"Funders and librarians need to work with their institutional resources, such as librarians, to educate their faculty to ensure they do not submit to any journals without reviewing evidence-based criteria for avoiding these titles."

https://www.nature.com/news/stop-this-waste-of-people-animals-and-money-1.22554

what is happening with Sci-Hub?

Sci-Hub update: the legal battles

The summary:
• Elsevier won a $15M suit in the U.S. against Sci-Hub in June. So far Sci-Hub has paid nothing.
• ACS filed a suit for $4.8M, and asked for an injunction that would require search engines and internet service providers (ISPs) to block the site. The ACS suit was heard in the U.S. courts on Friday.

Details: the ticking clock
• 21 June 2017: a NY District Court granted a request against Sci-Hub and LibGen for a default judgment of $15 million in damages. The injunction also caused Sci-Hub and LibGen to lose several domain names. Sci-Hub's founder replied that she could not pay the damages even if she wanted to, and that the "Pirate Bay for science" isn't going anywhere. Although Elsevier was able to shut down sci-hub.org, Sci-Hub quickly returned under multiple web addresses, through which it continues to thrive today.
• 29 June 2017: the American Chemical Society requested a default judgment of $4.8 million and an injunction against Sci-Hub.
• 1 September 2017: because Sci-Hub failed to respond to the lawsuit, ACS asked for a default judgment of $4.8M AND an injunction that would require search engines and internet service providers (ISPs) to block the site.
• 5 September 2017: in a move unrelated to the ACS request, the Sci-Hub founder blocked Russia from accessing the site due to "persecution" from Russia's "liberal opposition."
• 22 September 2017: a hearing for ACS's case against Sci-Hub was held at the Federal trial court in Virginia.

https://www.insidehighered.com/quicktakes/2017/09/06/american-chemical-society-moves-block-access-sci-hub
http://www.the-scientist.com/?articles.view/articleNo/50361/title/Publishers--Legal-Action-Advances-Against-Sci-Hub/
https://qz.com/1071093/an-academic-publisher-is-trying-to-kill-sci-hub-the-pirate-bay-of-science/

how much is Sci-Hub really being used? how can we really know?
• A large U.S. research university tried testing the level of Sci-Hub access by checking one week's activity on the IP address 80.82.77.83, and found only 38 sessions. The IP address owner is Novogara LTD, located in the […]
• Sci-Hub lost its URL, but simply re-registers under other names
• What does this mean?
  o There are probably no reliable static IP addresses for Sci-Hub
  o Sci-Hub is likely operating multiple servers around the world
  o All Sci-Hub IP addresses are probably ghosted
  o If one IP address is shut down, another will probably pop up elsewhere
• Sci-Hub is not just a pirate nor just a ghost; it is a ghost pirate

https://community.spiceworks.com/tools/ip-lookup/results?hostname=80.82.77.83

Sci-Hub: the bottom line

Alexandra Elbakyan, Sci-Hub's owner and founder, who lives outside U.S. jurisdiction, says she plans to ignore the U.S. lawsuits.
______
"No publishing executive for the foreseeable future will be able to act without taking the Sci-Hub factor into account. … [P]ublishers have lost control of their own publications. … For a librarian, Sci-Hub is an unacknowledged reserve army prepared to enter the battle with publishers. For publishers Sci-Hub is an enemy, clear and simple … Whatever view a publisher takes of Sci-Hub … Sci-Hub has now entered the decision-making process, where it cannot be easily dislodged." – Joseph Esposito, consultant for publishers and digital services
______
Neither the legal nor the technical means to stop Sci-Hub have worked yet. "I don't see any other obvious way to do it. So my reading is [that] Sci-Hub is here to stay." – […]

some steps to set the future above the horizon: where are we going, and how should we get there?

Create new organizational differentiators for the library:
– less dependent on content, technology, and facilities
+ more provision of unique and valued services
+ with increased cross-institution collaborations

Create a new organizational culture that fosters strategic change:
• a culture of development & delivery
• a culture of experimentation
• a culture of assessment & improvement

Create low-cost trials and pilot projects

Ask for bad ideas
o there are no sacred cows
o empower people to conceive and create bold – sometimes even outlandish – ideas

Move beyond benchmarks
o apply diagnostic internal quantitative and qualitative analyses to see if and how a benchmark applies to your organization

http://www.mckinsey.com/industries/retail/our-insights/rethinking-the-rules-of-reorganization

development & delivery: what we should value and reward in our people

• works well in a team environment • good interpersonal communication skills • good written communication skills
• shows initiative • passion & attitude • critical reasoning
• analytical skills • problem solving skills • achieves results
• good fit with culture and values • emotional intelligence • team worker
• good technical knowledge • leadership skills • integrity and trust

Source per Sam Nielsen: Graduate Outlook Survey, AAGE High Flyers Survey, Footprints Market Research, Employers

assessment

"The right [strategic] choices need information that is accurate, useful, actionable and available where and when needed" – Sam Nielsen

"Data, data everywhere, but what we need is more time to think." – adapted from a quote by Professor Max Lu

Arnold Hirshon
Associate Provost & University Librarian, Case Western Reserve University
[email protected]