Meaningful Metrics: A 21st-Century Librarian's Guide to Bibliometrics, Altmetrics, and Research Impact

Total Pages: 16

File Type: PDF, Size: 1,020 KB

Meaningful Metrics: A 21st-Century Librarian's Guide to Bibliometrics, Altmetrics, and Research Impact

Robin Chin Roemer and Rachel Borchardt
Association of College and Research Libraries, a division of the American Library Association
Chicago, Illinois, 2015

The paper used in this publication meets the minimum requirements of American National Standard for Information Sciences–Permanence of Paper for Printed Library Materials, ANSI Z39.48-1992.

Library of Congress Cataloging-in-Publication Data
Meaningful metrics: a 21st-century librarian's guide to bibliometrics, altmetrics, and research impact / edited by Robin Chin Roemer and Rachel Borchardt.
Includes bibliographical references and index.
ISBN 978-0-8389-8755-1 (pbk.: alk. paper) -- ISBN 978-0-8389-8757-5 (EPUB) -- ISBN 978-0-8389-8756-8 (PDF) -- ISBN 978-0-8389-8758-2 (Kindle)
1. Bibliometrics. 2. Bibliographical citations--Evaluation. 3. Scholarly publishing--Evaluation. 4. Research--Evaluation--Statistical methods. 5. Communication in learning and scholarship--Technological innovations. I. Roemer, Robin Chin, editor. II. Borchardt, Rachel, editor.
Z669.8.M43 2015
010.72'7--dc23
2015006338

Copyright © 2015 by the Association of College & Research Libraries, a division of the American Library Association. All rights reserved except those which may be granted by Sections 107 and 108 of the Copyright Revision Act of 1976. Printed in the United States of America.

This work is licensed under the Creative Commons license CC BY-NC 4.0 (Attribution-NonCommercial).

Table of Contents

Foreword

Part 1. Impact
  Chapter 1: Understanding Impact
  Chapter 2: Impact in Practice

Part 2. Bibliometrics
  Chapter 3: Understanding Bibliometrics
  Chapter 4: Bibliometrics in Practice

Part 3. Altmetrics
  Chapter 5: Understanding Altmetrics
  Chapter 6: Altmetrics in Practice

Part 4. Special Topics
  Chapter 7: Disciplinary Impact
  Chapter 8: Impact and the Role of Librarians

Glossary
Acknowledgments
About the Authors

Foreword

A few days ago, we were speaking with an ecologist from Simon Fraser University here in Vancouver about an unsolicited job offer he'd recently received. The offer included an astonishing inducement: anyone from his to-be-created lab who could wangle a first or corresponding authorship of a Nature paper would receive a bonus of $100,000.

Are we seriously this obsessed with a single journal? Who does this benefit? (Not to mention, one imagines the unfortunate middle authors of such a paper, trudging to a rainy bus stop as their endian-authoring colleagues roar by in jewel-encrusted Ferraris.) Although it's an extreme case, it's sadly not an isolated one. Across the world, a certain kind of administrator is doubling down on 20th-century, journal-centric metrics like the impact factor.

That's particularly bad timing, because our research communication system is just beginning a transition to 21st-century communication tools and norms. We're increasingly moving beyond the homogeneous, journal-based system that defined 20th-century scholarship. Today's scholars increasingly disseminate web-native scholarship.
For instance, Jason's 2010 tweet coining the term "altmetrics" is now more cited than some of his peer-reviewed papers. Heather's openly published datasets have gone on to fuel new articles written by other researchers. And like a growing number of other researchers, we've published research code, slides, videos, blog posts, and figures that have been viewed, reused, and built upon by thousands all over the world. Where we do publish traditional journal papers, we increasingly care about broader impacts, like citation in Wikipedia, bookmarking in reference managers, press coverage, blog mentions, and more. You know what's not capturing any of this? The impact factor.

Many researchers and tenure committees are hungry for alternatives, for broader, more diverse, more nuanced metrics. Altmetrics are in high demand; we see examples at Impactstory (our altmetrics-focused nonprofit) all the time. Many faculty share how they are including downloads, views, and other alternative metrics in their tenure and promotion dossiers, and how evaluators have enthused over these numbers. There's tremendous drive from researchers to support us as a nonprofit, from faculty offering to pay hundreds of extra dollars for profiles to a Senegalese postdoc refusing to accept a fee waiver. Other altmetrics startups like Plum Analytics and Altmetric can tell you similar stories.

At higher levels, forward-thinking policy makers and funders are also seeing the value of 21st-century impact metrics and are keen to realize their full potential. We've been asked to present on 21st-century metrics at the NIH, the NSF, the White House, and more. It's not these folks who are driving the impact factor obsession; on the contrary, we find that many high-level policy makers are deeply disappointed with 20th-century metrics as we've come to use them. They know there's a better way.
But many working scholars and university administrators are wary of the growing momentum behind next-generation metrics. Researchers and administrators off the cutting edge are ill-informed, uncertain, afraid. They worry that new metrics represent Taylorism, a loss of rigor, a loss of meaning. This is particularly true among the majority of faculty who are less comfortable with online and web-native environments and products. But even researchers who are excited about the emerging future of altmetrics and web-native scholarship have a lot of questions. It's a new world out there, and one that most researchers are not well trained to negotiate.

We believe librarians are uniquely qualified to help. Academic librarians know the lay of the land, they keep up-to-date with research, and they are experienced in providing leadership to scholars and decision-makers on campus. That's why we're excited that Robin and Rachel have put this book together. To be most effective, librarians need to be familiar with the metrics research, which is currently advancing at breakneck speed. And they need to be familiar with the state of practice: not just now, but what's coming down the pike over the next few years. This book, with its focus on integrating research with practical tips, gives librarians the tools they need.

It's an intoxicating time to be involved in scholarly communication. We've begun to see the profound effect of the web here, but we're just at the beginning. Scholarship is on the brink of a Cambrian explosion, a breakneck flourishing of new scholarly products, norms, and audiences. In this new world, research metrics can be adaptive, subtle, multidimensional, responsible. We can leave the fatuous, ignorant use of impact factors and other misapplied metrics behind us. Forward-thinking librarians have an opportunity to help shape these changes, to take their place at the vanguard of the web-native scholarship revolution.
We can make a better scholarship system, together. We think that's even better than that free Ferrari.

Jason Priem and Heather Piwowar
Cofounders of Impactstory

Section 1: Impact

Chapter One: Understanding Impact

Once upon a time, there was no such thing as bibliometrics. Like its conceptual predecessor, statistical bibliography, bibliometrics is a concept predicated on the widespread existence of printed material and the acceptance of a specific printed format (the journal article) as a fundamental means of communication between scholars and experts within a field. Within library and information science (LIS), we have seen many excellent books and articles published over the last 20 years, each telling its own version of the history of bibliometrics and predicting what lies in store for scholars and practitioners of bibliometrics with new advancements in technology, research methods, and general higher ed.

This is not one of those books. This is a book that tells stories: some of which are about bibliometrics, others of which are about altmetrics, but all of which are about impact and human beings' never-ending quest to measure, track, and compare their value.

At this point, let's take a moment to reflect on what we mean when we say the word impact, particularly in the context of an academic book. Impact is a word that we in the LIS field hear and use every day, yet it can be a surprisingly tricky word to define, at least without lots of additional context. For example, researchers in public health would certainly be disappointed by an English department's assessment of what it means for faculty to produce "impactful" scholarship. This is because impact is a word with a variety of subtle definitions, each changing over time and with different audiences, geographies, and local institutional philosophies. The result
Recommended publications
  • A Scientometric Analysis
    Tropical Ecology 59(3): 431–443, 2018. ISSN 0564-3295. © International Society for Tropical Ecology. www.tropecol.com

    Global research trends in 'Ecology': A scientometric analysis

    Anwesha Borthakur (Centre for Studies in Science Policy, Jawaharlal Nehru University (JNU), New Delhi-110067, India) and Pardeep Singh (Department of Environmental Studies, PGDAV College, University of Delhi, New Delhi-110065, India)

    Abstract: Ecological research has observed near-constant growth during the past few decades. Considering the significance and diversity of ecological research with regard to both its vastness and specificity, this paper is an attempt to map out the research activities in 'ecology' through a scientometric analysis of the world research outputs. The aim was to identify and document the historical and current research trajectories in the subject. We recognize that, in the absence of an in-depth analysis of the trends in research, it is entirely possible that some areas of research get more than adequate attention from the global research and policy community while some equally important areas of the subject remain completely unaddressed or neglected. The study was carried out using two major databases, Scopus and SCImago, for the 21-year period from 1996 to 2016 by means of a defined scientific method. Among the several interesting observations, we found that apart from China, no countries or research institutes from the global south are listed among the top 10 research-producing countries/institutes in ecology. Considering that the majority of the ecologically sensitive areas and biodiversity hotspots of the world are located in the developing world and have significant influences on ecological processes across the globe, this calls for immediate attention from the research community.
  • “Altmetrics” Using Google Scholar, Twitter, Mendeley, Facebook
    Pre-Print Version

    Altmetrics of "altmetrics" using Google Scholar, Twitter, Mendeley, Facebook, Google Plus, CiteULike, blogs, and wikis

    Saeed-Ul Hassan and Uzair Ahmed Gillani ([email protected]), Information Technology University, 346-B Ferozepur Road, Lahore (Pakistan)

    Abstract: We measure the impact of the "altmetrics" field by deploying altmetrics indicators using data from Google Scholar, Twitter, Mendeley, Facebook, Google Plus, CiteULike, blogs, and wikis during 2010–2014. To capture the social impact of scientific publications, we propose an index called the alt-index, analogous to the h-index. Across the deployed indices, our results show high correlation among the indicators that capture social impact. While we observe a medium Pearson's correlation (ρ = .247) between the alt-index and the h-index, a relatively high correlation is observed between social citations and scholarly citations (ρ = .646). Interestingly, we find a high turnover of social citations in the field compared with traditional scholarly citations, i.e., social citations are 42.2% more numerous than traditional citations. Social media such as Twitter and Mendeley appear to be the most effective channels of social impact, followed by Facebook and Google Plus. Overall, altmetrics appear to be working well in the field of "altmetrics".

    Keywords: Altmetrics, Social Media, Usage Indicators, Alt-index

    Introduction: In the scholarly world, altmetrics are gaining popularity as a supplement and/or alternative to traditional citation-based evaluation metrics such as the impact factor, h-index, etc. (Priem et al., 2010). The concept of altmetrics was initially proposed in 2010 as a generalization of article-level metrics and has its roots in the #altmetrics hashtag (McIntyre et al., 2011).
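The alt-index proposed in the abstract above applies the h-index rule to social-media mention counts instead of scholarly citations. A minimal sketch of that rule, under our reading of the definition (the per-paper counts below are hypothetical, and this is not the authors' code):

```python
from typing import Iterable

def h_index(citations: Iterable[int]) -> int:
    """Largest h such that at least h items have >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank  # the top `rank` papers each have at least `rank` citations
        else:
            break
    return h

# Hypothetical per-paper counts: scholarly citations vs. social mentions.
scholarly = [10, 8, 5, 4, 3]
social = [25, 7, 6, 2, 1]

print(h_index(scholarly))  # 4: four papers have at least 4 citations each
print(h_index(social))     # 3: the "alt-index" analogue over social mentions
```

The same function serves for both indices; only the input series changes, which is exactly the sense in which the alt-index is "analogous to the h-index."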
  • How to Search for Academic Journal Articles Online
    How to Search for Academic Journal Articles Online

    While you CAN get to the online resources from the library page, I find that getting onto MyUT and clicking the LIBRARY TAB is much easier and much more familiar, so I will start from there. Within the Library tab, there is a box called "Electronic Resources," and within that box is a hyperlink that will take you to "Research Databases by Name." Click that link as shown below. The page it takes you to looks like the picture below. Click "Listed by Name." This will take you to a list starting with A, and the top selection is the one you want; it is called "Academic Search Complete." Click it as pictured below.

    THIS SECTION IS ONLY IF YOU ARE ON AN OFF-CAMPUS COMPUTER: You will be required to log in if you are off campus. On the first page, use the pull-down menu to find "University of Toledo." The Branch should default to "Main Campus," which is what you want. Then click "Submit." Next it will ask for your first and last name and your Rocket ID. If you want to use your Social Security number, that is also acceptable (but a little scary). If you use your Rocket ID, be sure to include the R at the beginning of the number. Then click Submit again and you are IN.

    The opening page has the search box right in the middle. When searching, start narrow and then get broader if you do not find enough results. For example, when researching Ceremony by Leslie Silko, you may want your first search to be "Silko, Ceremony." If you don't find enough articles, you may then want to just search "Silko." Finally, you may have to search for "Native American Literature." And so on and so forth.
  • Citation Analysis for the Modern Instructor: an Integrated Review of Emerging Research
    CITATION ANALYSIS FOR THE MODERN INSTRUCTOR: AN INTEGRATED REVIEW OF EMERGING RESEARCH

    Chris Piotrowski, University of West Florida, USA

    Abstract: While online instructors may be versed in conducting e-Research (Hung, 2012; Thelwall, 2009), today's faculty are probably less familiar with the rapidly advancing fields of bibliometrics and informetrics. One key feature of research in these areas is citation analysis, a rather intricate operational feature available in modern indexes such as Web of Science, Scopus, Google Scholar, and PsycINFO. This paper reviews the recent extant research on bibliometrics within the context of citation analysis. Particular focus is on empirical studies, review essays, and critical commentaries on citation-based metrics across interdisciplinary academic areas. Research that relates to the interface between citation analysis and applications in higher education is discussed. Some of the attributes and limitations of the citation operations of contemporary databases that offer citation searching or cited-reference data are presented. This review concludes that: (a) citation-based results can vary largely, contingent on academic discipline or specialty area; (b) databases that offer citation options rely on idiosyncratic methods, coverage, and transparency of functions; (c) despite initial concerns, research from open-access journals is being cited in traditional periodicals; and (d) the field of bibliometrics is rather perplexing with regard to functionality, and research is advancing at an exponential pace. Based on these findings, online instructors would be well served to stay abreast of developments in the field.

    Keywords: Bibliometrics, informetrics, citation analysis, information technology, open-resource and electronic journals

    INTRODUCTION: In an ever-increasing manner, the educational field is irreparably linked to advances in information technology (Plomp, 2013).
  • Everyday Life Information Seeking
    Everyday Life Information Seeking

    Reijo Savolainen, Department of Information Studies and Interactive Media, University of Tampere, Tampere, Finland

    Abstract: Information seeking may be analyzed in two major contexts: job-related and nonwork. The present entry concentrates on nonwork information seeking, more properly called everyday life information seeking (ELIS). Typically, ELIS studies discuss the ways in which people access and use various information sources to meet information needs in areas such as health, consumption, and leisure. The entry specifies the concept of ELIS and characterizes the major ELIS models. They include the Sense-Making approach (Dervin), the Small World theory (Chatman), the ecological model of ELIS (Williamson), ELIS in the context of way of life (Savolainen), the model of information practices (McKenzie), and the concept of information grounds (Fisher). ELIS practices tend to draw on the habitualized use of a limited number of sources which have been found useful in previous use contexts. Since the late 1990s, the Internet has increasingly affected ELIS practices by providing easily accessible sources. Even though the popularity of networked sources has grown rapidly, they will complement, rather than replace, more traditional sources and channels.

    INTRODUCTION: Information seeking is a major constituent of information behavior or information practices, that is, the entirety of ways in which people seek, use, and share information in different contexts.[1,2] Information seeking may be analyzed in two major contexts: job-related and nonwork.

    THE CONCEPT OF ELIS: Thus far, a rich variety of themes have been explored in ELIS studies. They have focused on people belonging to diverse groups such as the following:
  • Supplementary Text and Figures
    SUPPLEMENTARY TEXT

    John P.A. Ioannidis (1), Richard Klavans (2), Kevin W. Boyack (2)
    1. Departments of Medicine, of Health Research and Policy, of Biomedical Data Science, and of Statistics, and Meta-Research Innovation Center at Stanford (METRICS), Stanford University
    2. SciTech Strategies, Inc.

    Contents: Technical note on methods; References; Figures: trends over time and effects of co-authorship; Survey to hyperprolific authors for calendar year 2016: text of e-mail; Survey to hyperprolific authors for calendar year 2016: open responses; Request to contribute comments sent to all listed hyperprolific authors; Comments contributed by hyperprolific authors; Acknowledgments

    Technical note on methods: An author publishing the equivalent of one full paper every 5 days will end up publishing 73 papers in a calendar year. We selected this number as a threshold to define and study outlier, hyperprolific authors. Of course, published papers may reflect the final presentation of work that has happened over many years, but focusing on calendar years allows us to study peaks in productivity and to use a clearly definable time unit. We identified all author records in Scopus that included 73 or more published full papers in any single calendar year between 2000 and 2016. Full papers included in our analysis are those classified as Articles, Conference Papers, and Reviews in Scopus. All other Scopus categories of published items (Editorials, Letters, Notes, Short Surveys, Errata, and so forth) were excluded. Papers take variable amounts of effort to produce. For items such as editorials, notes, and letters to the editor, large numbers of publications are in theory possible for authors who have a talent, proclivity, or obsession for writing; such works may occasionally be very important and influential, but they take, on average, substantially less time to produce than Articles, Conference Papers, and Reviews.
  • A Comprehensive Framework to Reinforce Evidence Synthesis Features in Cloud-Based Systematic Review Tools
    Applied Sciences, Article: A Comprehensive Framework to Reinforce Evidence Synthesis Features in Cloud-Based Systematic Review Tools

    Tatiana Person (1,*), Iván Ruiz-Rube (1), José Miguel Mota (1), Manuel Jesús Cobo (1), Alexey Tselykh (2), and Juan Manuel Dodero (1)
    1. Department of Informatics Engineering, University of Cadiz, 11519 Puerto Real, Spain
    2. Department of Information and Analytical Security Systems, Institute of Computer Technologies and Information Security, Southern Federal University, 347922 Taganrog, Russia
    * Correspondence: [email protected]

    Abstract: Systematic reviews are powerful methods used to determine the state of the art in a given field from existing studies and literature. They are critical but time-consuming in research and decision making for various disciplines. When conducting a review, a large volume of data is usually generated from relevant studies. Computer-based tools are often used to manage such data and to support the systematic review process. This paper describes a comprehensive analysis to gather the required features of a systematic review tool, in order to support the complete evidence synthesis process. We propose a framework, elaborated by consulting experts in different knowledge areas, to evaluate significant features and thus reinforce existing tool capabilities. The framework will be used to enhance the currently available functionality of CloudSERA, a cloud-based systematic review tool focused on Computer Science, to implement evidence-based systematic review processes.
  • Sci-Hub Provides Access to Nearly All Scholarly Literature
    Sci-Hub provides access to nearly all scholarly literature

    A DOI-citable version of this manuscript is available at https://doi.org/10.7287/peerj.preprints.3100. This manuscript was automatically generated from greenelab/scihub-manuscript@51678a7 on October 12, 2017. Submit feedback on the manuscript at git.io/v7feh or on the analyses at git.io/v7fvJ.

    Authors: Daniel S. Himmelstein (Department of Systems Pharmacology and Translational Therapeutics, University of Pennsylvania; funded by GBMF4552); Ariel Rodriguez Romero (Bidwise, Inc.); Stephen Reid McLaughlin (School of Information, University of Texas at Austin); Bastian Greshake Tzovaras (Department of Applied Bioinformatics, Institute of Cell Biology and Neuroscience, Goethe University Frankfurt); Casey S. Greene (Department of Systems Pharmacology and Translational Therapeutics, University of Pennsylvania; funded by GBMF4552)

    PeerJ Preprints | https://doi.org/10.7287/peerj.preprints.3100v2 | CC BY 4.0 Open Access | rec: 12 Oct 2017, publ: 12 Oct 2017

    Abstract: The website Sci-Hub provides access to scholarly literature via full-text PDF downloads. The site enables users to access articles that would otherwise be paywalled. Since its creation in 2011, Sci-Hub has grown rapidly in popularity. However, until now, the extent of Sci-Hub's coverage was unclear. As of March 2017, we find that Sci-Hub's database contains 68.9% of all 81.6 million scholarly articles, which rises to 85.2% for those published in toll-access journals.
  • Market Power in the Academic Publishing Industry
    Market Power in the Academic Publishing Industry

    What is an academic journal?
    • A serial publication containing recent academic papers in a certain field.
    • The main method for communicating the results of recent research in the academic community.

    Why is market power important to think about?
    • Commercial academic journal publishers use market power to artificially inflate subscription prices.
    • This practice drains the resources of libraries, to the detriment of the public.

    How does academic publishing work?
    • The author writes a paper and submits it to a journal.
    • The paper is evaluated by peer reviewers (other researchers in the field).
    • If accepted, the paper is published.
    • Libraries pay for subscriptions to the journal.

    The market does not serve the interests of the public: universities are forced to "double-pay."
    1. The university funds research.
    2. The results of the research are given away for free to journal publishers.
    3. The university library must pay to get the research back in the form of journals.

    Subscription prices are outrageous.
    • The highest-priced journals are those in the fields of science, technology, and medicine (the STM fields).
    • Since 1985, the average price of a journal has risen more than 215 percent, four times the average rate of inflation.
    • This rise in prices, combined with the California budget crisis, has caused UC Berkeley's library to cancel many subscriptions, threatening the library's reputation.

    Why are prices so high? Commercial publishers use market power to charge inflated prices. Why do commercial publishers have market power?
    • They control the most prestigious, high-quality journals in many fields.
    • Demand is highly inelastic for high-quality journals.
  • Inst Xxx: Information User Needs & Assessment
    INST 408A-0101 Special Topics in Information Science: Consumer Health Informatics
    College of Information Studies, University of Maryland
    Mondays, 2:00–4:45 PM (Hornbake Library, North Wing, Room 0302H), Fall 2019

    Co-Instructors:
    • Beth St. Jean, Associate Professor, Hornbake Building, Room 4117K, 301-405-6573, [email protected]. Office hours: Mondays, 5:00 to 6:00 PM, or by appointment.
    • Fiona Jardine, Doctoral Candidate, Hornbake Building, Room 4105, 301-602-3936, [email protected]. Office hours: Fridays, 12:00 to 1:00 PM, or by appointment.

    Our Liaison Librarian: Rachel Gammons, Head of Teaching and Learning Services, 4100C McKeldin Library, [email protected], 301-405-9120. [Research Guide: https://lib.guides.umd.edu/information_studies]

    Catalog Description [Prerequisite: INST 201 (Introduction to Information Science)]: In this course, we will investigate the fields of Consumer Health Informatics and Information Behavior, focusing most heavily on their intersection: Consumer Health Information Behavior. We will explore people's health-related information needs and whether, how, and why people seek out and use (or do not seek out and use) health information, and the types of health information they find useful. We will also cover the important and interrelated topics of information avoidance, health behaviors, health literacy, digital health literacy, doctor-patient communication, and patient-to-patient communication through support groups and online communities. Throughout the course, we will also focus on the important concept of health justice: an ideal state in which everyone has an adequate and equitable capability to be healthy. We will identify populations that frequently experience social injustice and explore the information-related causes and broader consequences of the health inequities members of these populations tend to face.
  • A Mixed Methods Bibliometric Study
    The Qualitative Report, Volume 24, Number 12, Article 2, 12-2-2019

    Collaboration Patterns as a Function of Research Experience Among Mixed Researchers: A Mixed Methods Bibliometric Study

    Melanie S. Wachsmann, Anthony J. Onwuegbuzie, Susan Hoisington, Vanessa Gonzales, and Rachael Wilcox (Sam Houston State University); see next page for additional authors.

    Recommended APA Citation: Wachsmann, M. S., Onwuegbuzie, A. J., Hoisington, S., Gonzales, V., Wilcox, R., Valle, R., & Aleisa, M. (2019). Collaboration patterns as a function of research experience among mixed researchers: A mixed methods bibliometric study. The Qualitative Report, 24(12), 2954-2979. https://doi.org/10.46743/2160-3715/2019.3852

    Abstract: Onwuegbuzie et al. (2018) documented that the degree of collaboration is higher for mixed researchers than for qualitative and quantitative researchers. The present investigation examined (a) the link between the research experience of lead authors and their propensity to collaborate (Quantitative Phase) and (b) the role of research experience in collaborative mixed research studies (Qualitative Phase).
  • Mapping the Future of Scholarly Publishing
    THE OPEN SCIENCE INITIATIVE WORKING GROUP: Mapping the Future of Scholarly Publishing

    The Open Science Initiative (OSI) is a working group convened by the National Science Communication Institute (nSCI) in October 2014 to discuss the issues regarding improving open access for the betterment of science and to recommend possible solutions. The following document summarizes the wide range of issues, perspectives, and recommendations from this group's online conversation during November and December 2014 and January 2015.

    The 112 participants who signed up to participate in this conversation were drawn mostly from the academic, research, and library communities. Most of these 112 were not active in the conversation, but a healthy diversity of key perspectives was still represented. Individual participants may not agree with all of the viewpoints described herein, but participants agree that this document reflects the spirit and content of the conversation. The main body of this document was written by Glenn Hampson and edited by Joyce Ogburn and Laura Ada Emmett. Additional editorial input was provided by many members of the OSI working group. Kathleen Shearer is the author of Annex 5, with editing by Dominque Bambini and Richard Poynder.

    CC-BY 2015 National Science Communication Institute (nSCI), www.nationalscience.org, [email protected]. nSCI is a US-based 501(c)(3) nonprofit organization. First edition, January 2015; final version, April 2015.

    Recommended citation: Open Science Initiative Working Group, Mapping the Future of Scholarly