Research Trends Issue 36 March 2014 Page 01
Research Trends Issue 36: Research Assessment & Evaluation

The common theme of this issue of Research Trends is Research Assessment and Evaluation. This issue's contributions all illustrate that bibliometric and informetric indicators are useful tools in the assessment of research, and in some cases show how new metrics are being explored.

In the opening article, Gali Halevi and I provide an overview of the various quantitative approaches to research assessment that are currently in use. It is followed by a number of contributions that focus on research assessment and evaluation from different angles. Alexander van Servellen and Ikuko Oba, for example, present a study on a particular research field, stem cell research, while Gali Halevi interviews Dr. Daphne Getz on the way national research evaluation is carried out in Israel by the Samuel Neaman Institute.

In their piece, Andrew Plume and Judith Kamalski discuss how new research methods using download data can be used in national research assessment in an international context. In another article on new research methods, Mike Taylor and Andrew Plume explore the potential of "altmetrics". They show that developing a good new metric is a complex process that involves the collection of accurate data, the definition of a statistically sound metric, thorough validation, and close interaction with users.

Finally, February 28th is Rare Disease Day, an international advocacy day to help raise awareness of rare diseases. In honour of this day, Iris Kisjes used SciVal to examine publication trends in the field of rare disease management.

We hope you enjoy reading this issue. Please share your thoughts and feedback with us! You can do this in the comments section following each article on our website or by sending us an email ([email protected]). We look forward to hearing from you!

Kind regards,
Dr. Henk F. Moed
Editor-in-Chief

In this issue

Page 03
Research assessment: Review of methodologies and approaches
In this article, Dr. Henk Moed and Dr. Gali Halevi discuss the different levels at which research evaluation takes place and provide an overview of the various quantitative approaches to research assessment that are currently available.

Page 06
Stem cell research: Trends in and perspectives on the evolving international landscape
Stem cell research is an exciting yet complex and controversial science. In this piece, Alexander van Servellen and Ikuko Oba present the most important findings of their recent study on publication trends in stem cell research.

Page 11
Research evaluation in Israel: Interview with Dr. Daphne Getz
For this issue, Dr. Gali Halevi interviewed Dr. Daphne Getz to hear more about the way research evaluation is carried out in Israel, and in particular about the recent report on Israeli scientific output published by the Samuel Neaman Institute, for which she was the lead investigator.

Page 13
Article downloads: An alternative indicator of national research impact and cross-sector knowledge exchange
Dr. Andrew Plume and Dr. Judith Kamalski demonstrate how download data can be used in research assessment to offer a different perspective on national research impact, and to give a unique view of knowledge exchange between authors and readers in the academic and corporate sectors.

Page 17
Party papers or policy discussions: An examination of highly shared papers using altmetric data
Which scientific stories are most shared on social media networks such as Twitter and Facebook? And do articles attracting social media attention also get the attention of scholars and the mass media? In this article, Dr. Andrew Plume and Mike Taylor provide some answers, but also urge caution when characterizing altmetric indicators.
Page 21
Celebrating Rare Disease Day: A look into rare disease research
In honour of Rare Disease Day, an international advocacy day to help raise awareness of rare diseases, Iris Kisjes investigated publication trends in the field using SciVal. In this piece she highlights some of the key institutions that are active in research on rare diseases.

Page 26
Did you know? …That study design influences the top 100 list of papers most cited in 2013?

Section 1: Research Assessment: Review of methodologies and approaches
Dr. Henk Moed and Dr. Gali Halevi

The assessment of scientific merit and individuals has a long and respectable history, demonstrated in numerous methods and models utilizing different data sources and approaches (1, 2). The proliferation and increasing availability of primary data has created the ability to evaluate research on many levels and at many degrees of complexity, but it has also introduced some fundamental challenges for all who are involved in this process, including evaluators, administrators, researchers, and others (3).

Evaluative methods are used on several levels within the scientific world: (1) the institutional (including departmental) level, (2) the program level, and (3) the individual level. Each of these levels has its own objectives and goals; for example:

Institutional evaluation is used to establish accreditation, define missions, establish new programs and monitor the quality of an institute's research activities, among other purposes. The results of such evaluations can be seen in the ranking systems of universities, which at present are produced at both regional and international levels, based on different criteria (4). Institutional evaluations are performed based on prestige measures derived from publications, citations, patents, collaborations and the levels of expertise of the individuals within the institution.

Program-level evaluations are performed in order to measure the cost-benefit aspects of specific scientific programs. These are usually based on discovering the linkage between the investment made and the potential results of the program (5). Within this realm we find measures developed for the technology transfer capabilities and commercialization potential of a program, among others (6).

Finally, an individual evaluation is mainly performed for purposes of promotion and retention and is done at specific times in a researcher's career. Individual assessment methods rely mainly on counts of publications or citations (7). In the past few years, with the advent of social media, we have seen an increase in the use of measures based on mentions on social media sites such as blogs, Facebook, LinkedIn, Wikipedia, Twitter, and others, which are labelled as sources of "altmetrics" and include news outlets as well (8). The data used for each of these evaluation goals, whether they measure a publication's impact in social or economic terms or both, vary by the method chosen in each case.

Evaluative indicators

Based on the different methodologies and approaches, several indicators aimed at quantifying and benchmarking the results of these evaluative methods have emerged through the years. Table 1 summarizes some of the main indicators used today in research evaluation.

Big data and its effect on evaluation methods

Big data refers to a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications. The advent of supercomputers and cloud computing, which are able to process, analyze and visualize these datasets, also has an effect on evaluation methods and models. While a decade ago scientific evaluation relied mainly on citation and publication counts, many of which were compiled manually, today these data are not only available digitally but can also be triangulated with other data types (9). Table 2 depicts some examples of big datasets that can be combined in a bibliometric study to investigate different phenomena related to publications and scholarly output. Thus, for example, publication and citation counts can be triangulated with collaboration indicators, text analysis and econometric measures to produce a multi-level view of an institution, program or individual.

Yet the availability and processing capabilities of these large datasets do not necessarily mean that evaluation becomes simple or easy to communicate. As models become more complex, both administrators and evaluators find it difficult to reach consensus on which model best depicts the productivity and impact of scientific activities. These technological abilities are becoming a breeding ground for more indices, models and measures, and while each may be valid and grounded in research, they present a challenge in deciding which are best to use and in what setting.

Table 1. Main indicators used today in research evaluation (excerpt)

Type of indicator: Publications – Citations
Description: Methods involving counts of the number of publications produced by the evaluated entity (e.g. researcher, department, institution) and the citations they receive.
Main uses: Measuring the impact and intellectual influence of scientific and scholarly activities, including publication impact, author impact, and institution/department impact.
Main challenges: Name variations of institutions and individuals make it difficult to count these correctly. Limited coverage or lack of coverage of the database selected for the analysis can cause fundamental errors. Some documents, such as technical reports or professional papers ("grey literature"), are usually excluded from the analysis due
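As a concrete illustration of the citation-based indicators discussed above, one widely used individual-level measure, the h-index, can be computed directly from a list of per-paper citation counts. The sketch below uses invented numbers purely for illustration; it is not drawn from any dataset in this article:

```python
def h_index(citation_counts):
    """Return the h-index: the largest h such that the evaluated entity
    has at least h papers, each cited at least h times."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # paper at this rank still has enough citations
        else:
            break
    return h

# Example: an author with five papers and these citation counts
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

Even this simple indicator inherits the challenges noted in Table 1: if name variations or database coverage gaps cause papers or citations to be missed, the computed value changes.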
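The triangulation of publication and citation counts with other data types, described in the big data discussion above, can be sketched as a simple merge of indicators from separate sources into one profile per entity. All institution names and figures here are hypothetical, chosen only to show the combination step:

```python
# Hypothetical per-institution indicators from three separate sources.
pubs = {"Inst A": 120, "Inst B": 45}       # publication counts
cites = {"Inst A": 980, "Inst B": 610}     # total citation counts
partners = {"Inst A": 30, "Inst B": 55}    # distinct co-authoring institutions

# Combine the sources into a multi-level view of each institution.
profile = {
    inst: {
        "publications": pubs[inst],
        "citations_per_pub": round(cites[inst] / pubs[inst], 1),
        "collaboration_breadth": partners[inst],
    }
    for inst in pubs
}
```

In this toy example the smaller institution has fewer publications but higher citations per publication and broader collaboration, illustrating why triangulated views can support conclusions that a single indicator would miss.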