
editorial

Alternative metrics

As the old ‘publish or perish’ adage is brought into question, additional research-impact indices, known as altmetrics, are offering new evaluation alternatives. But such metrics may need to adjust to the evolution of science publishing.

Today, a growing frustration among researchers is that the impact of their contribution to science is mostly assessed on the basis of out-of-date mechanisms, including publication and citation measurements. This discontent occurs as we are reaching a turning point in science publishing history, where the essence of the peer-review process has been called into question.

Indeed, the drive to find alternative metrics is a symptom of a community where research evaluation is not functioning well. A new movement called altmetrics — eloquently described through a manifesto1 published in 2010, and arguably a variation on the theme of what is referred to as webometrics or social media metrics — revisits the measurement of a scientist's worth. Rather than using peer-reviewed journal articles, alternative metrics range from other types of research output to a researcher's reputation made via their footprint on the social web.

A flurry of tracking tools applied to research information has already emerged. For example, a software solution called Total-Impact includes social media factors as well as preprints, datasets, presentation slides and other output formats, in addition to PLoS-style article-level metrics that reflect how an article is being read, discussed and cited. In parallel, a reputation-based tool called Plum Analytics promises to profile researchers in relation to their institution, their work and those who engage with their findings. Meanwhile, Altmetric.com tracks conversations around scientific articles online.

These tools claim to exploit the tantalizing possibility offered by web 2.0. Ultimately, they could help in monitoring in almost real time how new research findings are being read, cited, used and transformed in practical applications. By adjusting the research evaluation process to the technological reality of web 2.0, important research results could, in principle, be replicated and confirmed much quicker than before.

Yet, two years on since the publication of the altmetrics manifesto, little has changed in academia. Social media has not been taken up as an integral part of the academic measurement of scientific achievement. There seems to be a lot of scepticism concerning its real value. Resistance to such change is rooted in the fact that these new evaluation methods have not been sufficiently validated to be ready for adoption, as outlined by a recent report from the SURF foundation2.

[Figure 1 | The emergence of altmetrics is a welcome addition to existing scholarly evaluation. Image: ISTOCKPHOTO/PHATICPHOTOGRAPHY]

Indeed, social-media-output metrics may yet need to be further refined to identify truly significant research results. By giving as much prominence to amateurs as it does to experts, as argued by internet entrepreneur Andrew Keen3, the web exerts distorting effects on our cultural standards. And research does not escape this general distortion.

As a result, altmetrics seems to fall short of the established research evaluation standards. It is worth, however, distinguishing shorter-term and longer-term metrics. The former, embodied by blog posts, news pieces, tweets and likes, display a short life span and decay fast, reflecting the transient nature of popularity. By contrast, longer-term online metrics such as download, reader and comment numbers, albeit collected at a much slower pace, may be more meaningful. They could act as a proxy for quality while going beyond the traditional h-index or impact-factor metrics.

What is more, using popularity as an evaluation criterion may have some merits. Research that is popular online could be useful, for instance, in suggesting what subjects are topical, or in reaching out to the wider society to account for scientific progress.

But popularity in social media circles is not a reflection of quality. With choice made easy at the click of a button, the web induces a herd behaviour whereby users could be more easily swayed than offline by fashion concerning certain research topics. And such trends develop at an exponential scale compared with the way topics that are en vogue shape a journal editor's choice. Worse, there could be negative reasons why regular chattering turns into virtual clamour, particularly with controversial papers. These would typically receive only very few citations owing to their limited scientific quality.

Taking a step back, any new metrics introduced today may not have time to be validated and gain acceptance. Instead, they would constantly need to evolve in line with the rapid way in which research information is managed4. A few years from now, scientific findings may be found in specialized archives attended only by field specialists. Comments from readers arising from the archiving could be considered as adequately refined metrics susceptible to be part of an alternative assessment system. They would present the advantage of emanating from a select sub-group of peers with a strong interest in the topic; a filter that existing social media site filters cannot offer.

Although spontaneous reviews from readers and novel altmetrics are welcome complementary evaluation tools, they will not replace a thorough scientific quality assessment of papers and scientists through a selected-expert panel any time soon. ❐

References
1. Priem, J. Altmetrics Manifesto; http://altmetrics.org/manifesto (2010).
2. Users, Narcissism and Control — Tracking the Impact of Scholarly Publications in the 21st Century; http://go.nature.com/AB9vxo (SURF foundation, February 2012).
3. Keen, A. The Cult of the Amateur: How Blogs, MySpace, YouTube, and the Rest of Today's User-Generated Media are Destroying our Economy, our Culture, and our Values (Crown Business, 2008).
4. Helbing, D. & Balietti, S. Eur. Phys. J. Special Topics 195, 101–136 (2011).

NATURE MATERIALS | VOL 11 | NOVEMBER 2012 | www.nature.com/naturematerials 907
