Just how far reaching is your research?
21 Oct 2014 by Evoluted New Media
As the pressure grows to show the wider implications and impact of scientific research, we learn how alternative metrics are becoming a recognised indicator of research impact.

Doubtless most researchers could give a fairly good estimate of how many citations their published work has received over the years. But what about how many news articles it has featured in, how many bloggers have taken an interest, or how many comments it is receiving on social media or post-publication peer-review forums?

The traditional article metrics of citations and download counts can no longer capture the online reality that the scholarly communication process has evolved to embrace. Researchers from all disciplines face increasing demands from their colleagues, institutions and funders to demonstrate evidence of the broader societal impact of their work – and not just for the articles they have published. In the UK and Australia, the recent REF and upcoming ERA government reviews have expanded the assessment criteria in this area across all types of research output, and ongoing and future reviews on the measurement of research impact are a dominant presence in the strategic plans of many key decision makers.

Although still in the early stages of development, alternative metrics, or ‘altmetrics’, are showing promising signs of offering at least a partial solution to this challenge. A term first coined in 2010 by Jason Priem1, altmetrics today encompass an enormous amount of online activity around scholarly outputs. By tracking sources such as traditional and social media outlets, online reference managers, post-publication peer-review sites, and public policy documents, altmetrics offer a much wider insight into the dissemination and use of an article, dataset, or other non-traditional research object.
A number of businesses have already sprung up to support and further develop these new metrics; key players include Altmetric, Plum Analytics (now owned by EBSCO), and Impact Story (a non-profit organisation funded by the Sloan Foundation). The sources tracked and the methods used vary between the three, but they are in agreement over at least one thing: the aim is not to replace the impact factor. The aim, as it stands, is to give researchers an opportunity to better understand how their work is being used and applied beyond the scholarly sphere, and to make it easier to gather evidence of that when needed.

To date, much of the focus and talk around altmetrics has centred on the social media outlets that are tracked – ‘mentions’ on Twitter are often dismissed as valueless and irrelevant, and blogs are questioned for their quality. But just as the aim of altmetrics is not to replace the impact factor, neither is it to measure the quality of the paper – rather, they are intended as a measure of attention and reach. There can be no better indicator of this than the real-world application of research, and this is what makes the newer sources now tracked by some altmetrics providers so exciting. Where before it could be difficult to identify and keep track of where an academic article had been referenced in government and public policy documents, now these too have become sources which are tracked for mentions of papers, with the data automatically pulled and collated. This in turn will make it much easier for a researcher or evaluator to determine the extent of the real-world application of the work, and help to ascertain the optimal course for future research.
Patents are another valuable source for such indicators. With increasing demands on institutions and researchers to protect their intellectual property and prove the return on investment of their work, tracking mentions of papers in patents can provide crucial evidence to support this aim. Very firmly on the radar of altmetrics providers, patents are seen as an important outlet offering yet further insight into the ripple effect of research dissemination – and work to integrate them as a data source is ongoing.

The big names in research evaluation are taking an interest. In the UK, HEFCE is currently running a review into the use of metrics in research evaluation, and a recent House of Commons session examined how the use of social media data is affecting today’s society2. In the US, the National Information Standards Organisation (NISO) recently released its Altmetrics Standards Project White Paper3, for which it invited comments and feedback, and it is in close contact with the providers of altmetrics data to form a roadmap for the implementation of guidelines on a larger scale. Funders too are beginning to experiment with the tools available – the Wellcome Trust has for the last few years been actively exploring how altmetrics data can benefit and provide evidence of the broader impact of the funding it provides, and the US National Institutes of Health (NIH) is taking an increasingly active interest. The case for these metrics and the demand for their use continue to grow steadily – and not just on the part of institutions and funders.
Digital Science, a company founded by Macmillan to nurture such disruptive technologies in the academic cycle, recently released the first of its Digital Research reports – in which evidence suggested that a shift in the choice of research output types submitted for review reflects the continuing dominance of the journal impact factor as the sole measure of quality. The research showed that submissions of articles, even those that were not highly cited, have increased at a rate above what would be expected across most disciplines. Researchers are resorting to proving the value of their work on the basis of the reputation of the journal it is published in4.

That is not to say that altmetrics as a concept are entirely without their detractors, or that it would be right to dismiss their concerns out of hand. There is a valid worry that altmetrics are a waste of time entirely – just another example of ever more data that no one will have the time or the desire to analyse appropriately – and that they are in danger of giving a misleading sense of importance to an individual researcher or an individual piece of work. One could argue that the impact factor, of course, does precisely this – and it is exactly this that altmetrics were developed to offer a different insight from. Thus, in an attempt to address and alleviate these concerns, all of the altmetrics providers are working closely with stakeholders across the many and varied areas of the scholarly communication process – from journal editorial teams to research administration offices – seeking to identify their needs and clarify questions around the data and its application. By building use cases and testing scenarios, they can develop timely and reliable reporting and analysis which can be effectively incorporated into existing practice.
It is hard to believe that any researcher would not want to know where their work is being talked about, or to learn about the impact (or indeed, lack of impact) it is having beyond the scholarly sphere. And it is for this reason, if nothing else, that these new metrics are worth pursuing. A better and more complete understanding of where scholarly outputs are or are not being noticed and acted upon can inform the future practices of funders, publishers, institutions, and researchers themselves. There is still a long way to go, and many questions to be answered, in the development of altmetrics and in their wider adoption. But the initial steps have been taken, and the opportunity is there for those who want it.

More information

For more information on the initial concept that drove the development of altmetrics, visit http://altmetrics.org/manifesto/. A number of tools are now available for institutions and researchers wishing to explore and use altmetrics for their own tracking and evaluation purposes, including:

· The chance to create your own altmetrics CV at ImpactStory (https://impactstory.org/)
· A free browser bookmarklet which shows a summary of the altmetrics data for any published article (http://www.altmetric.com/bookmarklet.php)
· The recently launched Altmetric for Institutions platform, now available to trial at http://www.altmetric.com/institutions.php

References

1. http://altmetrics.org/manifesto/
2. http://www.parliament.uk/business/committees/committees-a-z/commons-select/science-and-technology-committee/news/140619-smd-ev/
3. http://www.niso.org/topics/tl/altmetrics_initiative/
4. http://www.digital-science.com/blog/posts/digital-science-launches-digital-research-reports

Author: Euan Adie, founder of Altmetric