The focus of this year’s COPE European Seminar was metrics, particularly altmetrics, and how they can be used and misused, with a line-up of engaging speakers and panels throughout the day. Euan Adie of Altmetric kicked off the day by noting that metrics are really about giving credit where credit is due, a sentiment echoed by the other speakers. He acknowledged that altmetrics mean different things to different groups, but that primarily they are a way of measuring the impact of a paper beyond citations, broadening the view of impact to help assess it better.
Lisa Colledge of Elsevier encouraged transparency about the data behind metrics, so that readers can judge the impact of an article for themselves rather than having the methodology hidden from them. She suggested that research evaluation should use at least two different metrics, to avoid relying too heavily on any single measure that might not accurately represent the impact being assessed. Heavy reliance on one metric also makes it a more attractive target for gaming to falsely inflate the apparent impact of a particular paper or journal.
Both Lisa and Euan discussed the gaming of metrics, including paid promotion and citation gaming. They agreed that opening up the data behind these metrics would increase transparency and assist investigations into allegations of gaming, and that auditing of sources may help distinguish popularity from quality in content.
Sarah de Rijcke of Leiden University then raised the question of how metrics influence the way researchers plan their work. She noted that researchers may bear impact metrics in mind when selecting a research question or structuring an experiment. This focus on performance measures may actually be creating a monoculture in publishing by reducing the diversity of topics researched.
A panel discussion with Jonathan Montgomery (Chair, Health Research Authority and Nuffield Council on Bioethics), Mike Thelwall (University of Wolverhampton), and Virginia Barbour (COPE Chair) then wrapped up the day’s conversation on metrics. All agreed that the quality of articles and journals, not the Impact Factor, must be paramount, and that metrics should not replace genuine discussion of article quality.
This year’s COPE seminar offered many opportunities for interesting discussion about metrics, how they are currently used, and their potential for misuse. The key takeaway points of the day were:
- Altmetrics are like any other metric – they can be misused.
- Focusing on a single metric to evaluate research gives a narrow view of its impact and can drive efforts towards gaming that metric – multiple metrics should be used instead.
- Greater visibility of the data underlying altmetrics, together with auditing of data sources, could help prevent altmetric gaming.
The next COPE Seminar will be held in the U.S. in August in collaboration with ISMTE; more details can be found on the COPE website. More information on the misuse of citation metrics can be found in James Hardcastle’s article “Citations, self-citations and citation stacking”.