OpenUP policy recommendations No 3 (Assessment)
OpenUP synthesised and validated key project results and derived five recommendations to foster the take-up of novel practices in scholarly peer review, research dissemination and assessment. Discover the rationale behind Recommendation #3.
The established methods for evaluating research and researchers are facing criticism. Bibliometric indicators such as the h-index or the Journal Impact Factor (JIF) are the most common research assessment tools. However, they are believed to be biased towards senior researchers, or to reflect the overall impact of a journal rather than that of a specific article. In the context of the movement towards Open Science, these and other bibliometric indicators have failed to prove their suitability for measuring research outputs and their impact. They also do not consider other types of activities that scientists engage in, such as review work, engagement in social media, and citizen science.
In response, altmetrics emerged to broaden the spectrum and understanding of research impact and researcher evaluation. The growing uptake of new forms of dissemination (e.g. blogs, Twitter, openly available reports, and open and citable data) is now also driving the use of alternative metrics. These metrics hold the potential to improve the way research impact is understood and measured. There are gaps in the systems for researcher recognition, evaluation and career advancement that could be filled by expanding the currently used research and researcher assessment criteria.
However, the concept of altmetrics is still new: only a small proportion of researchers are aware of it, and even fewer use such metrics. A third of the OpenUP survey respondents were aware of altmetrics, and only 15% of all respondents used them in their work. More conceptual scrutiny is needed to establish what altmetrics can measure and how they can inform the decisions of researchers and policymakers. Dedicated training for researchers on alternative metrics is also needed, with clear guidance on which activities such metrics take into account.
Recommendation #3
Specific action: Increase awareness of and train researchers on alternative metrics
Relevant stakeholders: EU and national policy makers; institutional decision makers
OpenUP synthesised and validated key project results and derived five recommendations to foster the take-up of novel practices in scholarly peer review, research dissemination and assessment while considering existing gaps in evidence and disciplinary differences. To find out more about the OpenUP policy recommendations, follow the link.
Hirsch, J. E. (2010). An Index to Quantify an Individual's Scientific Research Output That Takes into Account the Effect of Multiple Coauthorship. Scientometrics, Vol. 85, No. 3, pp. 741–754. https://doi.org/10.1007/s11192-010-0193-9.
Garfield, E. (1972). Citation Analysis as a Tool in Journal Evaluation. Science, Vol. 178, pp. 471–479. http://dx.doi.org/10.1126/science.178.4060.471.
The San Francisco Declaration on Research Assessment (DORA). (2012). http://www.ascb.org/dora/; Hicks, D. et al. (2015). Bibliometrics: The Leiden Manifesto for Research Metrics. Nature News, Vol. 520, pp. 429–431. https://doi.org/10.1038/520429a.
Wilsdon, J. et al. (2017). Next-Generation Metrics: Responsible Metrics and Evaluation for Open Science. Report of the European Commission Expert Group on Altmetrics.
Blümel, C. (2017). D5.4 Report on Final Taxonomy Linking Channels of Dissemination and Altmetrics. OpenUP project. http://openup-h2020.eu/project-material/project-deliverables/.