Altmetrics are provided by platforms known as Altmetrics providers or aggregators. Here we present a categorisation of these providers to better understand how Altmetrics are generated.
Altmetrics providers collect data from different sources and offer services that go beyond optimising the visibility of individual scholars; they often also shape the channels through which Altmetrics are disseminated. Most providers expose an application programming interface (API) through which the data collected by the platform can be accessed publicly. Data collection strategies differ among the aggregators: some collect their own data, while others reuse previously collected data. We can therefore divide them into primary, secondary, and tertiary aggregators.
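To illustrate how such an API is typically consumed, the sketch below builds a request URL for a single article and totals the counts in a sample JSON response. The endpoint `api.example-aggregator.org`, the `doi` query parameter, and the response fields are all hypothetical placeholders; each real provider documents its own routes and payload shapes.

```python
import json

# Hypothetical endpoint standing in for any aggregator's public API;
# real providers (PLoS ALM, Altmetric.com, etc.) document their own routes.
BASE_URL = "https://api.example-aggregator.org/v1/articles"

def build_query_url(doi: str) -> str:
    """Build the request URL for one article, identified by its DOI."""
    return f"{BASE_URL}?doi={doi}"

# An aggregator's JSON response typically maps each metric source to a count
# (invented numbers, for illustration only).
sample_response = json.loads("""
{
  "doi": "10.1371/journal.pone.0000000",
  "views": 1520,
  "saves": 87,
  "discussions": 34,
  "citations": 12
}
""")

url = build_query_url(sample_response["doi"])
# Sum all metric counts, skipping the identifier field.
total_engagement = sum(v for k, v in sample_response.items() if k != "doi")
print(url)
print(total_engagement)  # 1520 + 87 + 34 + 12 = 1653
```

In practice the JSON would be fetched over HTTP (e.g. with `urllib.request` or `requests`) rather than parsed from a literal string; the parsing and aggregation step is the same either way.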
The main Altmetrics providers are:
Article Level Metrics (ALM) emerged from an initiative of PLoS to provide an alternative to classifying articles merely as citable or non-citable items. Launched in 2009, ALM now provides a range of impact and performance indicators, collecting data from different sources: views, saves, discussions, recommendations, and citations. The data PLoS collects are freely accessible through APIs and downloadable widgets. However, these measures and data are restricted to articles published in PLoS journals.
Altmetric.com was set up in 2011 as a commercial enterprise providing social media outreach for individual researchers and publishers. Its goal is to provide its customers with information about the attention a single article receives. Altmetric.com has established a composite indicator that combines different data sources (news, videos, policy documents, Facebook, Twitter, LinkedIn, and Pinterest) into a single indicator visualised by different colours, often termed the 'altmetric donut'. The donut aims to convey both 'the quantity and quality of attention received by each item, applying some kind of control for gaming'.
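The idea of a composite indicator can be sketched as a weighted sum of mention counts per source. The weights below are illustrative assumptions only, not Altmetric.com's actual weighting scheme:

```python
# Illustrative source weights for a composite attention score in the
# spirit of the 'altmetric donut'. These values are assumptions, NOT
# Altmetric.com's real weighting.
SOURCE_WEIGHTS = {
    "news": 8.0,
    "policy": 3.0,
    "twitter": 1.0,
    "facebook": 0.25,
}

def composite_score(mentions: dict) -> float:
    """Weighted sum of mention counts across sources.

    Unknown sources contribute nothing, a crude stand-in for the
    'control for gaming' the real indicator claims to apply.
    """
    return sum(SOURCE_WEIGHTS.get(src, 0.0) * n for src, n in mentions.items())

print(composite_score({"news": 2, "twitter": 10}))  # 8*2 + 1*10 = 26.0
```

A real scheme would also need de-duplication and per-source caps to resist gaming; the weighted sum only captures the core aggregation step.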
Impactstory, launched in 2011 under the label 'Total Impact', is a non-profit organization that emerged from a hackathon. It is now an open-source, web-based tool that aims to give individual scientists instruments to 'sell their science'. The goal of its founders, J. Priem and H. Piwowar, whose terminology and concepts also reshaped the community, is to change the reputation structures of science by widening the scope of what counts as a scientific product. Impactstory offers its users five categories of social outreach: cited, saved, discussed, viewed, and recommended. Since 2014, Impactstory has been a paid service.
Plum Analytics was launched in the same year (2011) as a for-profit start-up set up to provide new scholarly measures. Data and measures are collected at the group level of organizations such as departments, museums, and labs. Plum Analytics reuses data from PLoS ALM and is therefore regarded as a 'secondary aggregator'. Since 2014, Plum Analytics has been part of EBSCO Information Services, a large provider of scientific information on the web. Similar to Impactstory and ALM, Plum Analytics covers five categories of data: Usage, Captures, Mentions, Social Media, and Citations.
Kudos, launched in 2013, is a web-based service that helps researchers increase the visibility and impact of their publications. It aggregates the most relevant metrics in one place and maps outreach activities against those metrics. It reports the number of publication views, full-text downloads on the publisher's site, share referrals, and Kudos views (i.e. the total number of visits to a publication's page on the Kudos site). Kudos is free for individual researchers.
Webometric Analyst is free software designed to conduct automatic web analyses of various types for social science research purposes. It gathers data from different web sources and processes them in many ways. One of its sources is a commercial search engine: it can submit thousands of queries per day and save or process the results. It can create network diagrams of collections of websites, estimate the online impact of collections of websites or ideas, and retrieve information on a large scale about blogs and YouTube videos.
Snowball Metrics is an academia-industry collaboration. The universities involved follow the recommendations outlined by the sector in the 2010 report on research information management and collaborate with an industry supplier of research information, Elsevier, to ensure that the methodologies are technically feasible before they are shared with the sector.
Source: Created re-using data from the Innovations in Scholarly Communication project.
The San Francisco Declaration on Research Assessment (DORA) points out that using the Journal Impact Factor as a proxy measure for the value or quality of specific research and individual scientists leads to biased research assessment. How can we resist misusing metrics?