SWOT Analysis is a useful technique for understanding the Strengths and Weaknesses of Altmetrics and for identifying both the Opportunities and the Threats that these alternative metrics confront. Here you will discover the Weaknesses, i.e. the attributes, characteristics and factors that weaken the competitiveness of Altmetrics compared to other, established types of research impact indicators. The Weaknesses of Altmetrics are the following:
1. Data Integrity & Quality
One current issue that has been highlighted by many scholars is data integrity and quality. While in the “closed universe” of bibliometric sources the items being cited and inter-linked can be clearly demarcated and are more or less stable, the “open universe” approach of a considerable number of Altmetrics data sources is potentially fluctuant. Strictly speaking, Altmetrics may always change, depending on retrospective changes to the underlying data sources, e.g. deletion or modification. Such volatility also poses a challenge to the promotion and dissemination of these indicators among certain stakeholders, such as librarians. A second aspect of this issue is the difference between data aggregators, due to differences in data targeting, retrieval and processing strategies, or in which events are actually recorded, e.g. the inclusion or exclusion of re-sharing messages or social media posts.
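The effect of such differing recording policies can be sketched in a few lines of Python. This is an illustrative example only: the event stream and the two "aggregators" are invented, and real aggregators differ in many more ways than reshare handling.

```python
# Illustrative sketch: two hypothetical aggregators processing the same
# event stream report different Altmetrics counts, depending on whether
# re-sharing events are recorded. All names and data are invented.

events = [
    {"type": "original_post"},
    {"type": "reshare"},
    {"type": "reshare"},
    {"type": "original_post"},
]

def count_events(events, include_reshares):
    """Count social-media mentions under a given recording policy."""
    if include_reshares:
        return len(events)
    return sum(1 for e in events if e["type"] != "reshare")

aggregator_a = count_events(events, include_reshares=True)   # records reshares
aggregator_b = count_events(events, include_reshares=False)  # ignores reshares
print(aggregator_a, aggregator_b)  # same output, two different scores
```

Even before any differences in data targeting or retrieval, the recording policy alone yields two different "impact" figures for the same underlying activity.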
2. Confusion through Composite Indicators
The conflation of different forms of novel dissemination into a single numeric value may at first sight provide an easier mode of evaluation. At the same time, such composite indicators have the weakness of potentially making interpretation harder, in terms of evaluating changes in the indicator and the relative importance of the individual dimensions involved in its construction. This problem extends beyond the selection of dimensions and inputs to the weighting and re-scaling of individual data sources and dimensions. In the most extreme case, individual data sources, due to the inherent nature of the underlying dissemination channel, might overly influence the composite indicator's outcome. For Altmetrics the problem is further aggravated by the mutual entanglement of dissemination channels and data sources. This also highlights the need for further classification, providing taxonomies of orders of valuation for the different data sources.
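The dominance problem described above can be made concrete with a minimal sketch of a weighted composite score. The channel names, counts and weights below are hypothetical, chosen only to show how one high-volume source can swamp the others.

```python
# Illustrative sketch of a composite Altmetrics score: raw counts from
# several dissemination channels are weighted and summed. All channels,
# counts and weights here are hypothetical.

counts = {"tweets": 1500, "blog_posts": 3, "news_mentions": 2}
weights = {"tweets": 0.25, "blog_posts": 5.0, "news_mentions": 8.0}

def composite_score(counts, weights):
    """Weighted sum over all dissemination channels."""
    return sum(weights[src] * n for src, n in counts.items())

score = composite_score(counts, weights)
# Tweets contribute 375 of the 406 points: despite a low per-event
# weight, the high-volume channel dominates, and the single number
# hides how little the other dimensions contribute.
tweet_share = weights["tweets"] * counts["tweets"] / score
print(score, round(tweet_share, 2))
```

A reader of the final score cannot recover which channel drove it, which is precisely the interpretability weakness at issue.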
3. Conceptual and terminological confusion
Even though researchers are increasingly active in developing a deeper understanding of the underlying logics of individual data sources, and in differentiating between the types of actions or reactions being counted, there is still a lack of understanding of how acts of valuation and evaluation are connected, i.e. how value generation and value appreciation are matched in different dissemination channels. This issue goes beyond simple problems of data specification; rather, it touches on the disconnect between importance within some media and their respective counting schemes.
4. Manipulation
While the forging and collusive modification of citation counts required either self-citations or the modification of third parties' behaviour, the new forms of referencing through social media are prone to new levels of manipulation, reminiscent of the effects of link farms or search engine optimization on web page ranking. Manipulation no longer requires modifying the behaviour of others; rather, it can be achieved by specialized programs and fake user accounts. Solving the problem with simple fraud detection algorithms neglects the conceptual decisions implicit in such fraud detection. A social media account operated by a research organization or publisher that exhibits an increased amount of posting and sharing activity might be deemed a legitimate outlet or an illegitimate activity, depending on what kind of reasoning is applied to it.
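The ambiguity of simple fraud detection can be illustrated with a volume-only flagging rule. The accounts and threshold below are invented; the point is that the rule cannot distinguish a bot from a legitimate publisher.

```python
# Illustrative sketch: a naive fraud-detection rule based only on posting
# volume treats a bot farm and a legitimate publisher account alike.
# All account names and the threshold are hypothetical.

accounts = [
    {"name": "random_bot_417", "posts_per_day": 300, "verified_org": False},
    {"name": "university_press", "posts_per_day": 250, "verified_org": True},
    {"name": "individual_reader", "posts_per_day": 2, "verified_org": False},
]

def flag_high_volume(account, threshold=100):
    """Volume-only rule: flags any account above the posting threshold."""
    return account["posts_per_day"] > threshold

flagged = [a["name"] for a in accounts if flag_high_volume(a)]
# Both the bot and the publisher are flagged. Whether the publisher's
# activity counts as legitimate is a conceptual decision that has to be
# encoded into the algorithm; it is not a property of the data itself.
print(flagged)
```

Whatever extra signal is chosen to separate the two cases (verification status, content analysis, network structure) embeds exactly the kind of conceptual decision the paragraph above describes.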
5. Lack of research into Altmetrics on data, software and video content
While some data sources, such as Mendeley or Twitter, have attracted a considerable amount of activity, some domains of data sources have received very little attention so far. Among these under-researched data sources are video content, research data, code and software. It seems that most aspects relating to the quality of these channels of dissemination are of a metadata type, i.e. they are properties of the output and are not directly linked to quantifiable metrics in the sense of Altmetrics. Yet establishing such cross-validation between properties of these types of outputs could help improve the understanding of how and under which circumstances such properties of quality or openness relate to impact.