
In the world of academia, the “impact” of scholarly articles has traditionally been measured using citation counts: how often an article is cited in subsequent articles. However, the development of the Internet has challenged this model. The immediacy of social media allows academics to collaborate and share work far faster than the long cycle of publishing new articles, waiting for citations to accumulate, and linking them back to the original sources; and the growth of digital repositories has enabled the dissemination of articles outside the traditional medium of the academic journal, to a more diverse, potentially non-academic audience as well.

In recent years, therefore, altmetrics (alternative metrics) have come to the forefront of measuring impact. They are not intended as a replacement for counting citations to measure academic impact, but instead as a complementary technique to quantify societal impact. In practice, these new metrics cover elements such as the number of views that an article published online receives; the number of times that it is downloaded; and the number of times that it is “shared”. The latter category includes mentions in news media, social media, and blogs, and saves to reference managers such as Mendeley and CiteULike.

Several providers of altmetrics have appeared in the last few years: the one that we used in our most recent DITA class is called, appropriately enough, Altmetric (founded in 2011, it now handles approximately 2.5 million requests per day). The way in which it works combines many of the areas that we have already learnt about (several of which I have previously posted about on this blog). Altmetric maintains a relational database of both online academic publishers, to track the appearance of new articles, and news websites, to track mentions of those articles in the media. It then uses a mashup of Application Programming Interfaces (APIs) from social media platforms, such as Facebook and Twitter, and from the aforementioned reference managers, harvesting their data outputs in the JavaScript Object Notation (JSON) data-interchange format, from which it produces its own JSON output. This can then be exported to a programme like Microsoft Excel as a comma-separated values (CSV) file for further analysis and manipulation.
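As an illustration of this pipeline, here is a minimal Python sketch that looks up a single article through Altmetric’s free v1 API and flattens the JSON response into a CSV file. The endpoint is public, but the exact field names below are assumptions based on its documentation and may differ:

```python
import csv
import json
from urllib.request import urlopen

# Substitute the DOI of the article you want to look up.
DOI = "10.1038/news.2011.490"

# Altmetric's free v1 API returns a JSON record of attention data for a DOI.
with urlopen(f"https://api.altmetric.com/v1/doi/{DOI}") as response:
    record = json.loads(response.read())

# Flatten the fields of interest into a CSV row, ready for further
# analysis in a spreadsheet programme. (Field names are assumptions
# taken from the public documentation and may change.)
fields = ["doi", "title", "score", "cited_by_tweeters_count",
          "cited_by_fbwalls_count", "cited_by_feeds_count"]
with open("altmetric_export.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=fields)
    writer.writeheader()
    writer.writerow({k: record.get(k, 0) for k in fields})
```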

The purpose of Altmetric is that it gives each article it encounters a score, denoting its impact, according to how often the article is shared on news and social media platforms and saved in reference manager software. The score is not simply a cumulative total, but is instead calculated using an algorithm that gives different sources different values, according to how important a panel of experts believes each source to be in communicating academic information, and that also rewards a wide breadth of sources: for instance, an article shared several thousand times on Facebook is likely to have a lower score than another that has only a few hundred mentions, but spread across several social media platforms and traditional news media websites. As a quick visual aid to indicate the diversity of an article’s societal impact, Altmetric uses a modified version of its rainbow-coloured logo for each article: the example below shows an article with a variety of sources, and therefore colours (top), compared to one which relies solely on one source, Facebook, for its impact (bottom).

[Image: Altmetric contrast]

(The first article has a total of 245 mentions; the second has 2546, but receives a lower score due to Altmetric’s algorithm.)
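To make the weighting and diversity idea concrete, here is a toy sketch in Python. The weights and the diversity bonus are invented purely for illustration and bear no relation to Altmetric’s actual, proprietary algorithm:

```python
# Illustrative only: invented weights, not Altmetric's real values.
SOURCE_WEIGHTS = {
    "news": 8.0,       # traditional news media weighted most heavily
    "blog": 5.0,
    "twitter": 1.0,
    "facebook": 0.25,  # each mention on a single platform counts for little
}

def toy_score(mentions: dict[str, int]) -> float:
    """Weight each source, then reward breadth across source types."""
    base = sum(SOURCE_WEIGHTS.get(src, 0.5) * n for src, n in mentions.items())
    diversity_bonus = 1 + 0.5 * (len(mentions) - 1)  # more source types, higher multiplier
    return base * diversity_bonus

# An article with thousands of Facebook shares alone...
facebook_only = toy_score({"facebook": 2546})
# ...can score below one with a few hundred mentions spread across sources.
diverse = toy_score({"news": 40, "blog": 25, "twitter": 150, "facebook": 30})
print(facebook_only, diverse)  # 636.5 vs. 1506.25
```

Note how the diverse article’s 245 mentions outscore the 2546 Facebook-only shares, mirroring the example in the image above.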

So how useful are altmetrics? The success of Altmetric and its competitors certainly indicates their popularity: the scores can be used not only by academics (and LIS students!) to research their field, but also by the authors of articles to demonstrate their credentials, by universities to demonstrate those of their faculties, and by publishers to demonstrate those of their writers. Altmetric not only provides its scores in a searchable browser-based application, but also sells its API to institutions: for example, City University London’s digital repository displays an Altmetric score for each of its deposited items, in addition to buttons for online sharing and its own altmetric (the number of downloads).

However, there are several potential problems. Most obviously, altmetrics do not take into account the qualitative value of an article, merely its quantitative impact. As Altmetric itself is at pains to point out, in a large red box on its “How is the Altmetric score calculated?” help page:

The Altmetric score is useful to rank articles based on attention – it can’t tell you anything about the quality of the article itself, though reading the linked discussions might.

A good example of these limitations is the fact that the current holder of the record for the highest Altmetric score of all time is a humorous article about psychiatry in the world of Winnie-the-Pooh: not of much value for research, but likely to attract a great deal of superficial attention due to its offbeat subject matter. To further demonstrate the point, when doing my own searching I found another frivolous yet popular article, on the disappearance of teaspoons from an Australian research institute. It is therefore vital to take the data provided by altmetrics with a grain of salt: to place them in a wider context, and to use them in conjunction with traditional citation counting.

Furthermore, Altmetric is limited by the number of potential sources that it can monitor effectively in its database. It also limits itself to articles that have a digital object identifier (DOI) or similar record, which excludes those journals that opt not to use one. There is also the question of subject bias: in the searches that I have performed using the service, I have noticed a distinct bias towards the sciences, particularly medicine, at the expense of the humanities. For instance, of the current ten highest-scoring articles listed under the subject “Library and Information Studies”, five (including the top two) are concerned primarily with medicine and healthcare, with no obvious connection to LIS.

Finally, our own module leader, Ernesto Priego, has written on the subject of how altmetrics may be influenced by digital opportunity: in a world where Internet use is still dominated by first-world countries, with a clear correlation to their performance in the Digital Opportunity Index, does using altmetrics reinforce a Western-centric view of academia by paying the greatest attention to those who already possess the greatest means of making themselves heard?

It seems clear to me that, although altmetrics have proven valuable in keeping the world of academia abreast of the wider societal and informational developments stimulated by the growth of the Internet, they are not a panacea: they should be used in conjunction with older techniques, and further research into new and better methods is still needed. However, as altmetrics (a very young technology, don’t forget) continue to develop and become further integrated into the established paradigm, some of the problems I have mentioned should hopefully become less severe in any case.
