Editor’s Note: This is a guest post from Iara Vidal, a Ph.D. student based in Brazil whose work focuses on altmetrics and scholarly communication. If you’re curious about altmetrics, or how they may be relevant to science communication, read on.
Being overwhelmed by information is not a new phenomenon, but it is a very real problem. We struggle to keep up to date with all the discoveries, papers, and books in our fields of interest. New fields of study, methods, and tools seem to appear every month. Buzzwords are all around, and it can be hard to know if there’s anything useful behind the buzz.
One of these buzzwords is altmetrics. It first appeared in 2010, the year Jason Priem, Dario Taraborelli, Paul Groth and Cameron Neylon wrote Altmetrics: a manifesto. Building on previous research, the manifesto proposed that social media indicators can be used to track the impact of scholarly works in ways that are difficult (or impossible) to do with citations and other, more “traditional,” metrics. Articles that used to live in folders and drawers are now organized with online reference management software like Zotero and Mendeley. Debate about articles no longer happens only in hallways and bars, but on social networks and blogs like this one. These different indicators, and the field that studies them, are what we call altmetrics. [Editor’s note: Altmetric is a company; altmetrics is a term used to refer to the broader field of alternative metrics.]
Altmetrics are most often discussed in the context of research evaluation. That’s actually the theme of my Ph.D., and I could go on and on about metrics, quality, impact, and evaluation.
Don’t worry, I won’t.
If you’re interested in these themes, I suggest looking up DORA and the Leiden Manifesto, and the discussions around research assessment. For this post, I’ll focus on how altmetrics can be useful for people engaging in science communication.
If you’re reading this blog, you’re probably already familiar with the use of social media for science communication and outreach. You may be tracking your blog readers, managing your Twitter audience, or crafting the perfect Facebook page. Altmetrics can help you do this for scholarly output such as articles, databases, code, etc. There is a catch, though: most tools, especially the free ones, require that the item you’re interested in tracking have a persistent identifier, commonly a DOI. On the bright side, if you do have an identifier at hand, these tools usually offer more than just a number; they also provide links to the actual mentions, so you can see for yourself what’s being said about the work, who’s saying it, and where. Let’s look at some of the tools and their uses for science communication.
If you’re a researcher doing outreach for your own research, you could use altmetrics to find audiences that are already engaging with your work, or that might be interested in it. One tool that can help you with that is ImpactStory, which is designed for researchers, free to use, and open source. You just enter your ORCID iD (if you don’t have one yet, I strongly recommend getting one!) and it gathers your personal data and publications from your profile. Aside from counting how many times each of your works was saved or shared on different platforms, with links to the actual mentions, ImpactStory also offers badges for different achievements related to your research. You could even include altmetric data in your CV to show how your outreach efforts are actually having an impact.
Another tool for researchers is Kudos. Here, the idea is not only to provide metrics (social media mentions, downloads, views and citations) related to your articles/books, but also to help you explain your research and increase its reach. You can create plain-language abstracts and connect all kinds of different materials related to the same project in one place. It also connects with ORCID, which makes it easier to claim your publications.
Of course, researchers are not the only ones doing science communication. Maybe you’re a blogger looking for interesting new research, or a research communications officer trying to understand and amplify the impact of the science done at your university. Altmetrics can help you too.
If you ever read scientific journal articles online, you might have noticed a colorful circle, sometimes with a number inside, sitting somewhere on the page like this (scroll down to see it). This circle is the Altmetric donut, probably the best-known product of the Altmetric company, which is in turn probably the best-known altmetrics provider (so much so that some people even mistake the company for the field). Each color in the donut represents a different source (like “Twitter”, “news”, or “policy documents”), so you can get a sense of the kinds of attention a research object is receiving just by looking at it. When you click the donut, you can see a details page with all the mentions Altmetric has collected from Twitter, news sites, YouTube video descriptions, public Facebook pages, policy documents, Mendeley, and so on.
What about the number that sometimes appears inside the Altmetric donut? That’s the Altmetric Attention Score, which tries to sum up all the attention an item has received in a single number. Different sources are given different weights (a tweet is worth less than a news story, for instance; here’s the information about how the score is calculated). Altmetric also offers some context to help you understand how the item in question compares to ones from the same journal and/or of a similar age.
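If you’d rather pull these numbers programmatically than click donuts one by one, Altmetric offers a free public API keyed by DOI. Here’s a minimal Python sketch; the endpoint path and the response field names (`score`, `cited_by_tweeters_count`, `cited_by_msm_count`) are assumptions based on my reading of the API’s JSON output, so check the current documentation before relying on them.

```python
import json
from urllib.request import urlopen

# Assumed base URL of Altmetric's free public API (check current docs).
API = "https://api.altmetric.com/v1/doi/"

def attention_summary(record):
    """Pull the headline numbers out of an Altmetric API record (a dict).

    Field names are assumptions; missing counts default to 0.
    """
    return {
        "title": record.get("title"),
        "score": record.get("score"),
        "tweets": record.get("cited_by_tweeters_count", 0),
        "news": record.get("cited_by_msm_count", 0),
    }

def fetch_attention(doi):
    """Fetch and summarize attention data for one DOI (needs network access)."""
    with urlopen(API + doi) as resp:
        return attention_summary(json.load(resp))

# Offline demonstration with made-up placeholder values, not real data:
sample = {"title": "Some paper", "score": 25.5, "cited_by_tweeters_count": 30}
print(attention_summary(sample))
```

Because the summary step is separated from the network call, you can reuse `attention_summary` on records you’ve already downloaded in bulk.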
PLOS journals take a different approach to gathering and displaying altmetrics. In fact, they are pioneers in the field: their Article-Level Metrics tool (PLOS ALM) dates from 2009, a year before the altmetrics manifesto was published. Its underlying software, Lagotto, is open source and is also used to provide altmetric data in PKP’s Open Journal Systems and in Crossref Event Data (under development), among others.
So, if you’re on an article’s page and see some sort of altmetric data available, go ahead and check it. You can also install the free Altmetric bookmarklet in your browser and use it to see altmetrics for articles you’re reading. But if you’re trying to track the impact of a large number of items, like the output of a whole department or the contents of an institutional repository, it may be worth looking at more robust solutions like Lagotto or the paid services offered by Altmetric and Plum Analytics.
What if you don’t have a specific article in mind, but rather a topic? Well, ScienceOpen, an aggregator of several open access sources, lets you sort search results by Altmetric Attention Score. This way you can find “hot” articles on a given topic and join the conversation. Or, who knows, you might find some interesting results that aren’t in the spotlight for some reason, and then help them get the attention they deserve.
If you’d like to keep up-to-date with which papers are being discussed the most on social media, Paige Jarreau writes monthly “High Five” posts for the Altmetric blog. These highlight each month’s 5 most popular research articles according to Altmetric data.
It is worth mentioning that altmetrics are NOT indicators of quality (I’d argue that neither journal impact factors nor citations are, either, but that’s a rant for another time). There are lots of reasons why someone might share a paper on social media, from “hey, mom, I’m published!” to “look at this silly title!”, from “this is important and you should read it” to “can’t believe nobody noticed this obvious error before it went live”. I think this can be useful in outreach: you can take the paper with the funny title and explain the science behind it, or use a peer-review error to talk about how science self-corrects.
I hope this post has helped you understand more about altmetrics and how science communicators can use them. I want to conclude by saying that you, too, can help altmetrics: science news, blogs, videos, and similar channels are all important altmetric sources. Here’s how you can help papers and researchers get more recognition: whenever possible, use persistent identifiers instead of plain URLs when you mention a research object. If you mention an article in a video or a podcast, put the DOI link in the description. If you talk about an article on Twitter, link to the DOI. If a DOI isn’t available, look for a handle or a platform-specific ID like those on PubMed and arXiv. This makes it easier for altmetric tools to pick up these mentions, and in turn for researchers and institutions to track their impact.
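Because persistent identifiers follow predictable patterns, even a tiny script can turn a bare identifier into the kind of resolvable link altmetric tools can track. A minimal sketch, assuming the standard public resolvers (doi.org, arxiv.org, pubmed.ncbi.nlm.nih.gov); the helper name and the simple pattern checks are my own:

```python
def persistent_link(identifier):
    """Turn a bare scholarly identifier into a resolvable, trackable link.

    Handles DOIs, arXiv IDs, and bare PubMed IDs; anything else is
    returned unchanged.
    """
    identifier = identifier.strip()
    if identifier.startswith("10."):  # all DOIs begin with the "10." prefix
        return "https://doi.org/" + identifier
    if identifier.lower().startswith("arxiv:"):  # e.g. "arXiv:1203.4745"
        return "https://arxiv.org/abs/" + identifier.split(":", 1)[1]
    if identifier.isdigit():  # a bare PubMed ID (PMID) is all digits
        return "https://pubmed.ncbi.nlm.nih.gov/" + identifier + "/"
    return identifier

print(persistent_link("10.1371/journal.pbio.1002445"))
# https://doi.org/10.1371/journal.pbio.1002445
```

Pasting the resulting `https://doi.org/...` link into a tweet or a video description is exactly the habit the paragraph above recommends.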