Want to know who's been tweeting about your work, or what news articles are out there on that latest bit of research? Altmetric are now making that easier than ever.
Article level metrics
Wiley are piloting an "article level metric score" with Altmetric for every article in the Wiley Online Library. These scores indicate how widely an article is being shared online. I only recently noticed this development after seeing a tweet from Matteo Cavalleri (@physicsteo) about these Altmetric ratings. A pretty cool idea.
Next to each article is a little Altmetric icon and a score for that article. In their FAQ section, they explain how it's all figured out in three easy steps:
1: Volume - Your score rises as more people mention the article, which makes sense. But each person only counts once per source, so you can't artificially boost the score by tweeting the same thing over and over.
2: Sources - Different sources contribute different amounts. So a newspaper article is worth more than a blog post (even a fine one such as this), but then a blog is worth more than a mere tweet.
3: Authors - The score depends on who is saying it. So loud-mouths talking about every article under the sun are worth less than a specialist sharing with their specialist peers.
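The three rules above can be sketched in code. To be clear, this is a toy illustration of the general idea, not Altmetric's actual algorithm: the source weights, the specialist bonus, and the `toy_score` function are all my own invention for demonstration purposes.

```python
def toy_score(mentions):
    """Toy attention score. mentions: list of (author, source_type) tuples.

    The weights below are made up for illustration; Altmetric does not
    publish its exact numbers.
    """
    # Rule 2: different sources contribute different amounts
    # (news > blog > tweet).
    weights = {"news": 8, "blog": 5, "tweet": 1}

    # Rule 1: each (author, source) pair counts only once, so repeat
    # tweets from the same person don't inflate the score.
    unique_mentions = set(mentions)

    score = 0
    for author, source_type in unique_mentions:
        w = weights.get(source_type, 1)
        # Rule 3: a crude stand-in for author weighting -- pretend
        # authors flagged as specialists count double (purely
        # illustrative; real author weighting would be far subtler).
        if author.endswith("(specialist)"):
            w *= 2
        score += w
    return score

mentions = [
    ("alice (specialist)", "tweet"),
    ("bob", "tweet"),
    ("bob", "tweet"),           # duplicate from same person: ignored
    ("daily-paper", "news"),
]
print(toy_score(mentions))      # 2 + 1 + 8
```

The deduplication via a `set` is doing the work of rule 1 here; everything else is just a weighted sum.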
So with this new information at hand, I went to check out my latest paper.
Looking at this article recently published in Angewandte Chemie, it has a score of 28, which to me sounds pretty good. Clicking on the link then gives you a breakdown of what that all means. For this particular paper there were three news articles and five Twitter mentions. The news stories were all new to me. As an author of the work you'd think I might have known, but no. A breakdown of the tweets states that four were from the public and one from a scientist. I'm pretty sure all the tweets came from scientists, so I'd love to know who Altmetric doesn't think is worthy of scientist status!
The news articles were all based on the same press release from Angewandte, but one link was to a Chinese website. A (possibly poor) Google Translate of the opening sentence gave, "Eye for an eye, but also treat the person in his body", which was definitely not in the original English press release!
But hey Altmetric, I also talked about the paper on my own blog recently, so that's another one to add to the list; maybe you need to expand your search a little. (--Update-- This blog is now recognised! Nice one, Altmetric, sorry for ever doubting you.)
Alternate versions
Interestingly, Angewandte also publishes a German version of the journal (technically that is the original and the other is the international edition). Looking at the paper in the German edition tells another story: the score is up to 34!
There are four news outlets this time and one blog (still not mine), but no tweets. Two of the news pieces were the same as in the international version, and one of the new ones was from a German website (a faithful translation this time).
So Angewandte is actually artificially lowering its scores by having two versions. Maybe something for the Altmetric team to consider here: combining the mentions somehow. Or I could just do it for them and add up the scores, which gives me an impressive-looking 28 + 34 = 62. That's how science works, right?
I imagine that dual-language nature of ACIE (does *any* other scientific journal still do this?) also sometimes leads to splits or other discrepancies in citations (regular citations, that is), as people might cite either the German or English versions, or both. Sometimes I see citations with some mixture of the title, volume and page numbers from the German and International editions, causing further chaos (the volume numbers differ because the German edition has been going for far longer than the International edition, and the page numbers differ for obvious reasons).
If the citation metadata is neat and tidy then it shouldn't matter (if both versions are cited then one should be tagged as a translation), but my empirical observation is that some publishers couldn't find their metadata with both hands.
At least in Wiley journals you have to cite both the International and German versions at the same time, so that sorts that problem, but it is still a little cumbersome. This is all kinda new, so it will be interesting to see what follows and how they solve these issues.