Is Altmetrics an Acceptable Replacement for Citation Counts and the Impact Factor?

In the last 15 years there has certainly been an increase in the number of academic articles published. With this growth, questions have arisen about how to measure research, because the old methods of the impact factor and citation counting take time. Similarly, information outlets are changing, and quality measurements must account for those changes: scholarly articles can now be accessed easily through social media platforms and the open access movement.

A few years ago Altmetrics was introduced as a tool for scholars and librarians. For a complete explanation, see the Altmetrics manifesto (Priem, Taraborelli, Groth, & Neylon). The main concept behind Altmetrics is that the system can quickly tell you which articles are the most popular and possibly the best quality (although those do not always go hand in hand). The manifesto describes how the measurement is calculated from the number of times an article is saved, shared, reused, and posted on social media.

It makes sense that the number of citations can reveal the impact of an article: other scholars are citing it and building on the original research. Yet citations take time, because another scholarly publication must appear containing the citation, and that can take a very long time depending on the publisher. One can speculate that Altmetrics are popular because of their speed, even though they are not as well established as citation counts.


I am not convinced of the accuracy of Altmetrics in today's scholarly community. Too much trust is placed in measuring one's success through Altmetrics, which can include Twitter and Facebook. Altmetrics speed up the process of analyzing a publication's impact, citations, and popularity. This is particularly important when university professors are under consideration for tenure or government grants.


I fully acknowledge that there is a need for some type of measurement of published work. Yet in my mind the potential for abuse overshadows this need. Call me a skeptic, but social media sharing, download manipulation, self-citation, and gaming page views to inflate impact happen more often than we think. Although Altmetrics are still fairly new, we have become too dependent on them, especially in the academic community, where they can make or break a career.

My additional criticism of Altmetrics is that many universities and colleges now judge employee value on Altmetrics data. These numbers should not measure the value and quality that an individual professor brings to a university.

In this blog I struggle to give Altmetrics my full support, because there are many ways to beat the system. There are also advantages to Altmetrics, and I will address them. My hope is that you will finish this blessay with a better understanding of how Altmetrics work and how to analyze the data in context. Altmetrics can be valuable, but only when they are understood correctly.


The Impact Factor

Previous generations of scholars also struggled with measuring the quality of other scholars' work. The impact factor was created as a solution: it takes the average number of times that articles published in a journal over the past two years have been cited. The idea is that the more often an article is cited, the more relevant and influential it is. This concrete number could be consulted when judging quality, instead of just listening to others' opinions.
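To make the calculation concrete, here is a minimal sketch with invented numbers: a journal's two-year impact factor is the citations received this year to articles from the previous two years, divided by the number of articles published in those two years.

```python
# Toy two-year impact factor calculation (illustrative numbers only).
# IF(2013) = citations in 2013 to articles from 2011-2012,
#            divided by articles published in 2011-2012.

def impact_factor(citations_to_window: int, articles_in_window: int) -> float:
    """Average citations per recent article for a journal."""
    return citations_to_window / articles_in_window

# Hypothetical journal: 480 citations in 2013 to the 200 articles
# it published across 2011 and 2012.
print(impact_factor(480, 200))  # 2.4
```

Because it is an average over a whole journal, a handful of heavily cited articles can carry many uncited ones, which is one reason per-article counts can be more informative.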

Yet the impact factor is not perfect, because there are ways to cheat this system too. Publishers found sneaky ways to cite the articles in their own journals in order to earn a higher impact factor. Additionally, the impact factor is only an average, and better results may come from using an actual citation count.

Citation Counts

In most cases, the number of times an article is cited suggests superior quality. Scholars have relied on citations for decades, and when searching a database it is often easiest to sort results by citation count. The downsides are self-citation, publishing time limits, speed, and the politics of publishing: it can take years to see the first citations. It is also difficult to apply citation counts to new forms of scholarship, such as online articles, journals in databases, and presentations, where citations are not always prevalent. Fortunately, statistics suggest that Altmetrics may be able to predict what the citation count will eventually be (Priem, Piwowar, & Hemminger, 2012).
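The kind of check behind that claim can be sketched with a rank correlation: do articles with more early altmetric activity end up with more citations? The data below is entirely invented for illustration, and this is a generic Spearman computation, not the method Priem et al. actually used.

```python
# Sketch: does an early altmetric (e.g. first-week tweets) track eventual
# citations? Spearman's rank correlation, implemented from scratch.

def ranks(xs):
    """Rank values 1..n, averaging ranks for ties."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of tied positions, 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Pearson correlation of the rank vectors of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

early_tweets = [12, 3, 45, 0, 7, 22]   # hypothetical first-week tweets
citations_2y = [15, 4, 60, 1, 30, 9]   # hypothetical citations after 2 years
print(round(spearman(early_tweets, citations_2y), 2))  # 0.77
```

A coefficient near 1 would mean the early signal orders articles much like eventual citations do; real correlations reported in the literature are far weaker and vary by platform.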

Here to Stay?

When Altmetrics first came into the picture, most scholars were skeptical about whether they would last. However, with the growth of social media platforms, scholars use the web more and more to share their work. The research impact path now differs from the traditional one, as scholars learn about each other's research in new ways.

There is no official list of Altmetrics rules, especially since the popularity of websites and social media platforms changes constantly. Altmetrics can be modified to fit the varying needs of academics. Ultimately, Altmetrics lets scholars measure impact, immediate impact, usage, capture, mentions, social media, and citations.

For those of you who are not familiar with Altmetrics, the chart below shows the current types of metrics that Plum Analytics (one of many Altmetrics tools) supports. Listed are the type of metric, its example sources, and a description. This makes Altmetrics more concrete by defining, for example, that "clicks" means the number of clicks on a URL. Usage is the largest category on the Plum Analytics website.

Current List of Metrics

Usage
- Abstract Views (dSpace, ePrints, PLoS): the number of times the abstract of an article has been viewed
- Clicks (Facebook): the number of clicks of a URL
- Collaborators: the number of collaborators of an artifact
- Downloads (Dryad, Figshare, SlideShare, GitHub): the number of times an artifact has been downloaded
- Figure Views (Figshare, PLoS): the number of times a figure from an article has been viewed
- Full Text Views: the number of times the full text of an article has been viewed
- Holdings: the number of libraries that hold the book artifact
- HTML Views: the number of times the HTML of an article has been viewed
- PDF Views (dSpace, ePrints, PLoS): the number of times the PDF of an article has been viewed
- Views: the number of times the dataset has been viewed
- Supporting Data Views: the number of times the supporting data of an article has been viewed

Captures
- Bookmarks (CiteULike, Delicious): the number of times an artifact has been bookmarked
- Favorites (SlideShare, YouTube): the number of times an artifact has been marked as a favorite
- Followers: the number of times a person or artifact has been followed
- Forks: the number of times a repository has been forked
- Groups (CiteULike, Mendeley): the number of times an artifact has been placed in a group's library
- Readers: the number of people who have added the artifact to their library
- Subscribers (Vimeo, YouTube): the number of people who have subscribed for an update
- Watchers: the number of people watching the artifact for updates

Mentions
- Comment Count (Facebook, Reddit, SlideShare, Vimeo, YouTube): the number of comments made about an artifact
- Forum Topic Count: the number of forum topics discussing the artifact
- Gist Count: the number of gists in the source code repository
- Links: the number of links to the artifact
- Review Count: the number of user reviews of the artifact
- Blog Count (Research Blogging, Science Seeker): the number of blog posts written about the artifact

Social Media
- Likes (Facebook, Vimeo, YouTube): the number of times an artifact has been liked
- +1s: the number of times an artifact has received a +1
- Ratings: the average user rating of the artifact
- Recommendations (Figshare, SourceForge): the number of recommendations an artifact has received
- Score (Reddit): the number of upvotes minus downvotes on Reddit
- Shares (Facebook): the number of times a link was shared on Facebook
- Tweets: the number of tweets that mention the artifact

Citations
- Cited By: the number of articles that cite the artifact according to CrossRef
- Cited By (Microsoft Academic Search): the number of articles that cite the artifact according to Microsoft Academic Search
- Cited By: the number of PubMed Central articles that cite the artifact
- Scopus Cited-by Count: the number of articles that cite the artifact according to Scopus
- Cited By: the number of patents that reference the artifact according to the USPTO
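To see how counts like those listed above could be rolled up, here is a minimal sketch with entirely invented numbers; it is not Plum Analytics' actual API or weighting, just a simple per-category total for one article.

```python
from collections import defaultdict

# Hypothetical metric events for one article, as (category, metric, count),
# loosely mirroring the metric list above. Not real Plum Analytics data.
events = [
    ("Usage", "Abstract Views", 310),
    ("Usage", "PDF Views", 120),
    ("Captures", "Readers", 45),
    ("Mentions", "Blog Count", 3),
    ("Social Media", "Tweets", 28),
    ("Citations", "Scopus Cited-by Count", 7),
]

# Total the events within each category.
totals = defaultdict(int)
for category, _metric, count in events:
    totals[category] += count

for category, total in totals.items():
    print(f"{category}: {total}")
```

Keeping the categories separate matters: 430 usage events and 7 citations describe very different kinds of attention, and collapsing them into one score would hide that.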

The Issues

Altmetrics are still a new tool for scholars, and some definite kinks need to be worked out. Ironing them out could lead to fewer skeptics and allow Altmetrics to become even more relevant.

There are definite limitations to Altmetrics, especially when comparing data sets that come from different contexts. Scholars compare seemingly like factors when in reality the comparison is moot, because each number was accumulated in a different context. Keep in mind that each discipline has its own publication and citation patterns, so it does not make sense to compare a physics article to a sociology paper.
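One common way around this, sketched below with invented citation counts, is to normalize within each field: express an article's count as a percentile among articles in its own discipline, so a physics paper is only ever compared to other physics papers. This is an illustrative sketch, not a standard bibliometric implementation.

```python
def percentile_within_field(value, field_values):
    """Percent of same-field articles with a strictly lower count."""
    below = sum(1 for v in field_values if v < value)
    return 100.0 * below / len(field_values)

# Invented counts: physics papers are cited more on average here,
# so raw counts are not comparable across the two fields.
physics = [50, 80, 120, 200, 45, 95]
sociology = [5, 12, 8, 20, 3, 15]

# A physics paper with 95 citations vs. a sociology paper with 15:
print(percentile_within_field(95, physics))    # 50.0
print(percentile_within_field(15, sociology))  # ~66.7
```

On raw counts the physics paper looks six times better; within-field percentiles suggest the sociology paper actually stands out more among its peers.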

The issue that gives me the greatest concern is the problem of invisibility. If a publication is more accessible, then it is likely to be cited more. Open access has given scholars more access to information, but some publishers still keep a tight hold on what is released.

Many critics argue that there should be a set of standards for measuring Altmetrics. These standards have not been developed because Altmetrics are so new that it is difficult to settle on standards that apply in every case. NISO has been working on defining standards and published a standards proposal in March 2013 (Carpenter & Lagace, 2013). The context of Altmetrics can vary greatly, yet without standards it is difficult to say that Altmetrics clearly measure the quality of work or its impact on the field. Additionally, Altmetrics have been shaped to include statistics from social media platforms. This is an area where gaming is very easy: the number of downloads can be inflated by recruiting coworkers or family members. For more on gaming and its effects on Altmetrics, see the Scholarly Kitchen blog (Crotty, 2013).

Although popularity should not define quality, it can indicate the amount of future scholarly citation. Some Altmetrics can tell us in minutes what citations take months or years to tell us: the popularity of research among other scholars.

Below I show the kind of results Altmetrics can produce. This chart is from Scopus and is based on Harvard publications in the social sciences, using the SJR (SCImago Journal Rank) algorithm. The results are very much what a scholar would expect to see. The SJR and citations for the Harvard Law Review are the highest, with an SJR of 1.555 (light green). The Harvard International Law Journal (dark green) is gaining citations and SJR over the years, and the Harvard Civil Rights Law Review (purple) is also gaining SJR and citations. These trends, and the popularity of the most recent articles in these publications, can predict the number of citations that will occur within a few years.

[Figure: Scopus chart of SJR and citation trends for three Harvard journals]
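SJR weights citations by the prestige of the citing journal, in the spirit of PageRank. The toy power-iteration sketch below, on a tiny invented three-journal citation graph, shows the core idea only; it is not the actual SJR algorithm, which adds size normalization and other refinements.

```python
# Toy PageRank-style prestige on an invented 3-journal citation graph.
# cites[i][j] = fraction of journal i's outgoing citations that go to j.
cites = {
    "A": {"B": 0.5, "C": 0.5},
    "B": {"A": 1.0},
    "C": {"A": 0.7, "B": 0.3},
}
journals = list(cites)
damping = 0.85  # standard PageRank damping factor
prestige = {j: 1 / len(journals) for j in journals}

# Power iteration: prestige flows along citation links each round,
# so a citation from a prestigious journal counts for more.
for _ in range(50):
    new = {j: (1 - damping) / len(journals) for j in journals}
    for src, targets in cites.items():
        for dst, frac in targets.items():
            new[dst] += damping * prestige[src] * frac
    prestige = new

for j in sorted(prestige, key=prestige.get, reverse=True):
    print(j, round(prestige[j], 3))
```

Journal A ends up on top here not because it receives the most citations, but because the journals citing it direct most of their own prestige toward it.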

In the scientific community there are major objections to Altmetrics, specifically about who receives funding based on these statistics. My thoughts here are that, first, we need to move away from judging articles by the articles they are published next to. Second, we need to move beyond simply counting citations: citation-based metrics are not only noisy and error-prone, they are at best lagging indicators.

The Future

In conclusion, there is no perfect metric for research, and scholars should not place too much weight on any one metric. Possibly the best solution is to use Altmetrics alongside traditional citation counts and the impact factor. Altmetrics do speed up the process and adapt easily to today's digital scholarship. Librarians' role in all of this is to help consumers use statistical information about publications successfully. It is our job to educate the public on the advantages and disadvantages of Altmetrics. Nick Scott has created a great guide to evaluating Altmetrics and other measurement tools (Scott, 2012).



References

Bauer, K., & Bakkalbasi, N. (2005). An examination of citation counts in a new scholarly communication environment. D-Lib Magazine.

Carpenter, T., & Lagace, N. (2013, March 19). Proposal to Study, Propose, and Develop Community-based Standards or Recommended Practices in the Field of Alternative Metrics. NISO.

Crotty, D. (2013, October 7). Driving Altmetrics Performance Through Marketing: A New Differentiator for Scholarly Journals? The Scholarly Kitchen.

Plum Analytics. (2013, July 28). Current List of Metrics.

Priem, J., Piwowar, H. A., & Hemminger, B. M. (2012). Altmetrics in the wild: Using social media to explore scholarly impact. arXiv preprint arXiv:1203.4745.

Priem, J., Taraborelli, D., Groth, P., & Neylon, C. (2010). Altmetrics: A manifesto.

Saha, S., Saint, S., & Christakis, D. A. (2003). Impact factor: A valid measure of journal quality? Journal of the Medical Library Association, 91(1), 42.

Scott, N. (2012, January 6). A pragmatic guide to monitoring and evaluating research communications using digital tools. On Think Tanks.


