A Review of Plum Analytics
Rebecca Raszewski, MS, AHIP
Information Services & Liaison Librarian
University of Illinois at Chicago Library for the Health Sciences
Plum Analytics is an online resource that integrates traditional measures of researchers’ impact, such as citation counts, with altmetrics, such as article downloads or tweets, to provide a more comprehensive picture of a researcher’s impact. Plum Analytics tracks altmetrics from over 30 sources for research outputs such as articles, books and book chapters, dissertations, and videos to demonstrate that citation counts are not the only way a researcher can have impact.
Founded in 2012, Plum Analytics was acquired by EBSCO Information Services in 2014. Plum Analytics won Library Journal’s award for “Most Ambitious Database of 2013.”
I became interested in Plum Analytics because the PlumX Plum Print icon appears alongside some citations in my institution’s version of CINAHL, CINAHL Plus with Full Text. Clicking on the PlumX Plum Print icon reveals Plum Analytics’ categories of metrics (if applicable) for the citation. There is also a “See Details” link for more information, such as where a citation has been mentioned in the news or on Twitter. I wanted to learn more about the metrics Plum Analytics uses and what additional features might be available.
Plum Analytics categorizes the data it captures into five types of metrics (illustrated in the sketch after this list):
- Usage – includes article-level usage data, such as article downloads or the number of times a video has been played on YouTube.
- Captures – tracks when someone bookmarks or favorites a researcher’s work. Delicious, Mendeley, and SlideShare are examples of sources Plum Analytics tracks for this data.
- Mentions – includes when research is mentioned in blog posts, comments, or Wikipedia.
- Social Media – includes when research is liked, shared, or tweeted in social media platforms such as Facebook, Twitter, or YouTube.
- Citations – tracks how many times a researcher’s work has been cited. The citation metrics include how many times a work has been cited in PubMed Central or Scopus, and even whether it has been cited in a patent from the United States Patent and Trademark Office. In 2016, PlumX Metrics began tracking what it refers to as clinical citations, which come from resources such as clinical practice guidelines in PubMed, clinical trials, or DynaMed Plus Topics.
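To make this taxonomy concrete, here is a minimal sketch that models the five PlumX categories as a simple data structure, using the example sources named in the list above. It is purely illustrative: the dictionary layout and the summarize() helper are hypothetical and do not reflect any actual Plum Analytics API or data format.

```python
# Illustrative sketch only: models the five PlumX metric categories described above.
# The structure and the summarize() helper are hypothetical, not a real Plum Analytics API.

PLUMX_CATEGORIES = {
    "Usage":        ["article downloads", "YouTube video plays"],
    "Captures":     ["Delicious bookmarks", "Mendeley readers", "SlideShare favorites"],
    "Mentions":     ["blog posts", "comments", "Wikipedia references"],
    "Social Media": ["Facebook likes and shares", "tweets", "YouTube activity"],
    "Citations":    ["PubMed Central", "Scopus", "USPTO patents", "clinical citations"],
}

def summarize(counts: dict[str, int]) -> str:
    """Render hypothetical per-category counts for one work as a one-line summary."""
    return ", ".join(f"{category}: {counts.get(category, 0)}" for category in PLUMX_CATEGORIES)

if __name__ == "__main__":
    # Hypothetical counts for a single article.
    print(summarize({"Usage": 412, "Captures": 37, "Citations": 9}))
```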
Plum Analytics offers several products for tracking researchers’ and institutional impact.
• PlumX Dashboards – allows users to group individual researchers’ or research groups’ work within an organization. In addition to articles, it also includes books, published data sets, and presentations.
• PlumX Metrics – a subset of PlumX Dashboards that tracks altmetrics from over 30 sources. PlumX Metrics can be embedded in an institutional repository so that researchers can view their research impact in one place. It also provides metrics widgets that can be embedded in various websites, such as department websites, faculty pages, or lab sites.
• PlumX Benchmarks – used for benchmarking institutional data and impact. Data can be filtered by spending category, granting institute and center, geography, and a specific date range. Institutions can be ranked for all five categories of metrics.
• PlumX Grants – provides insight into the grants funded at institutions. It can even assist in determining who should apply for grants at an institution, especially when only a certain number of researchers can apply for a grant.
• PlumX Funding Opportunities – locates new research funding opportunities. Searches can include category, agency, eligibility, and funding instrument type. Search results can be embedded in a researcher’s profile page or a group’s page through PlumX Dashboards.
This is not a comprehensive review due to the difficulty of getting a trial from Plum Analytics. It took several attempts to determine whom to contact to arrange a trial and to get a response from the initial contact. Two representatives from the company offered to meet with me before a trial was set up to give me an overview of Plum Analytics, but those meetings fell through, and they did not respond to my follow-up emails when we tried to set up a different time to meet. It was extremely disappointing not to be able to fully review Plum Analytics’ features. For this review, I relied on the company’s videos and website. Since I was unable to get a full trial, librarians who are interested in pursuing this resource should proceed with caution.
Even with the difficulty in getting a trial, Plum Analytics would be worth exploring, especially for institutions that do not have access to benchmarking resources such as InCites (formerly through Thomson Reuters, now through Clarivate Analytics) or that are seeking to go beyond citation counts in analyzing their productivity. For example, I found Plum Analytics’ approach to analyzing a book or a book chapter innovative. For the 4.1 million books and book chapters Plum Analytics tracks, it uses sources such as Amazon for book reviews; EBSCO eBooks for abstract views, downloads, and clicks; Goodreads for how many people have added the book to their bookshelves; and WorldCat for how many libraries own the book. Plum Analytics’ approach to integrating altmetrics with citation counts would be an asset for an institution that is exploring how it can best showcase its institutional impact or improve its awareness of funding opportunities. Plum Analytics is definitely a product to keep an eye on to see which metrics it will include in the future. I did, however, find the company’s lack of response and of interest in working with Doody’s to set up a trial disappointing.