Sally Smith_Assignment4

Author Impact Metrics

To explore author impact metrics, I examined two EPA researchers I have collaborated with, Mary Gilbert and Scott Voorhees. When searching for both authors in Web of Science and Scopus, I was struck by how many different authors were retrieved. In Scopus, Mary had an ORCID linked to her Scopus author ID, which made it easy to determine which articles she authored because they were all grouped together. Web of Science does not use ORCID in the same way, and authors must claim their records to ensure their accuracy. I still retrieved multiple records for Mary in both Scopus and Web of Science that I needed to analyze together. The same was true for Scott, and I wondered how confusion over author identity affects the integrity of the metrics the two databases generate. In a worst-case scenario, a high-impact article could be left out of the metrics entirely if it is not attributed to the right author. Authors need to make sure their research is tied to an identifier like an ORCID so that all of their publications are counted. This was especially troubling because Scott had significantly more articles in Scopus (14) than in Web of Science (3). The gap could be attributed to Scott not having an ORCID, but it further underscores that the metrics do not fully reflect an individual's scholarship.

I found that Web of Science and Scopus generated similar metrics for comparing authors. Both databases emphasized the authors' funding sources, the number of citations over time, and the subject areas they published in. I did find it interesting that Scopus emphasized the h-index more than Web of Science. The h-index combines productivity and impact: an author has an h-index of h if h of their publications have each been cited at least h times. Scopus provided a detailed chart illustrating an author's h-index over time, but Web of Science only briefly mentions it, perhaps because the database prefers its own metrics in the Journal Citation Reports. Even more troubling is that Web of Science explicitly says of calculating the h-index: "Source items that are not part of your subscription will not be factored into the calculation." An author trying to collect metrics to support tenure or pursue a grant is limited to the subscription databases her institution has, potentially creating a situation where she cannot show the full scope of her scholarship. Web of Science's calculation disadvantages institutions that cannot afford expensive subscription packages and potentially hinders their ability to attract grant money or researchers who could improve their scholarly reputation.
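The h-index definition above can be sketched in a few lines of Python. This is a generic illustration of the metric itself, not how either database selects or filters the citation counts it feeds into the calculation:

```python
def h_index(citations):
    """Return the h-index: the largest h such that at least h
    publications have each been cited at least h times."""
    counts = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:  # the rank-th paper still has >= rank citations
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for five papers:
print(h_index([10, 8, 5, 4, 3]))  # 4: four papers cited at least 4 times each
```

Note that dropping even one well-cited paper from the input list (as Web of Science does for non-subscribed sources) can lower the result, which is exactly the concern raised above.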

Of all the sources I looked at, Google Scholar was the worst for assessing author impact. The number of results retrieved for both authors was significantly higher than what Web of Science and Scopus reported. Additionally, there was no author profile available to confirm that I was looking at articles by the right person. The only metric Google Scholar highlighted was the times cited for each article. Those citation counts, however, could not be accurate because multiple versions of the same articles appeared with different counts.

Article Impact Metrics

Scopus

In looking at two different articles, I thought it would be interesting to compare articles published twenty years apart. I expected the older article (Article 2) to have fewer metrics than the newer one but was pleasantly surprised that Scopus still provided extensive metrics. Comparing the two, Article 2 had far more citations and usage metrics. I also appreciated that Scopus divided the usage metrics between full-article views and abstract views. One of the challenges in compiling usage data is the prevalence of "bots," or automated programs that skim through articles. I wonder whether there is a way to ensure that actual humans are viewing articles to determine a true usage count. The captures count may be more accurate because it is linked to reference managers. Article 1 had more metrics relating to social media use, which makes sense given its publication year. Overall, I preferred Scopus when comparing articles because it contained a mix of traditional and nontraditional metrics.

Altmetrics

Article 2 did not have as many social media metrics, which was not surprising since it was released over twenty years ago. This carried over into its Altmetrics score, which was relatively low compared to the newer article's. In looking at the Altmetrics page, I appreciated that it included a map of where the mentions came from as well as a breakdown of the subject areas of those citing or mentioning the article. I thought, however, that displaying the score in a red circle was misleading because it implies the article is somehow bad or not a good piece of scholarship. The Scopus metrics tell a completely different story that supports the article's high impact. I wish the Altmetrics page gave more context when presenting the score, because there was no mention of the article's age as a contributor to its low score. It does provide comparisons to other articles published at the same time, but there does not seem to be any mechanism to adjust the scores of older articles when compared to newer articles that receive social media attention.

Article 2's Dimensions page was also not as detailed, which was disappointing because I thought Dimensions was a better tool for seeing the full context of an article. The page does note the article's large number of citations and includes the Relative Citation Ratio, a measure of citation performance in comparison to similar articles. Article 1 had similar metrics on its Dimensions page, but they went into more detail with the Field Citation Ratio as well as a brief description that put the article into context among similar articles.
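The intuition behind a citation ratio can be sketched simply: divide an article's citation count by the average count of comparable articles. This is only an illustration with hypothetical numbers; the actual Relative Citation Ratio shown in Dimensions uses a more involved field- and time-normalized benchmark than this sketch:

```python
def citation_ratio(article_citations, comparable_citations):
    """Simplified citation ratio: an article's citation count divided by
    the mean citation count of comparable articles (e.g. same field and
    publication year). A value above 1.0 means the article is cited more
    than its peers. Illustrative only; the real Relative Citation Ratio
    is computed differently."""
    benchmark = sum(comparable_citations) / len(comparable_citations)
    return article_citations / benchmark

# Hypothetical example: an article with 120 citations in a field where
# comparable articles average 40 citations.
print(citation_ratio(120, [30, 40, 50]))  # 3.0: triple the field average
```

Framing citations this way helps with the age problem noted above: an older article is compared against its own cohort rather than against everything published since.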


I have neither given nor received aid while working on this assignment. I completed the assignment before looking at anyone else's work. Signed Sally Smith.