Academics:
I chose to review the metrics for Judith Maxwell, a linguistic anthropologist at Tulane University, and María Tapias, a medical anthropologist at Grinnell College. Both professors work on Latin America and have likely published in Spanish as well as English, and neither is an early-career academic. Dr. Maxwell, in particular, is likely close to retirement. I thought it would be interesting to look at two anthropologists who work in very different subfields of the discipline. I am also familiar with both scholars' work, which helped me judge how well the metrics represented their scholarship and their places within the field.
The first thing I noticed is that Scopus and Web of Science both did a very poor job of finding their publications. In both cases I believe the h-index should be higher, and Dr. Maxwell's publication count was laughably low for a prolific academic toward the end of her career who is prominent in her field. Her results in Scopus and Web of Science are below.
Scopus: [screenshot]    Web of Science: [screenshot]
Besides failing to capture the bulk of her publications (and any in Spanish at all), Web of Science also misattributed to Dr. Maxwell several publications in medical and ecology journals. Together, those errors make the provided charts meaningless. I did find the journal mapping tool pictured below interesting; it makes it especially clear which articles are misattributed.
Since Dr. Maxwell already had tenure and recognition within her field when these platforms were built, there is little incentive for her to maintain her profile there, and it seems clear that she has not.
Dr. Tapias’ results in Scopus and Web of Science were similar, although with fewer misattributed papers. Interestingly, the few papers that were attributed to her appear to be her most-cited ones, so her h-index there is very similar to Google Scholar’s even though Google correctly attributed many more papers to her.
Scopus: [screenshot]    Web of Science: [screenshot]
Even if the h-index is fairly consistent for Dr. Tapias across Scopus, Web of Science and Google Scholar, the graph functions in Scopus and Web of Science are not useful, because they capture only her top publications rather than her total body of work. As with Dr. Maxwell, I believe Dr. Tapias had tenure prior to the development of these metrics and therefore does not appear to have put effort into maintaining them as accurate reflections of her scholarly output. My main impression of both Scopus and Web of Science is that they may offer some interesting statistics or visualizations of scholarly output, but an academic absolutely must put effort into adding their work to the profiles and deleting misattributed works. At least in the social sciences, the automatic profiles are far from comprehensive and do not give accurate representations without manual curation.
I was more impressed with Google Scholar, at least in its ability to capture and correctly attribute scholarly output.
Looking at these profiles made me particularly wary of the h-index as a good indicator of anything. According to Google Scholar, Dr. Tapias has an h-index of 6, meaning that six of her works have been cited at least six times each. But her two most-cited articles have been cited 75 and 61 times, which that number does not capture at all. I also suspect that publications and citations before about 2008 were not well captured by Google Scholar, as suggested by this graph of Dr. Maxwell’s citations by year:
I highly doubt that her citation rate picked up that significantly in the last decade, considering that she has been working in the same specialty since the 1980s. So while Google Scholar seems better at aggregating scholarly output, I still have my doubts about its comprehensiveness for scholarship older than 10 or so years. I also found the visualizations and interfaces of Scopus and Web of Science more informative and interesting, although their data was worse.
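The h-index calculation discussed above is simple enough to sketch in a few lines of code. The citation counts below are hypothetical apart from the two top counts of 75 and 61; they are chosen to illustrate how two heavily cited papers raise the h-index no more than modestly cited ones do.

```python
def h_index(citations):
    """Return the h-index: the largest h such that h papers
    have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    # Walk down the ranked list; the h-index is the last rank
    # at which the paper's citation count still meets the rank.
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation profile: two standout papers (75 and 61
# citations, as in the source) plus several modestly cited ones.
print(h_index([75, 61, 20, 10, 8, 6, 3, 2]))  # → 6
```

Note that replacing the 75 and 61 with, say, 7 and 7 would leave the h-index unchanged at 6, which is exactly the limitation observed above: the measure is insensitive to how heavily the top papers are cited.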
Articles:
I chose to investigate a library science article that is relevant to my thesis and an article by a prominent scholar of Central American literature and culture that I assumed would have broader appeal across disciplines. The results of both surprised me, and it was useful to compare the Scopus and Altmetric data when both were available.
I expected the Schadl & Todeschini (2015) article to have a relatively low citation rate, as a niche paper in a specialized area of library science (Latin American Studies collections and collecting books in Spanish and Portuguese). The Scopus results were about what I expected, with under 10 citations:
I appreciated the PlumX metrics, which appeared to show far more engagement with the article than the Scopus citations and view counts did. It is particularly astonishing to see Scopus count 6 views when PlumX counts over 1,000 abstract views and the paper itself was published open access.
I knew that one of the authors has since been the head of the Hispanic Division at the Library of Congress, but I was surprised to see a fairly high Altmetric score. This made more sense once I clicked through to the tweets and other mentions and found that the paper had won a prize at a conference (SALALM). Announcements of the prize account for every tweet, Facebook post and Wikipedia mention. I had never considered the impact that winning a prize might have on journal article metrics, especially altmetrics. Even though winning the prize did not seem to increase actual engagement with the content of the article on social media platforms, the announcements alone boosted its score into the top 25% of all research outputs scored by Altmetric.
I was also very surprised by what I found for the Arias (2001) article, which was exactly nothing. The article did not show up in Scopus, although several of Arias’ other papers do, and it returned nothing in Altmetric either. I was shocked, because this article is by a prominent Guatemalan scholar weighing in on an academic scandal and debate over Rigoberta Menchú, the Guatemalan Nobel laureate, so I expected high engagement. On Google Scholar, by contrast, the article has been cited 90 times. I tried several of Arias’ other works on Scopus and Altmetric and had trouble finding any with more than one citation. I think this shows how fallible these metrics are, if one article’s score can be heavily skewed by an award while another (which should show much higher engagement) doesn’t appear at all.
Arias, A. (2001). Authoring ethnicized subjects: Rigoberta Menchú and the performative production of the subaltern self. PMLA, 116(1), 75.
Schadl, S. M., & Todeschini, M. (2015). Cite globally, analyze locally: Citation analysis from a local Latin American Studies perspective. College & Research Libraries, 76(2), 136–149. https://doi.org/10.5860/crl.76.2.136
I, Sara Kittleson, have neither given nor received aid while working on this assignment. I have completed the graded portion BEFORE looking at anyone else's work on this assignment.