AnnaTwiddy_A4

I. "Impact" of Academics Lisa Jardine and Adam Smyth

For my look into the “impact” of two academics as measured by Web of Science, Scopus, and Google Scholar, I thought it might be interesting to focus on two scholars whom I encountered quite frequently over the course of working on my first master’s degree in early modern English. The first of these is Lisa Jardine, who is probably best described as a historian of the early modern period, but whose wide-reaching scholarship and publications—which spanned about forty years, from the early 1970s until her death in 2015—have proven massively influential on early modern studies more broadly. The second is Adam Smyth, a younger scholar (his first publications date from just under twenty years ago) who focuses more specifically on the history of the early modern book. Since these two scholars work or worked in roughly the same field, but have noticeably different levels of scholarly output and range, I thought it would be worthwhile to see how these databases each quantify their respective levels of impact.

Going into my investigation, I was skeptical of the idea that Scopus and Web of Science would accurately capture the impact or influence of either of these two scholars, for the simple reason that—in my experience, at any rate—these two resources are not widely used within the field of English literature. Given that, I assumed that the information relayed by these two sources would be incomplete. Some of my Web of Science and Scopus findings certainly bore this out; I was surprised to see, for instance, that Web of Science records Jardine’s twelve listed publications in the database as being cited only ten times, and I also feel that the h-indices assigned by the two databases do not truly reflect either scholar’s influence within the field. Despite Jardine’s much longer publication record and longstanding scholarly reputation, her Scopus h-index of eight was only slightly higher than Smyth’s six, and her Web of Science h-index was an inexplicable two, compared to Smyth’s five there.

Still, there were also some findings from these databases, particularly Scopus, that I felt reflected the differences between the two scholars that I would expect to see in an accurate report on their respective impacts. Scopus records 20 cited documents for Lisa Jardine and 25 for Adam Smyth; although I would have expected a starker difference between these counts, I was ultimately unsurprised that Scopus also recorded Jardine’s documents as being cited 737 times over the past several decades (with the bulk occurring in the 2010s), compared to Smyth’s documents being cited 117 times (with nearly all of these occurring in the last few years). This difference was logical to me: not only has Jardine produced more work over a longer period of time, but the range of subjects she covered is generally much more varied than Smyth’s output, so it would make sense for her to have a higher number of citations. Web of Science, in contrast, recorded Smyth as having a much greater number of publications than Jardine (55 to 12) and far more citations (80 to 10). As Smyth is a younger scholar who is still producing a great deal of work (in contrast to Jardine who, as mentioned, died six years ago), I wonder if Web of Science might be privileging recency to a greater degree than Scopus is, at least within this area of scholarship.
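Much of this divergence likely comes down to coverage. A scholar’s h-index is the largest h such that h of their indexed publications have at least h citations each, and a database can only count citations among the items it actually indexes; with only ten recorded citations spread across twelve publications, Jardine’s Web of Science h-index mathematically cannot exceed three. The short Python sketch below, which uses made-up citation counts rather than either scholar’s actual figures, illustrates how thinner coverage deflates the metric:

```python
def h_index(citation_counts):
    """Return the h-index: the largest h such that h publications
    have been cited at least h times each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for the same scholar as seen by two databases.
# A database that indexes only a few of her publications (and only the
# citations made by other indexed items) reports a much lower h-index.
full_record = [858, 430, 310, 195, 120, 90, 60, 40, 22, 15, 9, 4]
sparse_record = [10, 3, 2, 1]   # only a handful of indexed items and citations

print(h_index(full_record))    # 10
print(h_index(sparse_record))  # 2
```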

It was much more difficult to pull hard figures from Google Scholar than from Scopus and Web of Science—I could only see how many times each individual publication by these two authors had been cited, rather than totals for each author—but just perusing the first few pages of results for each author left me with the impression that the numbers reported by Google Scholar give a more accurate indication of each author’s actual level of influence. Jardine’s books, recorded on the first page of her Google Scholar results, had individual citation counts ranging from 195 to 858, while Smyth’s most cited work has 185 citations. While it would have been helpful to have a more formalized report of these numbers, as I found with Web of Science and Scopus, simply seeing this stark difference in citation numbers helped validate my suspicion that Web of Science and Scopus were not telling the full story of these two authors’ impact. Though Scopus came closest to Google Scholar in the number of citations it attributes to Jardine, it still pales in comparison with the level of influence Google Scholar appears to ascribe to her.

Complementing these observations are images of my findings, including the metrics compiled from Scopus and Web of Science, as well as screenshots of my first few pages of Google Scholar results (it may be necessary to zoom in to read them properly).


II. “Impact” of Two Articles

For assessing the “impact” of two articles using Scopus and Altmetric, I decided to switch disciplines and look at two recent LIS papers on studies undertaken at two major academic libraries:

Basak, S., and Yesmin, S. (2021). Students’ attitudes towards library overdue fines in an academic library: A study in a private university setting in Bangladesh. IFLA Journal, 47(1). https://doi.org/10.1177/0340035220944948

Sonmez, F. D., Cuhadar, S., and Kahvecioglu, M. K. (2021). Successes, challenges, and next steps in implementing outcome-based assessment: The case of Istanbul Bilgi University Library. The Journal of Academic Librarianship, 47(1). https://doi.org/10.1016/j.acalib.2020.102249

Since these two articles were both published this year, and in comparable journals, I was curious to see how similar their Scopus and Altmetric scores would be; in particular, I wondered if the scores would show the kind of discrepancy Web of Science, Scopus, and Google Scholar had demonstrated in reporting on the impact of Jardine and Smyth. Unsurprisingly, the contrast between Scopus and Altmetric was not as stark. Sonmez et al.’s article consistently appeared slightly more impactful than Basak and Yesmin’s piece: Scopus and PlumX each counted it as having seventeen views, with Scopus recording a single citation, and its Altmetric score was three, based on three tweets, one citation, and nineteen readers on Mendeley. Basak and Yesmin’s piece, in contrast, was described by Scopus as having seven views, with four reads and four tweets recorded by PlumX; its Altmetric score was slightly lower at two, with three tweets and five readers on Mendeley. While there were some slight inconsistencies between these recorded figures—for example, Altmetric recorded five Mendeley readers for Basak and Yesmin’s piece while PlumX recorded four—the numbers were not so far apart as to appear alarming. I figured that what small inconsistencies there were could likely be attributed to the recency of each article.
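One thing that helped me interpret these numbers is that the Altmetric Attention Score is essentially a weighted count of attention sources (tweets, news stories, blog posts, and so on); as I understand it, Mendeley readership and citation counts are displayed alongside the score but do not feed into it, which is why an article with nineteen Mendeley readers can still score only a three. Altmetric’s exact weights are not something I can verify, so the short sketch below uses purely hypothetical weights just to illustrate the weighted-sum idea:

```python
# Toy illustration of a weighted "attention score"; the weights here are
# hypothetical placeholders, NOT Altmetric's actual values.
HYPOTHETICAL_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "tweet": 1.0,
}

def toy_attention_score(mentions):
    """Sum weighted mention counts; sources without a weight (e.g. Mendeley
    readers, citations) are tracked but contribute nothing to the score."""
    return round(sum(HYPOTHETICAL_WEIGHTS.get(source, 0.0) * count
                     for source, count in mentions.items()))

# Roughly mirroring the figures reported above (three tweets, one citation,
# nineteen Mendeley readers): only the tweets move the toy score.
print(toy_attention_score({"tweet": 3, "citation": 1, "mendeley": 19}))  # 3
```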

With these similarities, it was difficult for me to assess whether I preferred Scopus’ or Altmetric’s metrics on this front. Ultimately, though, I found myself appreciating Scopus slightly more for the generally greater breadth of information it covered—for each of these pieces, Scopus offered its own read and citation count, plus additional metrics provided by PlumX. And while PlumX did not capture social media engagement as consistently as Altmetric did, it still offered some insight into this area. For these reasons, Scopus struck me as just slightly more versatile, though I acknowledge that my perception of these metrics might be different had I chosen to look at two other articles.

As with my look at the academics’ “impact,” I am complementing my writeup with screenshots of the metrics I gathered from Altmetric and Scopus.