ClaireReinert_A4

Scholar: Sally M. Foster, Professor of Archaeology at the University of Stirling, Scotland

SCOPUS

Documents: 5

Cited: 41 times (33 without self-citations)

Co-authors: 4

Preprints: 0

H-index: 4

Topics: 50% arts and humanities, 40% social sciences, and 10% business management

Document Types: 4 articles, 1 review

Other Thoughts: All 5 documents were published in different journals, and many of her citations come from one specific article, "The Thing About Replicas: Why Historic Replicas Matter" (2016, open access), which has been cited 16 times in total; the next most-cited work, "The Legacy of 19th Century Replicas for Object Cultural Biographies: Lessons in Duplication from 1830s Fife" (2014, open access), has only 9. Finally, SCOPUS lists one of her articles under a separate author profile, from when she published at the University of Glasgow, and the two author records haven't been merged.

Web of Science

Documents: 13

Cited: 85 times (77 without self-citations)

H-index: 4

Document Types: 6 articles, 6 book reviews, 1 editorial material

Open Access: 7 All OA, 2 Gold-Hybrid, 2 Green Published, 5 Green Accepted, 1 Green Submitted

Research Areas: 8 Archaeology, 6 Arts & Humanities Other Topics, 2 History, 1 Anthropology, 1 Social Sciences Other Topics

Other Thoughts: There's a big gap between her most-cited publication (47 cites) and her second most-cited publication (15 cites). The most she has published in any one journal is 3 articles, in Medieval Archaeology; she has published single articles in at least 8 other journals. Finally, 3 of her publications were funded, one each by Historic Environment Scotland, the Royal Society of Edinburgh, and the Strathmartine Trust.

Google Scholar

It's much harder to view metrics for an author on GS. Searching "Sally M. Foster" pulls up her publications, and I can view some metrics for each individual article, but there is nothing that aggregates all of her works. When I click on her hyperlinked name under a specific article, it pulls up everything she's published in that specific journal, which still doesn't show me the full picture.

Comparison

I think that I got the broadest, most accurate picture of Sally's work from Web of Science, which showed me a lot that SCOPUS and GS didn't have, such as open access status, publishers, journals, and funding agencies. I really appreciated seeing how many open access articles she has. Most of them were green OA, which means that she has deposited her published work in a repository rather than publishing in an OA journal.

Something that both SCOPUS and WoS seem to struggle with is the topic breakdown of Sally's work. I chose her in part because she really walks a tightrope between history and archaeology, much more so than most archaeologists. Typically, archaeology is a social science while history is in the humanities. Because Sally combines the two in almost everything she publishes, I wanted to see how both sites would break down her articles. It's hard to compare the two because WoS included more than twice as many articles as SCOPUS, and because WoS broke the topics down much more specifically than SCOPUS. Additionally, WoS accounted for publications that have two or more topics, while SCOPUS seems to have placed each of her works in only one research category. SCOPUS broke her articles down into 50% humanities, 40% social sciences, and 10% business management (I think cultural heritage management is a better term for this aspect of her work), while WoS broke them down into 60% archaeology, 45% "other humanities topics," 15% history, 8% anthropology, and 8% "other social science topics." The fact that these add up to over 100% tells me that WoS is assigning her works to overlapping topics. If we assume that archaeology and anthropology are social sciences, while history is in the humanities, we can aggregate WoS's topics to roughly 76% social sciences and 60% humanities. In conclusion, both WoS and SCOPUS show her research areas as being pretty evenly split between the social sciences and the humanities.
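To make that aggregation explicit, here is a minimal sketch of the arithmetic, assuming each WoS research-area figure is simply that area's document count divided by the 13 documents WoS indexes for Sally. The social-sciences/humanities grouping is my own, not WoS's, and the one-or-two-point differences from the 76% and 60% figures above come from summing raw counts rather than the already-rounded per-area percentages.

```python
# Back-of-the-envelope check of the aggregation above. Counts are taken from
# the WoS "Research Areas" line; which areas count as social sciences vs.
# humanities is my own assumption, not a WoS classification.
wos_counts = {
    "Archaeology": 8,
    "Arts & Humanities Other Topics": 6,
    "History": 2,
    "Anthropology": 1,
    "Social Sciences Other Topics": 1,
}
total_docs = 13  # documents WoS indexes for Sally

social_science_areas = {"Archaeology", "Anthropology", "Social Sciences Other Topics"}
humanities_areas = {"Arts & Humanities Other Topics", "History"}

social = sum(n for area, n in wos_counts.items() if area in social_science_areas)
humanities = sum(n for area, n in wos_counts.items() if area in humanities_areas)

print(f"Social sciences: {social / total_docs:.0%}")      # 77%, vs. the 76% quoted above
print(f"Humanities:      {humanities / total_docs:.0%}")  # 62%, vs. the 60% quoted above
```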

One final thought is that I don't believe any of these sites actually accounted for all of her publications. She has several books and edited works that didn't show up on any of the sites, which tells me that all 3 are really built for journal articles only. I think this is problematic, given the emphasis on publishing monographs in the humanities. If monographs aren't accounted for, humanities scholars like Sally will be very much misconstrued by these metrics. Additionally, I've tried to find an accurate list of all of Sally's publications, because I know for an absolute fact that she's published more than 5 or 13 things (I can't get a total from Google, because it doesn't let me filter out all the wrong Sallys), but it's nearly impossible to find one. These sites aren't telling me anything useful unless they can aggregate the citations from ALL of her works, which they clearly aren't doing. Finally, Sally gives a lot of presentations. These surely count as scholarly outputs, but regular citation metrics don't apply to them. In short, I think these sites are pretty useless in determining Sally's scholarly impact, because they only account for a fraction of her works by excluding monographs, presentations, and her older articles.

Scholar: Christopher A. Whatley, Professor of History at the University of Dundee, Scotland

SCOPUS

Documents: 24

Cited: 220 times (199 without self-citations)

Preprints: 0

Co-authors: 7

H-index: 7

Topics: 62% arts and humanities, 23% social sciences, and 3% each in engineering, environmental science, business management, and materials science

Document Types: 8 articles, 5 books, 5 book chapters, 3 editorials, and 3 reviews

Other Thoughts: The "publication sources" pie chart shows that 19 of his works were each published in a different journal or venue (one work per venue), and the most he has published in any one journal is 4 articles in the Scottish Historical Review. There is also a big gap between his most-cited work (a 2006 book, Scots and the Union, which has 89 cites) and his second most-cited work (a 1997 book, The Industrial Revolution in Scotland, which has 31 cites).

Web of Science

Documents: 75

Cited: 98 times (93 without self-citations)

H-index: 7

Document Types: 43 book reviews, 22 articles, 15 book chapters, 4 editorial materials, 2 conference papers, 1 biographical material (obituary), 1 book, and 1 “notes.”

Open Access: 1 All OA, 1 Green Submitted

Research Areas: 56 History, 15 Government Law, 6 "Other Social Science Topics," 6 Business Economics, 2 Literature, 1 "Other Humanities Topics," 1 Religion

Other Thoughts: 57 of his 75 publications have 0 cites, which I find incredibly hard to believe, given how prominent a Scottish historian Christopher is. His most-cited work has 14 cites, and one of his most lauded books shows only 8 (Google Scholar gives the same book 132 cites).

Red Flag: A bar graph shows that Chris typically comes out with about 2 works per year, except for 2014, when he supposedly came out with 17 published works. Looking closer, I found that WoS is individually counting all 13 chapters from the one book he published that year. In reality, he published 1 book, 3 articles, and 1 obituary in 2014, but WoS claims that he came out with 13 discrete book chapters as well. I'll discuss this more in my comparison.

Google Scholar

When I search Christopher A. Whatley, relatively accurate individual works come up, but I still can't find anything on him beyond each individual listing that he authored. When I click on his hyperlinked name, it takes me to everything he has published in a specific journal. GS does show a more accurate-looking metric for Christopher's 2006 book, The Scots and the Union: Then and Now, to which it attributes 132 cites (WoS says the same book got only 8). The first 70 works that show up (mostly articles, some book reviews, some reviews of his books, and a few obituaries he authored) look to be mostly accurate, things that he did write himself, but around page 8 we start seeing works in other disciplines that are probably not by the same Christopher Whatley.

Comparison

The fact that Chris only has 2 OA materials anecdotally confirms my suspicion that historians aren't very concerned with making their materials OA. Because Sally is an archaeologist whose field is literally the public presentation of archaeological interpretations, her OA statistics suggest that she is quite concerned with making her publications OA. Not so with Chris.

I chose Chris for similar reasons as Sally: although he's a historian, he includes a lot of economics and political science in his research and analysis, so I wanted to see how these sites would classify his research area. SCOPUS has him down as 62% humanities (history) and 23% social sciences (political science and economics). If I assume that history is in the humanities and that political science and economics are social sciences, I can convert WoS's broader disciplines into the same categories as SCOPUS, which works out to roughly 81% humanities and 35% social sciences. This is pretty similar to SCOPUS's breakdown, though we should remember that WoS is accounting for 3 times more of Chris's output, and that WoS allows works to have overlapping topics. It's interesting to me that his social science percentage is so high, because Chris is a historian. He might write about economic and political history, but he is primarily concerned with the study of Scottish history, which should probably put him 100% in the humanities pile. It's a little less clear with Sally, because she studies equal parts history and archaeology, so I logically expected to see a pretty even breakdown of humanities and social sciences. Again, not so with Chris.
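The same quick check for Chris, again assuming each figure is just the research-area count divided by the 75 documents WoS indexes for him; grouping Government Law and Business Economics under the social sciences is my own (debatable) call.

```python
# Same sanity check for Chris's WoS research areas (75 documents total).
# Which areas belong to "humanities" vs. "social sciences" is my assumption.
total_docs = 75
humanities = 56 + 2 + 1 + 1   # History, Literature, Other Humanities Topics, Religion
social_sciences = 15 + 6 + 6  # Government Law, Other Social Science Topics, Business Economics

print(f"Humanities:      {humanities / total_docs:.0%}")       # 80%, close to the 81% above
print(f"Social sciences: {social_sciences / total_docs:.0%}")  # 36%, close to the 35% above
```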

When I looked at Sally's metrics, I was struck by how none of her monographs, conference papers, or edited works were included, but I was struck even more by the fact that WoS accounted for Chris's book The Scots and the Union as 13 individual book chapters. This is a single-authored book, so it should not have been that confusing for WoS to recognize it as a book. This example further drives home the fact that these sites are built for journal articles and nothing else. Woe to the humanities scholar who relies heavily on monographs for his scholarly reputation, because none of these sites will account for them (or they will, but they'll bungle the categorization).

Article: Whatley, Christopher. “Reformed Religion, Regime Change, Scottish Whigs and the Struggle for the ‘Soul’ of Scotland, c.1688-c.1788.” Scottish Historical Review 92, no. 1 (2013): 66-99.

SCOPUS metrics: 7 citations in SCOPUS (96th percentile), 4.52 field-weighted citation impact

Views count: 1 in 2021, 2 in 2020, 14 from 2012-2021

PlumX Metrics: 52 exports-saves, 4 readers, 514 full-text views, 61 link-outs, 2588 abstract views

Altmetric: “hasn’t picked up any sharing activity on this article yet”

0 readers on Mendeley

0 readers on CiteULike

Comparison: I think that the PlumX metrics tell me the most about this article. The most interesting thing I notice is that there are 2,588 abstract views but only 514 full-text views. I wonder whether these statistics reflect only the views that happened on SCOPUS, or whether they count every time someone reads the article in any repository or library (is there a way to do that?). If I'm only looking at the number of times people have read the article on SCOPUS, I'm not sure that really tells me anything of value about it.

Article: Mullen, Stephen. “Henry Dundas: A ‘Great Delayer’ of the Abolition of the Transatlantic Slave Trade.” Scottish Historical Review 100, no. 2 (2021): 218-248.

SCOPUS metrics: nothing listed

PlumX metrics: 68 tweets (nothing else listed about views or saves)

Altmetric: tweeted by 51 (mostly in Britain, some in Canada, and a few in the US; mostly members of the public, with a few scientists and science communicators)

Comparison: I’m interested in the lack of SCOPUS metrics because I thought that it was an automated statistic generator, but maybe I’m wrong and nobody’s drawn up/published the statistics yet. There’s no way that nobody’s read, or even looked at, the article, especially if it’s been tweeted a few dozen times. I really like the ability to see the actual tweets and the demographics of the people that tweeted the article through Altmetric, but I don’t think that a tweet count is a replacement for simple view/cite stats. Just because random members of the public are tweeting about an article, it does not follow that this article has more “impact” than one that does not get tweeted. Looking at how an article is shared can give us a unique look at a topic or an author, but I don’t see much tangible value in the use of altmetrics; what am I supposed to glean from the fact that the first article hasn’t been shared? Without an explicit interpretation, I don’t feel like I’ve learned much from altmetrics.