I decided to investigate two history professors from my undergraduate years. Both were fantastic teachers, and my understanding was that they were fairly well known and well cited. I thought that choosing these two would be especially interesting because they are better known for their books, so I wanted to see how their article publications compared. The first professor is Fred S. Naiden, a professor of Ancient History at UNC Chapel Hill focusing on Ancient Greece and the Near East. The second is Richard Talbert, also a professor of Ancient History at UNC Chapel Hill, focusing on the history of the Roman Empire.
Naiden: In Scopus he was cited 215 times between 2006 and 2021. His books were the most cited of his works and account for a large share of that total. He has an h-index of 7 and 23 documents attributed to him. All of the listed articles did belong to him, which may be because he was the only author to appear in the search, so there were no similar authors to confuse him with. I found it very interesting that Web of Science listed different articles and included none of his books, which is likely why his h-index there was so much lower, at 2. Those articles had far fewer citations, but they were all genuinely his, so once again the attribution was accurate. Additionally, despite the articles being different, WoS attributed 27 articles to him; the extra articles, coupled with the fact that they were generally cited less, would also have contributed to the lower h-index. Those 27 articles were cited only 34 times. Despite the low citation count, he is still positioned in the 55th percentile for citations. The subject breakdown of his publications also seems fairly similar between Scopus and WoS, with a bit more nuance and diversity documented in WoS. He did not have a Google Scholar profile, but the first 20 results all belonged to him, and the citations attributed to him there far surpass those recorded by Scopus or WoS; one of his books alone is marked as cited 358 times. The individual entries for each work provide a good amount of information about the publication, so even without a profile you can still see how popular he and his work have been in the field.
[Scopus screenshot]
[Web of Science screenshot]
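Since the h-index gap between Scopus and Web of Science comes down to which works each database indexes and how often those works are cited, here is a minimal sketch of how an h-index is computed from a list of citation counts. The counts below are hypothetical, purely to illustrate why a record missing a few heavily cited books (as the WoS record was) reports a much lower score; they are not Naiden's actual citation data.

def h_index(citations):
    # The h-index is the largest h such that at least h works
    # have each been cited at least h times.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts (illustrative only, not real data):
# a record that includes a few heavily cited books...
with_books = [90, 45, 20, 9, 8, 7, 7, 5, 3, 2, 1, 0]
# ...versus the same author with only lightly cited articles indexed.
articles_only = [6, 2, 2, 1, 1, 0, 0, 0]

print(h_index(with_books))     # 7
print(h_index(articles_only))  # 2

The point is simply that the score depends entirely on which documents a database has indexed, which is why the same author can look very different across services.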
Talbert: There were four Richard J.A. Talberts in Scopus, one of which had no works listed. Two had one work each and seemed to be variants of the top result, which had 23 works listed, 87 citations, and an h-index of 6. All the information below refers to that top listing. Most of his listed works were books, book introductions, or book chapters, with a few articles. When I looked at the subject breakdown and saw medicine, I thought something had been misattributed to him, but this was in fact accurate. The information from Web of Science was quite different from Scopus. WoS gave him 58 publications with 52 total citations and an h-index of 3. WoS also misattributed two articles to him, one in microbiology and one in physics. I was surprised by this because WoS had been very accurate up until this point. I also noticed that the articles in WoS were different from those in Scopus. This is to be expected to a point, since WoS lists more than twice as many works as Scopus, but some works listed in Scopus seemed to be missing from WoS. I was interested to see this pattern in both of my professors. Talbert did not have a Google Scholar profile either. Upon discovering this, I looked up several other history professors and found that none of them had Google Scholar profiles either; I am now wondering whether this is a trend or a coincidence. A search for his articles turned up quite a few, which again showed much higher citation counts than either of the other resources suggested. Of the first twenty results, all except one belonged to Talbert, but I also observed that the works were much older than those logged in Scopus or WoS.
[Scopus screenshot]
[Web of Science screenshot]
Overall, I found all three tools quite useful, but I would use them for different purposes: Google Scholar for more casual inquiry, and Scopus and Web of Science for more serious exploration. I doubt I would use either Scopus or WoS alone; I would probably use them together to get the full picture of impact. For both professors the h-index scores did not match at all, so if I were trying to measure impact I would take both scores with a grain of salt. Additionally, I would be more likely to look at the profiles holistically, because neither seems especially complete when compared with the citations and extra publications found in Google Scholar. Clearly there is no one way to measure impact, because the resources that are supposed to measure it do not do so consistently anyway.
Articles:
Totenhagen, Casey J., Deborah M. Casper, Kelsey M. Faber, Leslie A. Bosch, Christine Bracamonte Wiggs, and Lynne M. Borden. 2015. "Youth Financial Literacy: A Review of Key Considerations and Promising Delivery Methods." Journal of Family and Economic Issues 36, no. 2: 167-191. https://doi.org/10.1007/s10834-014-9397-0.
In trying to evaluate this citation I greatly preferred Scopus, because Altmetrics had no information about it. Scopus noted that it has been cited 28 times and is above the average number of citations for similar documents. The PlumX Metrics are extremely detailed, and although I don't know what I would do with this information, it is very interesting to see that fewer than one-quarter of the people who view the abstract go on to view the full article. My first reaction to seeing all of this data was that if I were the author I would be constantly checking on it.
[Scopus screenshot]
[Altmetrics screenshot]
Varcoe, K.P., S.S. Peterson, P. Wooten Swanson, and M.C. Johns. 2010. "What Do Teens Want to Know About Money—A Comparison of 1998 and 2008." Family and Consumer Sciences Research Journal 38: 360-371. https://doi.org/10.1111/j.1552-3934.2010.00032.x.
When investigating this article, I was a little surprised that the number of views in Scopus was so low, since the first article I used actually cites this one and is newer, so I expected the older, cited article to have been a bit more popular. According to Scopus, it has been cited 6 times and viewed 29 times. The PlumX statistics truly showed how little use the article has gotten: the abstract has been viewed on EBSCO 507 times but the full article only 144 times, an even worse ratio than the other article's. I was very interested to see that the PlumX metrics had a social media tracking feature too. Altmetrics also showed that the article has been tweeted, and I was excited to see this because Altmetrics had no data on several other articles, which was very frustrating. I did enjoy seeing where in the world people are interacting with the article and what kinds of readers, from students to scholars, are looking at it. However, I think that Scopus, with the PlumX metrics, does a more thorough job of reporting the statistics and is more reliable, so I would likely use Scopus more.
[Scopus screenshot]
[Altmetrics screenshot]
Overall, this has reaffirmed my belief that there is no one way to track metrics or to measure impact with them. Although the first article was more popular, one could argue that, because the first cites the second, the second article contributed to the impact of the first even though people didn't read the second article itself. In evaluating Scopus, I greatly appreciated how comprehensive it all was. With Altmetrics, I was largely frustrated by the lack of information about the articles. This is likely because it is based largely on social media attention, so if an article isn't discussed outside the traditional channels of scholarly discussion, there is nothing for Altmetrics to report. If I were to use these two services, I would decide which one to use based on the type of information I need from it.
I have neither given nor received aid while working on this assignment. I completed the graded portion before looking at anyone else's work. - Rebecca