Initially I was going to compare my findings for Dr. Francesca Bolla Tripodi with another SILS faculty member, but after researching Dr. Tripodi I decided to use Dr. Gary Marchionini instead. As it turns out, none of the tools (Scopus, Web of Science, or Google Scholar) was particularly adept at locating Dr. Tripodi’s work. If anything, this exercise shows the limitations of these tools. All three tools confused Dr. Francesca Bolla Tripodi with another Dr. Francesca Tripodi who researches medicine in Italy. In fact, there may be two separate Drs. Francesca Tripodi researching medicine in Italy; it was hard to tell. This confusion makes determining citation information difficult.

Scopus has three entries for our Dr. Tripodi, at Carolina, James Madison, and the University of Virginia. I was only able to determine that all three were her because I happened to know she worked at James Madison, lived in Charlottesville, and has the middle name Bolla. I also know that she wrote some of the articles attributed to her in each record. The University of Virginia record was the most robust, with six articles attached, so I focused on it. Dr. Marchionini also had two records in Scopus, but both were affiliated with Carolina, so I am not entirely sure why there are two. It does not appear that any articles overlap between the two records. I will focus on the top record since it includes the most documents and the higher h-index.
I could not find the Analyze Search Results option, but I did find Analyze Author Output, which provides a nice visualization of an author’s work. Unsurprisingly, Dr. Marchionini has far more citations and a higher h-index than Dr. Tripodi. He has, after all, published over 100 papers in the last thirty-some years.
Analyze Author Output - Scopus - Marchionini
Analyze Author Output - Scopus - Tripodi
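Since the h-index figures prominently in this comparison, here is a minimal sketch of how it is typically computed: an author’s h-index is the largest number h such that they have h papers with at least h citations each. The citation counts below are invented purely for illustration, not either author’s real numbers.

```python
def h_index(citation_counts):
    """Largest h such that the author has h papers with at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts, just to show the calculation
print(h_index([2455, 310, 95, 40, 12, 3]))  # -> 5
```

This also helps explain why a long publishing career matters so much: more papers means more chances to pass each citation threshold.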
Web of Science combined all of the published authors named Francesca Tripodi into one “algorithmically generated author record.” This includes the other Drs. Francesca Tripodi that I mentioned who are based in Italy. Web of Science supposes that “Tripodi, Francesca Bolla; Tripodi, Francesca; Tripodi, F; Tripodi, F.” are all the same person when there are at least two, and maybe three, of them. None of the citation information was correct, since it included articles from the other Drs. Tripodi, and none of the listed articles that I could attribute to our Dr. Tripodi had any individual citation information.
Web of Science also created an “algorithmically generated author record” for the name Gary Marchionini. He appears to be the only Gary Marchionini currently publishing. I was able to generate a citation report, which produced a graph that looks quite similar to the one created by Scopus. His most cited article is “Exploratory search: From finding to understanding,” published in 2006, which has been cited 2,455 times. For reasons I couldn’t figure out, it only let me download the publications graph and the citations graph separately.
Additionally, I was able to analyze the results using the Analyze button. It generated this tree graph, which indicates the fields Dr. Marchionini publishes in. Unsurprisingly, most of his articles are related to information and library science and computer and information science.
Tree Graph - Web of Science - Marchionini
While Google Scholar still confused the two Drs. Tripodi, the first six articles listed were authored by our Dr. Tripodi. None of the Drs. Tripodi has a user profile, so there was no profile-level data to draw on; I used individual article citation data instead.
For her listed articles, Google Scholar also includes more types of metrics, including CrossRef and Mendeley counts. Looking at one of her articles that appeared in both Scopus and Web of Science, there was still no citation data, but Google Scholar did confirm that 16 accounts had tweeted about it. It also seems that Google Scholar has some access to Altmetrics data without users having to install the browser plug-in.
Dr. Gary Marchionini has a user profile in Google Scholar. According to Google Scholar, his most cited paper is “Information seeking in electronic environments” from 1997, with 3,099 citations. The “Exploratory search: From finding to understanding” article is in second place with 1,912 citations, so Web of Science and Google Scholar are not in agreement. Two of his articles are marked “public access,” but the profile does not explain what that means. It clearly indicates the articles are open in some capacity, but “public access” is an odd way to phrase it.
I cannot say why Dr. Tripodi is not better represented in these tools. While she has not been researching as long as Dr. Marchionini, I would have expected citation data along the same lines as his, just with fewer and more recent entries. Is this an example of a gender citation gap? Is she still too new in her career? Or is it both?
The first article I chose was “Disinformation as political communication,” published in 2020 in a political science journal (Political Communication) by Deen Freelon and Chris Wells (henceforth the Freelon article). The second is “The social, political, economic, and cultural dimensions of search engines: an introduction,” published in 2007 in a communications journal by Eszter Hargittai (henceforth the Hargittai article). These two articles are more evenly matched than the two authors above.
The Freelon article has been cited 40 times according to Scopus and has a field-weighted citation impact of 20.26, which seems quite high given that anything over 1.00 means an article has been cited more than expected for its field. The Hargittai article has a field-weighted citation impact of 2.13, so both have been cited more than expected; the Hargittai article has been cited over 57 times. I suppose the higher citation impact for the Freelon article is partly because it is newer, and I wonder how those numbers change over time. I am also not that surprised to see a bump in citations in 2020. The larger cultural conversation about the role of search engines in the spread of misinformation has probably affected who is reading and citing the article.
Scopus - Freelon
Scopus - Hargittai
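As I understand it, the field-weighted citation impact is roughly the article’s citation count divided by the average citations received by comparable articles (same field, document type, and publication window), so 1.00 means “cited exactly as expected.” The expected values in the sketch below are made up to roughly back out the reported numbers; they are not figures from Scopus.

```python
def fwci(citations, expected_citations):
    """Field-weighted citation impact: actual citations divided by the average
    citations of comparable articles. 1.0 means cited exactly as expected."""
    return citations / expected_citations

# Made-up "expected" values, chosen only to illustrate how the ratios could arise
print(round(fwci(40, 1.97), 2))   # ~20.3, in the ballpark of the Freelon article's 20.26
print(round(fwci(57, 26.8), 2))   # ~2.13, in the ballpark of the Hargittai article's 2.13
```

Framed this way, the newer Freelon article looks so strong because articles of its age are normally expected to have very few citations yet.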
The PlumX Metrics view is very interesting, and the visualization is really nice. It counts 39 citations for the Freelon article and 57 citations for the Hargittai article. It also counts social media activity: tweets, Facebook posts, etc. The Freelon article has been tweeted about 186 times, while the Hargittai article has only been tweeted about 6 times. That makes sense given that the Hargittai article was published just a year after Twitter launched, when Twitter was still gaining traction. The Hargittai article has, however, been cited on Wikipedia, and that is included as a factor in the metric. Based on visuals alone, PlumX makes it seem like the Hargittai article is more successful/important/[insert other signifier]; it just seems to be talked about in more places. But it is hard to tell whether that is true or not.
PlumX - Scopus - Freelon
PlumX - Scopus - Hargittai
Altmetrics is very different. The main focus is the Attention Score rather than traditional citations. The swirl graphic, which draws the eye immediately, shows where the article has received attention and how much. The Hargittai article is a blue and grey swirl indicating Twitter and Wikipedia, while the Freelon article is a red and grey swirl indicating Twitter and news outlets. Altmetrics also includes a map indicating where in the world those tweets, Mendeley readers, and other attention data are located. Despite having a larger variety of attention, the Hargittai article has a much lower attention score than the Freelon article. The attention score seems very tied to Twitter and other social media, rather than to traditional citations or how many people are reading the article through Mendeley. It is worth noting that the Altmetrics tweet count is different from the one Scopus uses. Do the two services use different criteria, or do they just pull in new information at different times? The count does not appear to update in real time as people tweet about an article.
Altmetrics - Freelon
Altmetrics - Hargittai
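My understanding is that the Attention Score is essentially a weighted count of mentions, with news coverage weighted far more heavily than individual tweets. The weights and mention counts in this sketch are rough, hypothetical approximations rather than Altmetric’s exact values; they are only meant to illustrate why a Twitter-and-news-heavy article can outscore one with a broader but smaller mix of sources.

```python
# Rough, illustrative weights; the real scoring also adjusts for audience and
# deduplicates mentions, so this is only a sketch of the general idea.
WEIGHTS = {"news": 8, "blog": 5, "wikipedia": 3, "twitter": 1, "facebook": 0.25}

def attention_score(mentions):
    """Weighted sum of mentions by source type (hypothetical simplification)."""
    return sum(WEIGHTS.get(source, 1) * count for source, count in mentions.items())

# Invented mention counts, loosely echoing the two articles' profiles
print(attention_score({"twitter": 180, "news": 2}))     # Twitter- and news-heavy -> 196
print(attention_score({"twitter": 6, "wikipedia": 1}))  # broader but smaller mix -> 9
```

If something like this is what is happening under the hood, it would explain why the Hargittai article’s Wikipedia presence does little to close the gap created by the Freelon article’s much larger tweet count.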
Setting PlumX aside, I think both Scopus and Altmetrics consider the Freelon article more successful/important/[enter other signifier] because of its higher field-weighted citation impact and higher attention score. While the Hargittai article has the most citations so far, it has accrued those citations more slowly over time. But can we really quantify how much more successful/important/[enter other signifier] one article is than another when they aren't about the same topic? Looking back, I probably should have picked articles on the same topic to make a better comparison.
I have neither given nor received aid while working on this assignment. I have completed the graded portion BEFORE looking at anyone else's work on this assignment. - Claire