PNRR Young Researchers Project:

Rumours on Networks (RON)

In Autumn 2023, I was awarded a three-year research fellowship at Ca' Foscari University of Venice, funded by the Italian Ministry of University and Research (MUR) within the NextGenerationEU funding scheme.

The project "Rumours on Networks" is currently under way and it aims to increase our understanding of the factors that allow rumours and misinformation to spread on (online) social networks. This work is built on earlier work of mine, funded through a Horizon 2020 MSCA fellowship, and is part of a long-term research subject, whose aim it is to find efficient ways to deal with the spread of misinformation without impacting freedom of expression. 

Here is a colloquial, non-technical overview of the papers that I am (or have been) working on with various co-authors as part of the project. If you are interested in the actual papers, please follow the links provided below:


This paper actually started long before the RON project, but was finally completed as part of it. In it, we ask whether more verification of information is always a good thing for a society. We deliberately look at a context where verification is not only possible, but perfect: if incorrect information (which we summarise as "rumours") is verified, the truth is revealed. We have in mind questions such as whether climate change is a real threat, whether there is a link between HIV and AIDS, or whether the MMR vaccine for children causes autism, i.e., cases where the scientific consensus is clear (yes, yes, and no) and this information is accessible. Moreover, we focus on how many people end up correctly informed, as ignorance of the truth (for instance of the link between HIV and AIDS) may be just as harmful as believing something false.

To answer the question whether more verification is always better in such cases, we assume two things: First, people ignore messages that contradict their worldview unless these messages are verified (for example, if you generally believe that climate science is conducted correctly, you tend to ignore messages claiming that climate change is a hoax). Second, how much verification happens in a society can be influenced by policy makers, for example through information literacy programmes in schools, guidelines on how to spot misinformation, and so on.

We show that these two conditions lead to multiple outcomes: 

Our results do not claim that it is always good for a society to allow incorrect information to circulate. More than anything, they highlight the complexity of the relationship between truthful and incorrect messages, and how important it is for policy makers to understand as much as possible about the variables surrounding the diffusion of any particular rumour, as even a policy as seemingly straightforward as increasing verification may backfire.
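To make the two assumptions concrete, here is a minimal toy simulation in Python. This is my own illustrative sketch, not the model from the paper: the network, the parameter values, and the diffusion rule are all invented, and the toy only encodes the two assumptions above; the paper's richer results, including the cases where more verification backfires, do not come out of it.

```python
import random

def share_correctly_informed(v, n=5000, k=6, n_seeds=50, seed=1):
    """Toy diffusion of a false claim (the 'rumour') among initially uninformed agents.
    `v` is the probability that a receiver verifies a message (the policy parameter)."""
    rng = random.Random(seed)
    nbrs = [rng.sample(range(n), k) for _ in range(n)]   # crude random network
    # worldview: True = predisposed towards the correct answer, False = towards the rumour
    worldview = [rng.random() < 0.5 for _ in range(n)]
    belief = [None] * n                                  # None = uninformed
    front = rng.sample(range(n), n_seeds)                # agents who start the rumour
    for i in front:
        belief[i] = False
    while front:
        nxt = []
        for i in front:
            for j in nbrs[i]:
                if belief[j] is not None:
                    continue                             # already holds a belief
                if rng.random() < v:
                    belief[j] = True                     # verification reveals the truth
                elif not worldview[j]:
                    belief[j] = False                    # rumour confirms j's worldview:
                    nxt.append(j)                        # accepted unverified and passed on
                # contradicting, unverified messages are simply ignored (assumption 1)
        front = nxt
    return sum(b is True for b in belief) / n            # share correctly informed

for v in (0.0, 0.25, 0.5, 0.75, 1.0):                    # assumption 2: v is set by policy
    print(f"verification rate {v:.2f}: correctly informed {share_correctly_informed(v):.3f}")
```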



Homophily, the tendency to connect to others who are similar to ourselves, is a prevalent aspect of human behaviour. In online communication it can, however, lead to "echo chambers", whereby people receive similar messages over and over again and become convinced that the view of their community is also the prevalent one in society overall, or that something is correct simply because many people have told them about it, even if it is not, because they essentially hear the same message repeatedly, just from different members of their group. In this paper, my co-author and I do not focus on information once groups have formed, but on the forces that might lead people to form such groups in the first place.

Here, we draw on identity theory and show that people have incentives to choose the same identity as their neighbours, which in turn leads them to coordinate their actions. We derive conditions under which multiple identities coexist in society, which could be different social classes (or groups with different worldviews on certain topics), just as we observe in reality. Our results back up arguments that mixing across social classes can be an important driver of social mobility: when people have friends with different identities, it becomes easier for them to choose the identity that best matches their own characteristics, as doing so does not automatically mean being different from their friends.
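As a purely illustrative companion, the following Python toy (my own sketch, not the model in the paper, with made-up payoffs and parameters) captures the basic trade-off: agents choose an identity that balances matching their own type against matching their friends, and more cross-community friendships make it easier to pick the identity that fits one's own type.

```python
import random

def share_identity_matches_type(cross_links, n=400, deg=8, alpha=1.0, beta=0.4,
                                mismatch=0.2, rounds=30, seed=0):
    """Agents pick identity 0 or 1 to maximise
       alpha * (identity == own type) + beta * (# neighbours with that identity).
    `cross_links` is the probability that a friendship crosses community lines."""
    rng = random.Random(seed)
    half = n // 2
    community = [0] * half + [1] * half
    # most agents' type matches their community; a minority is 'mismatched'
    type_ = [c if rng.random() > mismatch else 1 - c for c in community]
    nbrs = [[] for _ in range(n)]
    for i in range(n):
        for _ in range(deg):
            other_half = rng.random() < cross_links
            if (community[i] == 0) == other_half:        # pick a friend from the second half
                j = half + rng.randrange(half)
            else:                                        # pick a friend from the first half
                j = rng.randrange(half)
            if j != i:
                nbrs[i].append(j)
                nbrs[j].append(i)
    identity = community[:]                              # start from the community identity
    for _ in range(rounds):                              # best-response dynamics
        for i in rng.sample(range(n), n):
            pay = [alpha * (k == type_[i]) + beta * sum(identity[j] == k for j in nbrs[i])
                   for k in (0, 1)]
            identity[i] = 0 if pay[0] >= pay[1] else 1
    return sum(identity[i] == type_[i] for i in range(n)) / n

for cross in (0.05, 0.5):                                # segregated vs mixed friendships
    print(f"share of cross-community friendships {cross:.2f}: identity matches own type "
          f"for {share_identity_matches_type(cross):.0%} of agents")
```

In the segregated network, mismatched agents conform to their community's identity; in the mixed network, choosing the identity that fits their own type no longer sets them apart from most of their friends, so more of them do so.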



Echo chambers have often been cited as increasing the spread of rumours. Theoretically, if people are able to verify the information they receive, they tend to verify information that confirms their worldview less than messages that contradict it. This means that increases in homophily should lower verification in society, as more homophily means being more likely to meet others who share your worldview and who are therefore more likely to send you messages in line with that view. However, if people take this into account, they should start verifying information more as homophily increases. In this paper, we aim to find out in a lab experiment whether people really do verify confirming and contradicting messages differently, and if they do, how these verification rates change with homophily. The paper is currently a work in progress, as we are designing an experiment to look into these questions.
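A quick back-of-the-envelope illustration of the first step of this argument, with verification rates and homophily levels I made up purely for illustration:

```python
# Assumed (made-up) verification rates: confirming messages are verified less often.
v_conf, v_contra = 0.2, 0.6
# Homophily h = chance that a message comes from someone who shares (and hence
# confirms) your worldview. With fixed verification rates, the average amount of
# verification in society falls as h rises.
for h in (0.5, 0.7, 0.9):
    avg_verification = h * v_conf + (1 - h) * v_contra
    print(f"homophily {h:.1f}: average verification rate {avg_verification:.2f}")
# The open question for the experiment is whether people adjust v_conf and v_contra
# upwards when they know their environment is more homophilous.
```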



In nature, different viruses have different infection characteristics as well as different recovery rates. For example, it is very easy to catch an airborne virus like the common cold and very easy to recover from it; other viruses require much closer contact to be transmitted, but it may also take longer (or be impossible) to recover from them, as with HIV. Many cultural phenomena spread in a similar way to viruses, yet there is little discussion of how the characteristics of a meme might be adjusted to maximise its spread in a population.
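To fix ideas, here is a textbook discrete-time SIR sketch (standard epidemiology, not a model from the project), comparing a 'cold-like' contagion that is easy to catch and quick to recover from with an 'HIV-like' one that is hard to transmit but very slow to recover from; all parameter values are invented for illustration.

```python
def sir_final_size(beta, gamma, s0=0.999, i0=0.001, steps=5000, dt=0.1):
    """Share of the population ever infected in a simple SIR model:
    beta = transmission rate, gamma = recovery rate."""
    s, i, r = s0, i0, 0.0
    for _ in range(steps):
        new_infections = beta * s * i * dt
        new_recoveries = gamma * i * dt
        s, i, r = s - new_infections, i + new_infections - new_recoveries, r + new_recoveries
    return r

print("cold-like (beta=1.5, gamma=1.0):", round(sir_final_size(1.5, 1.0), 2))
print("HIV-like  (beta=0.3, gamma=0.1):", round(sir_final_size(0.3, 0.1), 2))
# Both combinations have beta/gamma > 1 and so can spread widely; which one reaches
# more of the population, and how fast, is exactly the kind of trade-off a 'meme'
# would face if its characteristics were tuned to maximise spread.
```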



Funding Information:

European Union Next-GenerationEU - National Recovery and Resilience Plan (NRRP) – MISSION 4 COMPONENT 2, INVESTMENT N.1.2 – CUP N.H73C22001340001.