3. Working Conditions

Trust and Safety professionals see some of the very worst of humanity: child sexual abuse, animal torture, beheadings, rape, threats, hate speech, and more. With generative AI tools becoming vastly more accessible, it is possible that the volume of this content will accelerate.

T&S work is not easy. Professionals experience stress not only from exposure to traumatic content and bad behavior, but also from the pressure of standing between those harms and the people they work to protect. Navaroli argues that the personal costs of the work are exponentially higher for Black professionals and others who experience added trauma when content targets identities they share (race, gender, sexual orientation). They bear the costs of performing this “superhero” role while also being expected to meet high standards for pleasant behavior, as job postings in my research reveal.

While it is CEOs who testify before Congress, it is T&S professionals inside companies who carry out the complex and difficult work needed to address harmful online content and other digital harms.

My research interviews indicate that the ongoing waves of layoffs break team cohesion and leave professionals feeling precarious and undervalued.

Additionally, research by the Bureau of Investigative Journalism, covering Latin America, Asia, the US, and the UK, found that content moderators around the world experience harm as a result of their T&S work, most notably mental health problems and a lack of support. Some moderators pointed to workload pressures tied to reductions in internal staffing and the resulting increase in outsourcing.

On top of these internal job pressures, my research suggests that T&S professionals also feel pressure from outside their companies. Activists and civil society organizations, rightly, call on tech companies to fix the problems on their platforms.