The Research

Trust & Safety

Digital technologies are interwoven with our everyday lives. They offer convenience, connection, and access to a vast amount of information with relative ease. They can also accelerate, amplify, and extend the reach of so-called bad actors. Social media, smart home devices, and connected cars facilitate gender-based violence, promote mis- and disinformation, and spread online hate that carries over into offline violence. Online environments are vulnerable to cybercrime, fraud, and privacy violations.


For over a decade, while national and international laws and regulations slowly began to roll into place, a set of workers hired by technology companies to confront these harms has coalesced around the term Trust & Safety (T&S). The nature of that work is the focus of my research as part of the Human and Social Dimensions of Science and Technology PhD program.


Why This Area of Research

In 2021 I noticed two things. First, I saw what seemed to be an increase in the number of job postings on the website of the Trust and Safety Professional Association (TSPA), and of T&S jobs on other “tech for good” sites. I also noticed that a good number of my colleagues from civil society were being recruited into tech companies as subject matter experts in gender-based violence, hate crimes, human rights, and related areas. 


I took the opportunity offered by a Research Design graduate course to draft a proposal for a research project focusing on T&S. I later developed the project into my Applied Project for the MS Public Interest Technology program at Arizona State University. I continued the work in the intervening year (2022-2023) between completing the PIT program and starting the PhD Human and Social Dimensions of Science and Technology program, also at ASU.

Research Overview

I gathered over one hundred job postings and conducted six IRB-approved semi-structured interviews with T&S professionals between Fall 2021 and Spring 2022. From an analysis of those two data sources, and informed by stakeholders, I sought both to contribute toward defining what T&S work is and to articulate the points of alignment and divergence between how companies (as represented by the job postings) and professionals (in the interviews) conceive of T&S work. I identified six functions of the work, as well as intended beneficiaries, harms and actors countered, professional skills and mindsets, challenges, and trends. I presented these findings, along with a discussion and recommendations, in a poster session at the Trust & Safety Research Conference at Stanford University in July 2022, and through a publicly available report and this website released in February 2023, aimed primarily at an audience of T&S professionals.


Concurrently, I continued to gather job postings, eventually collecting just over 300 in total. I initiated a new round of interviews in Spring 2023, bringing the total to 15. While the interview script remained the same in the second phase, I narrowed my analysis in response to themes that emerged strongly in the interviews, in conversations with stakeholders, and at conferences in the field. I therefore focused on the affective qualities and expertise sought in job recruitment and on the nature of stress in the work environment. Specifically, I argue that while T&S professionals are recruited for their passion and care, and for their subject matter expertise in the negative sides of human behavior, they find themselves out of place within corporate cultures that do not elevate affect or more risk-conscious approaches. Further, this mismatch does more damage to professionals’ wellbeing than exposure to the often horrific content and behaviors that T&S works to counter.


I have presented preliminary findings from this second round of research at four conferences: the GETS (Governing Emerging Technology and Science) conference at the ASU Law School in May 2023, TrustCon (the conference of the TSPA) in July 2023, the European Conference on Domestic Violence in September 2023, and the meeting of the Society for Social Studies of Science in November 2023. I have now finalized a brief report based on my research, intended for an audience of policymakers and civil society organizations considering governance approaches to mitigate the harms that accompany social media and other digital and connected everyday technologies.


Research Methodology

My research is shaped and informed by feminist methodological commitments including care, consent, participatory meaning making, and relationship. Care for individuals, communities, and societies impacted by digital technologies drives this research and my larger work. Care for my interviewees led me to design my research in four specific ways. First, I offered confidentiality protections for their identities and affiliations, extending to data collection, storage, and correspondence. Second, I structured the interview scripts to not request information that could violate common non-disclosure agreements (NDAs) with their current or former employers, instead asking for their views of T&S work or the field overall rather than specific cases or incidents. Third, I included quotes, even when de-identified, only with express permission. Finally, because of the small and tight-knit nature of the field, I let interviewees decide whether they wanted to acknowledge me or our work together in public settings, and shared their identities with no one else.


The first and second items are also based on the value of consent. For me, this goes beyond the kind of rote consent many of us give regularly when clicking “I accept” on terms of service, or signing a medical privacy notice form. In the sexual and reproductive health rights field, the acronym FRIES stands for consent that is Freely given, Revocable, Informed, Enthusiastic, and Specific. I could have simply asked the interviewees to sign a consent form (which I did, per IRB requirements). However, I included more detail and more options in the form than are generally required, and invited a discussion of consent before each interview began.


In line with both care and consent, I also structured my research throughout to encourage participatory meaning making. I consulted stakeholders (including T&S professionals) in the design of the research. I offered interviewees and stakeholders the opportunity to review and shape my preliminary findings in both phases. Of note, I expressly made this optional to allow for interviewees to opt out of the additional labor. They were warmly invited, but not obligated to continue to participate actively in the research.


Lastly, a word about relationship. I value my long-term connection to the people I have interviewed. The perspectives they shared with me were personal, meaningful, and impactful. I have followed up with news or opportunities relevant to them, shared updates on my work, and invited first-round interviewees to comment on the second-round findings. I am not in favor of research or practice approaches that “parachute in” through short-term engagements typified by hackathons. As noted above, I have deferred to interviewees to set the terms of continued professional connection, while following through to keep them informed about the ongoing research.


Further details on my methods of analysis are available on request. Briefly, I used grounded theory and mixed methods approaches to analyze the job postings and interview transcripts, developing and iterating codes during both rounds of research.


Research Limitations

Two important limitations should be noted. The first and most significant is that this research does not cover the portion of the T&S field primarily located in lower-income communities and countries. A large portion of T&S workers are not directly employed by big tech companies. The popular press and investigative journalists have reported that Business Partner Organizations, third-party vendors, and outsourcing organizations pay gig or ghost workers low wages, under demanding performance targets and with little support for wellbeing, to review (and now label) social media content that is often extremely upsetting. This research did not include job postings for that segment of the work. In my second round of interviews, I spoke with T&S professionals outside the US and Europe, and they described to a small degree the conditions in larger-scale content moderation operations. However, I do not feel that my data is robust enough in its current scope to adequately address this important area.


Second, my research is not an assessment of the technical tools, policies, or processes used in T&S work. I am aware of other researchers in academia and civil society focusing on these very important areas. Some of that work can be found in the Journal of Online Trust and Safety, the Trust and Safety Research Conference (both hosted by Stanford), and by various research consortia and collaboratives. 


Future Research

My set of over 300 job postings contains data that can be analyzed in far more and different ways than I have done. Because the postings were scraped from publicly available sites, I am happy to share the dataset with other interested researchers. Trends in hiring, benefits, recruitment strategies, job roles, and other organizational and human resources topics might be of particular interest.


In the vein of Public Interest Technology, Responsible Innovation, and the Governance of Emerging Technology, I believe it would be valuable to examine the genealogy of the term “Trust and Safety.” My research thus far suggests that this term was not generated with an intended connection to earlier corporate usages such as “consumer safety,” “product safety,” or “workplace safety.” Similarly, there is a small body of literature specific to digital or social platforms and trust, and a much larger body of work on trust in organizations and institutions. It is my sense that exploring these previous uses can shed light on the combined and specifically applied term “Trust and Safety,” especially as everyday technologies in the home, workplace, and public spaces converge with the connectivity and massive data processing of digital technologies.


Finally, the next years of development for the T&S field will be both interesting and important. This year, 2024, will see elections in a significant number of countries, all likely affected to some degree by social media content. Additionally, the widespread availability of generative AI images, text, and other content is already impacting T&S work, specifically content moderation. While my personal focus will shift more to my dissertation research in another area, I will remain connected to the T&S field and this topic for years to come.