**Deadline for workshop application: ~~August 14th, 2025~~ August 28th, 2025**
During this critical time, we see an opportunity to gather as one community and to encourage honest conversation about the "hows" and "whys" of sociotechnical safety research. We invite researchers in both fields to discuss how CSCW's methods, norms, and theories might bridge this emergent community, e.g., building meaningful collaborations with participants and ensuring researcher and participant safety. To cultivate reflexivity and reflection (R&R), we will host a closed-door panel in which experienced researchers share lessons from their work, before collaboratively developing artifacts that outline actions researchers can take to address these challenges. By fostering a collective learning environment at CSCW, we will help researchers across disciplines conduct responsible sociotechnical safety research by prioritising reflexivity.
is a PhD candidate at Royal Holloway University of London in the Center for Doctoral Training in Cyber Security for the Everyday. She utilizes ethnographically informed methods to consider digital and ontological security practices of populations impacted by conflict.
is a postdoctoral fellow at Georgetown University within the Initiative for Tech & Society. She uses qualitative research methods and cryptography to develop usable privacy tools. Lucy completed her PhD in computer science at Brown University in 2023.
is a postdoctoral researcher at Microsoft Research. She is broadly interested in how technology mediates harm, how to intervene, and what it means to do so. Emily earned her PhD from Cornell Information Science in 2024, where her primary project examined digital technologies' role in intimate partner violence.
is a Ph.D. candidate in Computer Science & Engineering at the University of Washington. Her work investigates how gender, interpersonal relationships, and other social dynamics shape people's security, privacy, and digital safety. She is particularly interested in how sociotechnical harms surface through emerging technologies and on social media.
is a social scientist and a Professor in the Information Security Group at Royal Holloway University of London. Her work is distinctly ethnographic in nature and explores information security needs, perspectives, and practices among groups often living and working at what we might call the margins of societies.
is an assistant professor at George Mason University in Information Sciences and Technology. She studies the privacy practices of vulnerable populations and the impact of complex surveillance ecosystems and data relations on our shifting norms around privacy. She is interested in how people threat model in the context of reproductive health.
is the Clare Boothe Luce Assistant Professor of Computer Science at Georgetown University and a Faculty Associate at the Berkman Klein Center for Internet & Society at Harvard University. She uses computational, economic, and social science methods to understand users' security, privacy, and online safety-related decision-making processes, with a particular focus on safety in intimate interactions.
is a research scientist on Sony AI's AI Ethics team and a visiting scholar in Information Science at University of Colorado Boulder. Morgan broadly focuses on mitigating technical harms, particularly in the context of AI development and deployment. Much of his work has examined how computer vision systems construct identity, and how that construction disempowers marginalized groups.
is an Assistant Professor in Design and Global Development at Northumbria University, where she also co-leads the 'Design Feminisms' and 'MARGINALITIES' research groups. She has conducted research with refugees, activists, and humanitarians, with a focus on the interplay between sociotechnical systems and mechanisms of oppression and resistance.