Goals & Themes

Workshop Goals

► Bring together a community of researchers and industry experts interested in supporting online community moderators

► Welcome and introduce new researchers to study online community moderation

► Formally establish a list of best practices based on principles and ethical standards for pursuing future moderation research

► Identify and map an agenda of topics and questions for future research in this area


[Photo: CSCW 2019 in Austin, TX]

Workshop Themes

Though the themes of this workshop will evolve as discussions progress, we provide some examples to help guide initial discussions: current practices and focuses of volunteer moderation research, recruitment and community access, design interventions and moderation technology, privacy and data collection, and research in marginalized or vulnerable communities. Participants should also feel free to use these themes as starting points or springboards for the questions and topics in their position papers.

Current practices and focuses of volunteer moderation research: In order to discuss what the future of research in this domain looks like, it is useful to begin with a discussion of the current state. Recently published work has focused on a handful of prominent platforms, notably including Facebook, Reddit, and Twitch. Less attention has been paid to other types of social spaces, from online games to discussion forums. Not only is the volume of papers on community moderation relatively low, but the methods used thus far have also been relatively homogeneous; a strong majority of this work has used interview methods, with a small number using other ethnographic methods, quantitative analyses, or experiments. This imbalance suggests several questions ripe for exploration: What types of questions in this space are best answered with each methodological approach? What are the challenges of using each method in this particular research space? Has the high volume of qualitative work in this space resulted from the appropriateness of these methods for answering early questions, or is it a result of the backgrounds of the researchers who have thus far been involved?

Recruitment and community access: Whether or not a social computing researcher is directly recruiting volunteer moderators, their research activities will affect those moderators in multiple ways. Whether requesting access to an online community to recruit participants or running scripts in the background to collect data, researchers' methods can take volunteer community moderators away from the unpaid work they do to manage their communities. Additionally, these potential informants may not know how to respond to recruitment messages or what to consider in deciding whether to grant researchers access, as they may not be familiar with the practices and ethics of human subjects research. In this workshop we will imagine strategies for future research in which moderators are informed and/or recruited in a way that is easy to understand yet contains the most important information volunteer community moderators need to approve community access. How might moderators operate as ambassadors for their community when determining consent for studying that community?

Design interventions and moderation tools: Moderators and community managers are often constrained in the actions they can take to better govern their communities by the strict technological affordances of the platforms they use. Without effective tools for scaling content moderation as communities grow, moderators may be limited in their ability to curb norm- and rule-violating behavior, risking their communities' collapse into anomie. How can we design sociotechnical systems and tools to better support moderation teams in the future? Are automated (e.g., machine learning) tools a solution to the challenges of scale in large communities? How do we design these new technologies with respect to fairness, accountability, transparency, and ethics in socio-algorithmic online governance? How should research involving sociotechnical interventions and the testing of new technologies be carried out in this space? What counts as a “successful” online intervention? How should we evaluate the effectiveness of online moderation tools and their long-term impact on the labor of community moderators?

Privacy and data collection: Following calls for better strategies of informed consent in recruitment messages, future social computing researchers must also pay careful attention to how they collect, store, and share data and identifiers obtained from volunteer community moderators. While potential informants may be comfortable with their usernames being published alongside interview quotes or their communities being identified in papers, we have a responsibility as social computing researchers to minimize the risk that may accompany exposure in a future publication. On the other hand, collecting trace data on online communities through web scraping tools may be technically acceptable within the policies of some online platforms that host these communities, yet still fall short of what community members would expect or consent to. How can we be more intentional about gaining consent from volunteer community moderators to collect data? What should volunteer community moderators know about how we record, store, and share their data and identifiers? How can we describe this to them in a way that is easy to understand?

Marginalized or vulnerable communities: More so than in other kinds of online communities, moderators of communities serving marginalized or vulnerable groups may feel a sense of responsibility to provide a safe space online and to protect their members from harassment, hate speech, and other harmful acts. In some cases, this may lead to increased skepticism about the intentions of researchers and the impact that research may have on their communities. What does this mean for studying moderation in marginalized and vulnerable communities? What experience and/or background should researchers focus on developing before beginning work in these spaces? What can we learn from studying the moderation of these communities that can’t be learned from studying other communities?