Call for Papers

Please submit a short position paper on previous or ongoing research on biases in crowd data.

Submissions can be up to 3 pages in length (excluding references) and should follow the ACM Manuscript format (available as a LaTeX template and a Word template).

The review of submissions will follow a juried process (see https://chi2022.acm.org/for-authors/note-on-selection-processes/). Submissions will be selected based on their relevance to the workshop themes, and the originality and novelty of the submitted ideas.

Manuscripts should be submitted in PDF format at the workshop submission website by 17 September 2021 (extended from 10 September 2021). At least one author of each accepted paper must attend the workshop, and all participants must register for both the workshop and at least one day of the conference.

For any questions, please contact workshop organizers at bcd21@easychair.org.


Workshop Themes

Through this workshop, we aim to foster discussion on ongoing work around biases in crowd data, provide a central platform to revisit the current research, and identify future research directions that are beneficial to both task requesters and crowd workers.

  • Understanding how annotator attributes contribute to biases

Research on crowd work has often focused on task accuracy, whereas other factors such as biases in data have received limited attention. We are interested in reviewing existing approaches and discussing ongoing work that helps us better understand annotator attributes that contribute to biases.

  • Quantifying bias in annotated data

An important step towards bias mitigation is detecting such biases and measuring their extent in data. We seek to discuss different methods, metrics, and challenges in quantifying biases, particularly in crowdsourced data. Further, we are interested in ways of comparing biases across different samples and in investigating whether particular biases are task-specific or task-independent.

  • Novel approaches to mitigate crowd bias

We plan to explore novel methods that aim to reduce biases in crowd annotation in particular. Current approaches include worker pre-selection, improved task presentation, and dynamic task assignment. We seek to discuss the shortcomings and limitations of existing and ongoing approaches and to ideate future directions.

  • Impact on crowd workers

We want to explore how bias identification and mitigation strategies can impact crowd workers themselves, positively or negatively. For example, workers in certain groups may face increased competition and reduced task availability. Similarly, collecting worker attributes and profiling workers could raise ethical concerns.