We are pleased to announce two keynote speakers:
Jahna Otterbacher
Open University of Cyprus, Nicosia, Cyprus
Research Centre on Interactive Media, Smart Systems, and Emerging Technologies
http://www.jahna-otterbacher.net/
Short Bio
Jahna Otterbacher received her doctorate from the University of Michigan (Ann Arbor, USA), where she was a member of the Computational Linguistics and Information Retrieval (CLAIR) research group. She is currently Assistant Professor at the Open University of Cyprus (OUC), Faculty of Pure and Applied Sciences, where she is the academic coordinator of the MSc in Social Information Systems.
Jahna also coordinates the Cyprus Center for Algorithmic Transparency (CyCAT) at the OUC, a new initiative funded by the H2020 Widespread Twinning program. CyCAT seeks to promote transparency and accountability in algorithmic systems that people routinely use but that remain largely opaque to them (e.g., search engines), through three types of interventions: data-, developer- and user-focused. In addition to her post at the OUC, Jahna holds a concurrent appointment as team leader of the Transparency in Algorithms Group at the Research Centre on Interactive Media, Smart Systems and Emerging Technologies, a new center of excellence and innovation in Nicosia, Cyprus, established in collaboration with two international Advanced Partners, UCL (UK) and MPI (Germany).
Jahna’s research has been published in journals such as the ACM Transactions on Internet Technology (ACM TOIT) and Knowledge and Information Systems, as well as in top-tier international conferences such as the ACM Conference on Human Factors in Computing Systems and the AAAI Conference on Human Computation and Crowdsourcing (HCOMP). Jahna has also served as a contributor to the Harvard Business Review, where she writes about the social implications of Big Data practices and analytics.
Keynote Title
Revealing Social Bias in Human-Machine Information Systems
The biases surrounding analytics, algorithmic processes and intelligent systems, which play an ever-growing role in our society, are being discussed extensively in the press, by researchers and by policy makers. There is concern not only about their accuracy and potential to bring about concrete harms, but also about their tendency to perpetuate social stereotypes and inequalities. In this talk, I first provide a brief overview of governmental and industry initiatives to promote algorithmic transparency and accountability. Next, I focus on our team’s approach, in which we audit parts of the typical machine learning pipeline to reveal social biases. In the first example, we examine the production of racial and gender biases in a common human intelligence task, image labelling, which is often used to produce training data for machine vision algorithms. In a second example, we audit the output of the Microsoft Bing search engine to examine how gender stereotypes are perpetuated when images are retrieved on character trait queries (e.g., “intelligent” or “emotional person”). In conclusion, I shall argue that in Human-Machine Information Systems, which use large-scale data reflecting human intelligence, the reproduction of social biases is likely inevitable, but there are ways to effectively raise users’ awareness of these biases.
Short Bio
Kristy Milland is community manager of TurkerNation.com. She has been a crowd worker for more than a decade, as well as a Requester and a researcher. She will attend law school at the University of Toronto this fall, and is keenly interested in the treatment of crowd workers, both legislatively and directly by users of the platforms and by the platforms themselves. She travels the world speaking about the experiences of workers, advising government bodies on rules that promote ethical treatment, non-profit organizations and unions on how to help, and academics on how best to work with this population.
Keynote Title
Bias from the Worker Perspective
Workers understand that Requesters need good quality data, or else they will leave the platform. That means most workers are as interested in reducing bias as Requesters are. Workers are therefore one of the best sources of information on how to reduce bias and how to ensure your data quality is as high as possible.