Date: Wednesday, June 12, 2019
Time: 7:00 PM - 11:00 PM
Location: The Federal Bar, The Parlour Room (102 Pine Ave, Long Beach, CA 90802, USA)
Date: Tuesday, June 11, 2019
Time: 2:00 PM - 5:30 PM
Location: Long Beach Convention Center, Room 202
2:00 - 2:10 Introduction
2:10 - 2:30 Designing for Trans and Gender Diverse Inclusion - Sera Fernando
2:30 - 2:45 Improving fairness in machine learning systems: What do industry practitioners need? - Miro Dudik
2:45 - 3:30 Panel on FATE and how it affects the Queer Community
3:30 - 4:00 Break + Chat with sponsors
4:00 - 4:25 Designing Human-Centered AI Products - Nida Zada
4:25 - 4:50 The Risk of Racial Bias in Hate Speech Detection - Maarten Sap
4:50 - 5:15 Standing Out: How A Queer First Generation College Student Went from Intern at Goldman Sachs to Chief of Staff to the CEO at Sentieo - Danielle Landrein
5:15 - 5:30 Conclusion
Sera Fernando
Microsoft
Miro Dudik
Nida Zada
Maarten Sap
University of Washington
Danielle Landrein
Sentieo
Yoshua Bengio
MILA
Margaret Mitchell
Luke Stark
Microsoft
Ria Kalluri
Stanford University
The rapidly advancing field of machine learning is exciting, but it raises complex ethical and social questions. How can we apply AI across varied applications while avoiding discrimination against, and insensitivity toward, its users? Queer users of machine learning systems, in particular, can fall victim to discriminatory, biased, and insensitive algorithms. We want to raise awareness of these issues among the research community. But to do so, we need to make sure that the queer community is comfortable among their peers, both in the lab and at conferences.
Our data shows that ⅔ of the queer attendees at NeurIPS are not publicly out. The queer attendees rated their comfort level at the conference as a 3.3 on a scale of 1-5, from feeling dangerously hostile to completely welcome. We want to improve these numbers and make queer researchers feel that they can bring their whole selves to these conferences. The queer research community's top two asks are to build the queer AI community and to increase the participation and visibility of queer people in machine learning. We aim to work with conference organizers and the queer community to move toward these goals.
We believe the first step toward creating more diverse and inclusive algorithms is to talk about these problems and increase the visibility of queer people in the machine learning community. By bringing together queer people and allies, we can start conversations about biases in data and how these algorithms can negatively impact the queer community.
Andrew McNamara
Raphael Gontijo Lopes
Natalia Bilenko
William Agnew