September 23rd and 24th, 2022 | London & Online

Workshop on Human Behavioral Aspects of (X)AI

Overview

Interest in explainable AI has grown significantly in recent years. Explanations provide transparency for what are often black-box procedures, which is viewed as critical for fostering trust in AI in real-world practice. Explainable Artificial Intelligence (XAI) research aims to improve users' understanding of the decision-making processes of AI systems, which in turn should build trust in those systems.

Research on XAI from the machine learning (ML) perspective focuses on developing methods for the automated production of explanations of AI decisions, without sufficient investigation into how these explanations affect the beliefs of their human recipients. On the other hand, research in psychology and cognitive science on explanations and trust has only just begun to explore the domain of AI explainability. Bringing these two communities together is crucial if we are to build explainable AI systems that are helpful to human recipients and that influence people's beliefs in the ways we intend.

The goal of the workshop is to bring these two communities together, to facilitate communication, and to raise awareness of how people process explanations. This would mark an important step towards building tools that better communicate AI prediction processes to human recipients.

Important dates

  • Abstract submission deadline: 11 September 2022, 23:59 (Anywhere on Earth)

  • Author notification: 16 September 2022

  • Workshop: 23 September 2022 (9:00 to 17:00 BST) and 24 September 2022 (9:00 to 13:00 BST)

Registration

To register for the workshop, please follow this link.

Location

In person:

Clore Management Centre (CLO B01)

Birkbeck, University of London

Torrington Square

London

WC1E 7JL

United Kingdom


Online:

To join online, please use the following Zoom link:

https://us06web.zoom.us/j/84723905152?pwd=MERWTm53NW83V3c3NHVUQUR3SWErdz09

Meeting ID: 847 2390 5152

Passcode: 965653

Contact

human.behavioral.xai@gmail.com