Empowering Interactive Robots by Learning Through Multimodal Feedback Channel
ACM ICMI 2021
October 22, 2021


News

  • 13.09.2021: We are going virtual! Registration is already open. For more information, click here.

  • 13.09.2021: Our workshop date is set: 22 October 2021. You can find the details and schedule on the Program page.

  • 18.03.2021: We have three keynote speakers confirmed: Dr. Judith Holler, Dr. Georgia Chalvatzaki and Dr. Heni ben Amor.

Overview

Robots have the potential to assist humans both in the workplace and in everyday life. In contrast to classical robotic applications, assistive robots will face a variety of different tasks, making it essential for them to learn through direct interaction with users. While recent advances in Machine Learning have facilitated learning from non-expert users, it remains an open question how to optimally incorporate the inherent multimodality of human feedback into these algorithms. Moreover, the full potential of combining explicit and implicit human feedback channels has yet to be explored.

In this workshop, we will focus on how to best incorporate multimodal human feedback into existing learning approaches, and on which multimodal interactions are preferred and/or required for future robotic assistance. We provide a forum for interdisciplinary exchange between researchers from fields such as Human-Robot Interaction, Human-Computer Interaction, Affective Computing, Natural Language Understanding, and Machine Learning. Our aim is to stimulate a discussion on how multimodality can become key to the emerging field of interactive machine learning and lead to more intuitive and successful interactions with robots in the future.

Topics

Topics of interest include, but are not limited to:

  • Multimodal Human-Robot Interaction

  • Multimodal Human Behavior Modelling

  • Interactive Reinforcement Learning

  • Learning from Demonstration

  • Multimodal Instructions

  • Affective Computing for Robotics

  • Cognitive Models for Multimodal Interaction

  • …