Acoustically Aware Robots

Detecting and evaluating the sounds robots make and hear.

Abstract

The sound a robot or automated system makes, and the sounds it listens for in our shared acoustic environment, can greatly expand its contextual understanding and shape its behaviors to suit the interactions it is trying to perform.

People convey significant information with sound in interpersonal communication in social contexts. Paralinguistic cues such as where we are, how loud we are speaking, or whether we sound happy, sad, or upset are relevant for a robot that aims to adapt its interactions to be socially appropriate.
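As a minimal sketch of one such paralinguistic cue, the loudness of incoming speech can be estimated from the RMS level of an audio frame. The example below assumes mono audio as a NumPy float array normalized to [-1.0, 1.0]; the function name and sample rate are illustrative, not from any specific tutorial tool.

```python
import numpy as np

def rms_dbfs(samples: np.ndarray) -> float:
    """Return the RMS level of a mono audio frame in dBFS.

    `samples` is assumed to be a float array scaled to [-1.0, 1.0].
    """
    rms = np.sqrt(np.mean(np.square(samples)))
    # Guard against pure silence: log10(0) is undefined.
    if rms == 0:
        return -np.inf
    return 20 * np.log10(rms)

# A full-scale sine wave has an RMS of 1/sqrt(2), i.e. about -3 dBFS.
t = np.linspace(0, 1, 16000, endpoint=False)  # 1 s at an assumed 16 kHz
tone = np.sin(2 * np.pi * 440 * t)
level = rms_dbfs(tone)
```

A robot could threshold such frame-level estimates, for example to judge whether a speaker is shouting or whispering relative to the ambient noise floor.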

Similarly, the qualities of the sounds an object makes shape how people perceive it: they determine whether it attracts attention, interrupts other interactions, or reinforces or contradicts an emotional expression. These qualities should therefore be aligned with the designer's intention for the object.

In this tutorial, we will introduce participants to software and design methods that help robots recognize and generate sound for human-robot interaction (HRI). Using open-source tools and methods that designers can apply to their own robots, we seek to increase the use of sound in robot design and stimulate HRI research on robot sound.

The material will be available soon.

In the meantime, you can join our Discord and check the Tutorial page for the material that is already available.