Enhancing user experience through improved auditory feedback: mixing robot sounds with musical material to help robots express joy and frustration.
Figure 1: the NAO robot.
Figure 2: Project workflow.
Figure 3: Impact of musical augmentation on Joyful ratings; adding musical elements significantly enhanced perceived joy.
Sound 1: Joyful robot gesture without augmentation.
Sound 2: Joyful robot gesture with musical augmentation.
Keywords
Sound Design, User Research, Human-Centered Intelligent/AI Systems.
Technologies
R (data preparation and visualization), SPSS (statistical analysis), Max/MSP (sound synthesis).
Background
Despite advances in robotics, many robots still produce noticeable mechanical sounds. These sounds can influence human perception of robot behavior. This project aimed to enhance human-robot interaction with a NAO robot by combining mechanical sounds (engine sounds, motor sounds, glitches) with synthesized musical elements.
Aim
The goal of this project was to combine mechanical robot sounds with musical sounds to more effectively convey specific emotions, such as frustration, sadness, joy, and relaxation.
Approach
This project employed a mixed-methods approach, combining quantitative and qualitative analysis techniques. I led the project from start to finish: designing the experiments, synthesizing the sounds, collecting and cleaning data from 31 participants, and conducting the statistical and sentiment analyses.
Data collection:
Experiment 1: Participants provided free-form text descriptions of robot sounds produced by expressive robot movements.
Experiment 2: Participants rated unprocessed mechanical robot sounds versus processed (i.e., musically augmented) robot sounds on a set of emotional scales.
Data analysis:
Text mining and sentiment analysis: Extraction of key themes and emotional sentiment from text descriptions.
Statistical analysis: Two-way repeated-measures ANOVAs to compare ratings of the different sound designs.
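To illustrate the sentiment-analysis step, here is a minimal lexicon-based scoring sketch. The actual analysis used dedicated text-mining tools in R; the mini-lexicon, example descriptions, and function names below are purely hypothetical.

```python
# Minimal sketch of lexicon-based sentiment scoring for free-form
# sound descriptions. The lexicon and example texts are illustrative,
# not the ones used in the study.
import re
from collections import Counter

# Hypothetical mini-lexicon mapping words to valence scores.
LEXICON = {
    "happy": 1.0, "joyful": 1.0, "playful": 0.5, "pleasant": 0.5,
    "annoying": -1.0, "frustrated": -1.0, "harsh": -0.5, "grating": -0.5,
}

def sentiment_score(text):
    """Mean valence of lexicon words found in the text (0.0 if none)."""
    tokens = re.findall(r"[a-z]+", text.lower())
    scores = [LEXICON[t] for t in tokens if t in LEXICON]
    return sum(scores) / len(scores) if scores else 0.0

def top_terms(texts, n=3):
    """Most frequent lexicon terms across all descriptions (key themes)."""
    tokens = [t for text in texts
              for t in re.findall(r"[a-z]+", text.lower())
              if t in LEXICON]
    return Counter(tokens).most_common(n)

descriptions = [
    "The motor noise sounds harsh and frustrated.",
    "A playful, almost happy whirring.",
]
print([round(sentiment_score(d), 2) for d in descriptions])  # [-0.75, 0.75]
```

In the study itself, scores like these were aggregated per sound and compared against the robot's intended emotional expression, which is what exposed the mismatch described in the Findings below.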
Findings
Experiment 1 revealed a mismatch between the intended emotional expression of the robot and the emotions conveyed by its mechanical sounds. In particular, the robot was perceived as frustrated when performing a joyful gesture (see Sound 1 above). This highlights the importance of designing auditory cues that align with robotic gestures. To address this issue, Experiment 2 explored the impact of combining mechanical sounds with synthesized musical material. Augmenting the mechanical sounds with appropriate musical cues significantly increased joyful ratings (see Figure 3).
These findings have important implications for the design of social robots. By carefully designing musical sounds that blend well with a robot's existing mechanical sounds, we can build robot agents that communicate more effectively.
This work was published in the International Journal of Social Robotics (Springer, five-year impact factor 4.8, ranking among the top 20 Social Robotics journals) and has already been cited 31 times.