The Un-Kidnappable Robot: Acoustic Localization of Sneaking People
Mengyu Yang^, Patrick Grady^, Samarth Brahmbhatt*,
Arun Balajee Vasudevan**, Charles C. Kemp***, James Hays^
^Georgia Institute of Technology *Intel Labs
**Carnegie Mellon University ***Hello Robot Inc.
Abstract
How easy is it to sneak up on a robot? We examine whether we can detect where people are using only the incidental sounds they produce as they move, even when they try to be quiet. We collect a robotic dataset of high-quality 4-channel audio paired with 360° RGB data of people moving in different indoor settings. We train models that predict whether a moving person is nearby and, if so, where they are. We implement our method on a robot in real time, demonstrating that robots can navigate populated indoor spaces in a passive manner.
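As a rough illustration of this two-stage setup, the sketch below chains a presence classifier and a localization head over features extracted from the 4-channel audio. The module names, network architectures, feature dimensions, and the choice of a 2D direction output are placeholders of ours, not the models from the paper.

```python
import torch
import torch.nn as nn


class PresenceClassifier(nn.Module):
    """Stage 1: is there a moving person nearby? (binary logit)."""
    def __init__(self, in_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, 1))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return self.net(feats).squeeze(-1)


class DirectionRegressor(nn.Module):
    """Stage 2: where is the person? Here: a 2D unit vector toward them."""
    def __init__(self, in_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, 2))

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        direction = self.net(feats)
        return direction / direction.norm(dim=-1, keepdim=True).clamp(min=1e-6)


def localize(audio_feats, clf, reg, threshold: float = 0.5):
    """Run stage 1; only query stage 2 when a moving person is detected."""
    prob = torch.sigmoid(clf(audio_feats))
    if prob.item() < threshold:
        return None  # no moving person detected nearby
    return reg(audio_feats)  # predicted direction toward the person


if __name__ == "__main__":
    # Placeholder input: e.g. flattened spectrogram features from the
    # 4 microphone channels (the real feature extraction is not shown here).
    feat_dim = 4 * 64 * 32
    feats = torch.randn(1, feat_dim)
    print(localize(feats, PresenceClassifier(feat_dim), DirectionRegressor(feat_dim)))
```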
The Robot Kidnapper Dataset
We collect data across 4 movement categories: standing still, quiet walking, normal walking, and loud walking. Each category is recorded under 2 robot settings: static and dynamic. We collect data across 8 rooms and 12 participants. Below are examples from a few categories. Headphones recommended! 🎧
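For reference, the recording conditions above form a small grid; the sketch below enumerates them using illustrative names only, not the dataset's actual labels or file layout.

```python
from itertools import product

# Conditions described above; rooms and participants are simply indexed.
MOVEMENT_CATEGORIES = ["standing_still", "quiet_walking", "normal_walking", "loud_walking"]
ROBOT_SETTINGS = ["static_robot", "dynamic_robot"]
NUM_ROOMS = 8
NUM_PARTICIPANTS = 12

# Every (movement category, robot setting) condition; each recording pairs
# 4-channel audio with 360° RGB for some room/participant combination.
conditions = list(product(MOVEMENT_CATEGORIES, ROBOT_SETTINGS))
print(f"{len(conditions)} conditions across {NUM_ROOMS} rooms "
      f"and {NUM_PARTICIPANTS} participants")
```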