HAT
Head-Worn Assistive Teleoperation of Mobile Manipulators
International Conference on Robotics and Automation (ICRA) 2023
Akhil Padmanabha*, Qin Wang*, Daphne Han, Jashkumar Diyora, Kriti Kacker, Hamza Khalid, Liang-Jung Chen, Carmel Majidi, Zackory Erickson
*These authors contributed equally and are ordered alphabetically.
This work was continued, and the second version of HAT was presented at the International Conference on Human-Robot Interaction (HRI) 2024. More information is available on the HAT2 website.
Project Video
Abstract
Mobile manipulators in the home can provide increased autonomy to individuals with severe motor impairments, who often cannot complete activities of daily living (ADLs) without the help of a caregiver. Teleoperation of an assistive mobile manipulator could enable an individual with motor impairments to independently perform self-care and household tasks, yet limited motor function can impede one's ability to interface with a robot. In this work, we present a unique inertial-based wearable assistive interface, embedded in a familiar head-worn garment, for individuals with severe motor impairments to teleoperate and perform physical tasks with a mobile manipulator. We evaluate this wearable interface with both able-bodied participants (N = 16) and participants with motor impairments (N = 2) performing ADLs and everyday household tasks. Our results show that the wearable interface enabled participants to complete physical tasks with low error rates, high perceived ease of use, and low workload measures. Overall, this inertial-based wearable serves as a new assistive interface option for control of mobile manipulators in the home.
Proposed Interface
The interface, shown below, consists of a hat integrated with an absolute-orientation inertial measurement unit (IMU), a microcontroller with built-in Bluetooth, and a small lithium polymer (LiPo) battery. A thin layer of Neoprene foam is added over the electronics for comfort. Build instructions, code, and installation procedures are detailed here.
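As a rough illustration of the wearable-to-robot link, the sketch below packs and unpacks a single orientation sample as it might be streamed over Bluetooth. The packet layout (a sync byte followed by three float32 Euler angles) is an assumption for illustration, not the interface's actual wire format.

```python
import struct

# Hypothetical 13-byte packet: 1 sync byte + yaw, pitch, roll as
# little-endian float32 values in degrees. Layout is assumed, not
# taken from the HAT firmware.
PACKET_FMT = "<Bfff"
SYNC = 0xA5  # assumed sync byte

def pack_orientation(yaw, pitch, roll):
    """Encode one IMU orientation sample as the wearable might transmit it."""
    return struct.pack(PACKET_FMT, SYNC, yaw, pitch, roll)

def unpack_orientation(packet):
    """Decode a packet on the robot side; returns (yaw, pitch, roll) in degrees."""
    sync, yaw, pitch, roll = struct.unpack(PACKET_FMT, packet)
    if sync != SYNC:
        raise ValueError("bad sync byte; packet misaligned")
    return yaw, pitch, roll
```

In practice the microcontroller would emit such packets at a fixed rate, and the robot-side process would validate the sync byte before mapping angles to commands.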
Robot Modes and Mode Switching
Speech recognition, applied to audio captured by a wireless microphone worn by the participant, is used to select among four robot modes: drive, arm, wrist, and gripper, as shown in the figure on the left. On the right is a video of a participant using speech recognition for mode switching while completing a blanket removal task during the human study.
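The mode-switching logic reduces to keyword spotting over the recognized transcript. The sketch below assumes a transcript string is already available from the speech recognizer; scanning for a mode keyword and falling back to the current mode is an illustrative design, not necessarily the exact implementation.

```python
# The four robot modes named in the paper.
MODES = {"drive", "arm", "wrist", "gripper"}

def parse_mode_command(transcript, current_mode):
    """Return the first mode keyword found in the recognized utterance,
    or keep the current mode if no keyword is present."""
    for word in transcript.lower().split():
        if word in MODES:
            return word
    return current_mode
```

For example, an utterance like "switch to gripper" would move the system into gripper mode, while unrelated speech leaves the active mode unchanged.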
Robot Mapping
Signals from the head-worn interface are communicated to the mobile manipulator via Bluetooth and mapped to velocity commands for the robot's actuators based on the mode the user is in. For example, the arm mode mapping is shown below.
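A common way to realize such a mapping is a proportional controller with a deadzone: small head tilts are ignored so the user can rest, and larger tilts scale linearly up to a velocity limit. The specific thresholds and limits below are illustrative assumptions, not the paper's tuned values.

```python
def head_to_velocity(angle_deg, deadzone_deg=10.0, max_angle_deg=30.0, max_vel=0.1):
    """Map a head tilt angle (degrees) to a signed actuator velocity.

    Tilts inside the deadzone produce zero velocity; beyond it, the
    command grows linearly and saturates at max_vel (units depend on
    the actuator, e.g. m/s for the arm's prismatic joints).
    All parameter values here are assumed for illustration.
    """
    if abs(angle_deg) < deadzone_deg:
        return 0.0
    sign = 1.0 if angle_deg > 0 else -1.0
    # Normalize the tilt beyond the deadzone to [0, 1], then scale.
    span = max_angle_deg - deadzone_deg
    magnitude = min((abs(angle_deg) - deadzone_deg) / span, 1.0)
    return sign * magnitude * max_vel
```

In arm mode, for instance, head pitch could drive the lift up and down while head roll extends and retracts the telescoping arm, each axis passed through a mapping like this one.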
Human Study
A human study was conducted with 16 able-bodied participants and 2 participants with motor impairments. Participants were asked to complete the four tasks shown below: Cup Retrieval, Trash Pickup, Blanket Removal, and Leg Cleaning. Participants were additionally asked to complete Task 1 (Cup) with a conventional web interface with modifiable speed control, used in conjunction with head-tracking software and a single-button mouse.
Videos
Video of participant with impairments, I1, completing blanket removal task
Sped-up video of healthy participant using head-worn interface for cup retrieval task
Sped-up video of healthy participant using head-worn interface for blanket removal task
Video of participant with impairments, I2, completing trash retrieval task
Sped-up video of healthy participant using head-worn interface for trash pickup task
Sped-up video of healthy participant using head-worn interface for leg cleaning task
Comparison with Web Interface
The head-worn interface serves as an alternative to a web interface for people who have difficulty accessing traditional computing systems. For Task 1 (Cup), we additionally had participants use a Stretch RE1 web interface as a baseline.
Sped-up video of healthy participant using head-worn interface for cup retrieval task
Sped-up video of healthy participant using web interface for cup retrieval task