Advancing wearable devices and applications through novel design, sensing, actuation, and AI
A full-day workshop at the IEEE International Conference on Robotics and Automation (ICRA)
Date: Friday May 17th, 2024
Time: 09:00 - 17:00
Location: Room CC-503
Yokohama, Japan
Abstract
Wearable devices have great potential to transform fields ranging from healthcare to robotics. They can facilitate unprecedented datasets of human behavior, support on-demand personalized insights or assistance, and enrich interactions among humans, robots, and the environment.
These applications also raise exciting challenges. Machine learning pipelines must operate on data that is sparse or highly variable across subjects and environments. Sensors and actuators should adapt to complex motions yet remain compact and operable by novice users. Multimodal frameworks yield novel insights but increase deployment and algorithmic complexity. Applying wearables to robots enhances their skills but raises questions about transferring learned networks or devices between agents.
This workshop will explore advances, challenges, and future directions within such areas. Technical areas include machine learning pipelines, soft or customizable sensor and actuator fabrication, multimodal fusion, and modeling. Modalities include biosignals, tactile sensing, and motion sensing. Application areas include healthcare, movement assistance or augmentation, robotic manipulation, human-robot interaction, and AR/VR. Topics can also include data capture or deployment considerations and the roles of academia and industry for development and dissemination.
The workshop welcomes 2-page extended abstracts, presented as lightning talks and interactive posters or demonstrations. The workshop will also feature keynote presentations and panel discussions. We look forward to seeing you in Japan!
Topics of Discussion
Sensor and actuator design or fabrication, including soft or customizable devices for humans or robots
Machine learning for wearable sensor data, including sparse datasets and cross-subject generalizability
Multimodal sensing and cross-modal insight generation
Sensing modalities including biosignals, tactile sensing, or motion sensing
Applications and experimental considerations including healthcare, rehabilitation, movement assistance or augmentation, robotic manipulation or contact awareness, human-machine interaction, and AR/VR
Data capture for human behavior or robot perception, including unobtrusive sensing or robot skins