Current Work


Multimodal Execution Monitoring


 
Assistive robots have the potential to serve as caregivers, assisting with activities of daily living (ADLs) and instrumental activities of daily living (IADLs). Detecting when something has gone wrong could help assistive robots operate more safely and effectively around people. However, the complexity of interacting with people and objects in human environments can make errors difficult to detect. I introduce a multimodal execution monitoring system to detect and classify anomalous executions when robots operate near humans. The system’s anomaly detector models multimodal sensory signals with a hidden Markov model (HMM) and uses a likelihood threshold that varies with the progress of task execution. The system classifies the type and cause of common anomalies using an artificial neural network. I evaluate my system with haptic, visual, auditory, and kinematic sensing during household tasks and human-robot interactive tasks (feeding assistance) performed by a PR2 robot with able-bodied participants and people with disabilities. In this evaluation, my methods outperformed other methods from the literature, yielding a higher area under the curve (AUC) and shorter detection delays. Multimodality also improved monitoring performance by enabling detection of a broader range of anomalies.
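
To illustrate the execution-progress-varying threshold, below is a minimal Python sketch. It assumes the hmmlearn library, and the per-progress-bin threshold (mean minus a multiple of the standard deviation of training log-likelihoods) is a simplification for illustration, not the exact formulation from my papers; all names and settings here are hypothetical.

    import numpy as np
    from hmmlearn import hmm  # assumed dependency for the HMM

    N_STATES = 10   # hypothetical number of hidden states
    N_BINS = 20     # progress bins for the varying threshold
    C = 2.0         # hypothetical threshold tightness

    def progress_bin(t, T):
        # Map time step t (0-indexed) of a T-step execution to a progress bin.
        return min(int(N_BINS * t / T), N_BINS - 1)

    def fit_monitor(train_seqs):
        # train_seqs: list of (T_i, D) arrays of multimodal features
        # (e.g., haptic, visual, auditory, kinematic) from non-anomalous
        # executions; assumes each execution has at least N_BINS steps.
        X = np.vstack(train_seqs)
        lengths = [len(s) for s in train_seqs]
        model = hmm.GaussianHMM(n_components=N_STATES).fit(X, lengths)

        # Collect cumulative log-likelihoods per progress bin.
        bins = [[] for _ in range(N_BINS)]
        for seq in train_seqs:
            T = len(seq)
            for t in range(1, T + 1):
                bins[progress_bin(t - 1, T)].append(model.score(seq[:t]))

        # One threshold per progress bin: mean minus C standard deviations
        # of the log-likelihoods observed at that stage of execution.
        thresholds = np.array([np.mean(b) - C * np.std(b) for b in bins])
        return model, thresholds

    def is_anomalous(model, thresholds, partial_seq, T_expected):
        # Flag an execution when its log-likelihood falls below the
        # threshold associated with the current stage of task progress.
        t = len(partial_seq)
        return model.score(partial_seq) < thresholds[progress_bin(t - 1, T_expected)]

Because the threshold is looser where normal executions vary widely and tighter where they are consistent, this kind of monitor can stay sensitive throughout a task without triggering false alarms at its noisier stages.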

[Publications]

[1] Daehyung Park, Yuuna Hoshi, and Charles C. Kemp. “Multimodal Anomaly Detection for Robot-Assisted Feeding Using an LSTM-based Variational Autoencoder”, IEEE Robotics and Automation Letters (RA-L), 2018. [submitted]

[2] Daehyung Park, Hokeun Kim, and Charles C. Kemp. “Multimodal Anomaly Detection for Assistive Robots”, Autonomous Robots, 2016. [submitted]

[3] Daehyung Park, Hokeun Kim, Yuuna Hoshi, Zackory Erickson, Ariel Kapusta, and Charles C. Kemp. “A Multimodal Execution Monitor with Anomaly Classification for Robot-Assisted Feeding”, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2017. [PDF] [Video]

[4] Daehyung Park, Zackory Erickson, Tapomayukh Bhattacharjee, and Charles C. Kemp. “Multimodal Execution Monitoring for Anomaly Detection During Robot Manipulation”, IEEE International Conference on Robotics and Automation (ICRA), 2016. [PDF] [Video]

[Resources]
Anomaly Detection resources

 

Assistive Manipulation




 


General-purpose mobile manipulators have the potential to serve as a versatile form of assistive technology. However, their complexity creates challenges, including the risk that they will be too difficult for people to use. We present a proof-of-concept robotic system for assistive feeding that consists of a Willow Garage PR2, a high-level web-based interface, and specialized autonomous behaviors for scooping and feeding yogurt. As a step towards use by people with disabilities, we evaluated our system with 5 able-bodied participants. All 5 successfully ate yogurt using the system and reported high rates of success for the system's autonomous behaviors. In addition, Henry Evans, a person with severe quadriplegia, operated the system remotely to feed an able-bodied person. In general, people who operated the system, including Henry, reported that it was easy to use. The feeding system also incorporates corrective actions designed to be triggered either autonomously or by the user. In an offline evaluation using data collected with the feeding system, a new version of our multimodal anomaly detection system outperformed prior versions.
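
To make the dual trigger paths for corrective actions concrete, here is a hedged Python sketch. The class, function, and robot-interface names are illustrative assumptions, not the system's actual API.

    from enum import Enum, auto

    class Trigger(Enum):
        USER = auto()        # requested through the web-based interface
        AUTONOMOUS = auto()  # raised by the multimodal anomaly monitor

    def maybe_correct(user_request, anomaly_detected, robot):
        # Run a corrective action when either the user asks for one or the
        # anomaly monitor flags the current execution; user requests take
        # precedence here. `robot` is a hypothetical interface object.
        if user_request:
            trigger = Trigger.USER
        elif anomaly_detected:
            trigger = Trigger.AUTONOMOUS
        else:
            return None
        robot.stop_current_behavior()   # halt the scooping/feeding behavior
        robot.run_corrective_action()   # e.g., retract the spoon and retry
        return trigger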

[Publications]

[1] Ariel Kapusta, Philip Grice, Henry Clever, Yash Chitalia, Daehyung Park, and Charles C. Kemp. “An Assistive Robotic System with a Robotic Bed and a Mobile Manipulator”. [in preparation]

[2] Daehyung Park and Charles C. Kemp. “Multimodal Execution Monitoring for Robot-Assisted Feeding”, TechSAge State of the Science Conference, 2017. [PDF]

[3] Ariel Kapusta, Yash Chitalia, Daehyung Park, and Charles C. Kemp. “Collaboration Between a Robotic Bed and a Mobile Manipulator May Improve Physical Assistance for People with Disabilities”, IEEE RO-MAN Workshop on Behavior, Adaptation and Learning for Assistive Robotics (BAILAR), 2016. [PDF]

[4] Daehyung Park, Youkeun Kim, Zackory Erickson, and Charles C. Kemp. “Towards Assistive Feeding with a General-Purpose Mobile Manipulator”, ICRA 2016 Workshop on Human-Robot Interfaces for Enhanced Physical Interactions, 2016. [PDF]



Haptic Manipulation



 

Combining Tactile Sensing and Vision to Infer Dense Haptic Labels

Haptically-Guided Interleaving of Planning and Control

Learning Initial Conditions to Reach into the Unknown



[Publications]
[1] T. Bhattacharjee, A. A. Shenoi, D. Park, J. Rehg, and C. Kemp. “Combining Tactile Sensing and Vision for Rapid Haptic Mapping”, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2015. [PDF]

[2] D. Park, A. Kapusta, J. Hawke, and C. Kemp. “Interleaving Planning and Control for Efficient Haptically-guided Reaching in Unknown Environments”, IEEE-RAS International Conference on Humanoid Robots (Humanoids), 2014. [PDF]

[3] D. Park, A. Kapusta, Y. Kim, J. Rehg, and C. Kemp. “Learning to Reach into the Unknown: Selecting Initial Conditions When Reaching in Clutter”, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2014. [PDF]

[Software]
Tactile sensing plugin for Gazebo: https://github.com/gt-ros-pkg/gt-meka-sim