Videos

Below are some videos from several of my works that I consider interesting.

Personalized Collaborative Plans for Robot-Assisted Dressing via Optimization and Simulation

Accompanying video for the paper:

A. Kapusta, Z. Erickson, H. M. Clever, W. Yu, C. K. Liu, G. Turk, and C. C. Kemp, “Personalized Collaborative Plans for Robot-Assisted Dressing via Optimization and Simulation,” Autonomous Robots (AURO), 2019.


A system for bedside assistance that integrates a robotic bed and a mobile manipulator

Accompanying video for the paper:

Kapusta AS, Grice PM, Clever HM, Chitalia Y, Park D, et al. (2019) A system for bedside assistance that integrates a robotic bed and a mobile manipulator. PLOS ONE 14(10): e0221854. https://doi.org/10.1371/journal.pone.0221854

The video shows the evaluations of our system for bedside assistance, which integrates a robotic bed and a mobile manipulator. The individuals featured in the video have provided written informed consent (as outlined in the PLOS consent form) to publish their images alongside the manuscript.


Person Tracking and Gesture Recognition in Challenging Visibility Conditions Using 3D Thermal Sensing

Accompanying video for the paper:

A. Kapusta and P. Beeson, "Person tracking and gesture recognition in challenging visibility conditions using 3D thermal sensing," 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, 2016, pp. 1112-1119, doi: 10.1109/ROMAN.2016.7745247.


Learning to reach into the unknown: Selecting initial conditions when reaching in clutter

Accompanying video for the paper:

D. Park, A. Kapusta, Y. K. Kim, J. M. Rehg and C. C. Kemp, "Learning to reach into the unknown: Selecting initial conditions when reaching in clutter," 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, 2014, pp. 630-637, doi: 10.1109/IROS.2014.6942625.

This video shows a reaching-in-clutter experiment with a PR2 robot in foliage clutter containing apertures. The video contains two reaching trials in which Learning Initial Conditions (LIC) selects a passable aperture and a good initial configuration from which to reach the goal.


Interleaving planning and control for efficient haptically-guided reaching in unknown environments

Accompanying video for the paper:

D. Park, A. Kapusta, J. Hawke and C. C. Kemp, "Interleaving planning and control for efficient haptically-guided reaching in unknown environments," 2014 IEEE-RAS International Conference on Humanoid Robots, Madrid, 2014, pp. 809-816, doi: 10.1109/HUMANOIDS.2014.7041456.

This video shows a haptically-guided interleaving planning and control (HIPC) method with a haptic mapping framework, using a simulated DARCI robot in the Gazebo simulator. The video contains five demonstrations: haptic sensing, haptic mapping, task-space planning with model predictive control, joint-space planning with model predictive control, and a full reaching demonstration.


Video showcasing work from:

Killpack, M.D., Kapusta, A. & Kemp, C.C. Model predictive control for fast reaching in clutter. Auton Robot 40, 537–560 (2016). https://doi.org/10.1007/s10514-015-9492-6

This video shows the robot Darci (with a 7-degree-of-freedom arm and series elastic actuators) reaching in artificial foliage using model predictive control with a tactile-sensing sleeve. We use a full dynamic model of the robot arm in contact with the world and include a collision constraint to limit impact forces. The video is in real time.
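For readers unfamiliar with this style of controller, a rough sketch of one control step is below. This is a simplified formulation written for illustration, not the exact optimization from the paper (which uses the full arm dynamics over a short horizon):

$$
\begin{aligned}
\min_{\Delta\theta}\quad & \lVert x_{\mathrm{goal}} - (x_t + J\,\Delta\theta)\rVert^2 \\
\text{subject to}\quad & f_i + \frac{\partial f_i}{\partial \theta}\,\Delta\theta \le f_{\mathrm{max}} \quad \text{for each predicted contact } i, \\
& \lvert\Delta\theta\rvert \le \Delta\theta_{\mathrm{max}},
\end{aligned}
$$

where $J$ is the end-effector Jacobian, $x_t$ is the current end-effector position, and the linearized contact-force constraint is what keeps impact forces low; the joint-angle change $\Delta\theta$ is recomputed at every time step as new tactile measurements arrive.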


Video showcasing work from:

Killpack, M.D., Kapusta, A. & Kemp, C.C. Model predictive control for fast reaching in clutter. Auton Robot 40, 537–560 (2016). https://doi.org/10.1007/s10514-015-9492-6

This video shows teleoperation of the robot Darci (7-degree-of-freedom arm with series elastic actuators) reaching into an unmodeled canvas bag. We use model predictive control with whole-arm tactile sensing, together with a forward model of the dynamics and an explicit collision constraint to help control contact forces. The video is in real time.


Rapid categorization of object properties from incidental contact with a tactile sensing robot arm

Accompanying video for the paper:

T. Bhattacharjee, A. Kapusta, J. M. Rehg and C. C. Kemp, "Rapid categorization of object properties from incidental contact with a tactile sensing robot arm," 2013 13th IEEE-RAS International Conference on Humanoid Robots (Humanoids), Atlanta, GA, 2013, pp. 219-226, doi: 10.1109/HUMANOIDS.2013.7029979.

This video shows the rapid categorization performance using HMMs while the robot reaches into clutter made of trunks and leaves. The robot uses data from the forearm tactile skin for online categorization. A taxel (tactile-pixel) is marked as a green dot if it is categorized as a leaf and as a brown dot if it is categorized as a trunk.
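As a rough sketch of this style of per-taxel categorization, the snippet below trains one Gaussian HMM per class on force-versus-time sequences and labels a new contact by the class whose HMM gives the highest log-likelihood. The class names, synthetic data, and use of hmmlearn are illustrative assumptions, not the paper's dataset or implementation:

```python
import numpy as np
from hmmlearn import hmm  # assumed dependency for this sketch


def train_class_hmm(sequences, n_states=3):
    """Fit one Gaussian HMM to a list of 1-D force-vs-time sequences for a class."""
    X = np.concatenate([s.reshape(-1, 1) for s in sequences])
    lengths = [len(s) for s in sequences]
    model = hmm.GaussianHMM(n_components=n_states, n_iter=50)
    model.fit(X, lengths)
    return model


def categorize_taxel(force_sequence, models):
    """Label a taxel's contact by the class HMM with the highest log-likelihood."""
    seq = np.asarray(force_sequence).reshape(-1, 1)
    return max(models, key=lambda label: models[label].score(seq))


# Illustrative synthetic data: leaf contacts -> low, noisy forces; trunk contacts -> higher forces.
rng = np.random.default_rng(0)
leaf_train = [0.5 + 0.2 * rng.standard_normal(40) for _ in range(20)]
trunk_train = [3.0 + 0.5 * rng.standard_normal(40) for _ in range(20)]

models = {"leaf": train_class_hmm(leaf_train), "trunk": train_class_hmm(trunk_train)}
print(categorize_taxel(2.8 + 0.5 * rng.standard_normal(40), models))  # expected: "trunk"
```

In the video, this kind of per-taxel decision is what drives the green (leaf) versus brown (trunk) markers shown on the forearm skin.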