The Feeling of Success:

Does Touch Sensing Help Predict Grasp Outcomes?


A successful grasp requires careful balancing of contact forces. Deducing whether a particular grasp will be successful from indirect measurements, such as vision, is therefore quite challenging, and direct sensing of contacts through touch sensing provides an appealing avenue toward more successful and consistent robotic grasping. However, in order to fully evaluate the value of touch sensing for grasp outcome prediction, we must understand how touch sensing can influence outcome prediction accuracy when combined with other modalities. Doing so using conventional model-based techniques is exceptionally difficult. In this work, we investigate the question of whether touch sensing aids in predicting grasp outcomes within a multimodal sensing framework that combines vision and touch. To that end, we collected more than 9,000 grasping trials using a two-finger gripper equipped with GelSight high-resolution tactile sensors on each finger, and evaluated visuo-tactile deep neural network models to directly predict grasp outcomes from either modality individually, and from both modalities together. Our experimental results indicate that incorporating tactile readings substantially improves grasping performance.
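To make the multimodal setup concrete, here is a minimal sketch of late fusion for grasp outcome prediction: each modality is passed through its own encoder, the resulting features are concatenated, and a logistic output produces a success probability. The layer shapes, parameter names, and the use of single dense layers are illustrative assumptions, not the architecture from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def encoder(x, w):
    """Toy per-modality feature encoder: one dense layer with ReLU."""
    return np.maximum(x @ w, 0.0)

def predict_success(vision_feat, tactile_feat, params):
    """Late fusion: encode each modality, concatenate, logistic output."""
    h_v = encoder(vision_feat, params["w_vision"])
    h_t = encoder(tactile_feat, params["w_touch"])
    h = np.concatenate([h_v, h_t], axis=-1)
    logit = h @ params["w_out"]
    return 1.0 / (1.0 + np.exp(-logit))  # probability of a successful grasp

# Randomly initialized parameters, standing in for a trained network.
params = {
    "w_vision": rng.normal(size=(128, 32)),  # 128-dim visual features
    "w_touch": rng.normal(size=(64, 32)),    # 64-dim tactile features
    "w_out": rng.normal(size=(64,)),
}
p = predict_success(rng.normal(size=128), rng.normal(size=64), params)
```

Dropping either branch before the concatenation yields the corresponding single-modality baseline, which is how the per-modality comparison in the abstract can be framed.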



In the video, we show an example of grasping one of the unseen test objects. During each grasp, the deep neural network model uses the visuo-tactile inputs to decide whether or not to lift the object. The first two grasps are predicted as not sufficiently good, and the object is re-grasped. On the third attempt, the predicted success probability is sufficiently high and the object is successfully lifted.
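The re-grasping behavior in the video can be sketched as a simple loop: grasp, query the outcome predictor, and only commit to the lift once the predicted success probability clears a threshold. The function names, the threshold value, and the attempt cap are hypothetical, chosen only to illustrate the control flow.

```python
def grasp_until_confident(attempt_grasp, predict, lift,
                          threshold=0.9, max_attempts=5):
    """Re-grasp until the predicted success probability clears the threshold."""
    for attempt in range(1, max_attempts + 1):
        obs = attempt_grasp()           # close gripper, read camera + GelSight
        p = predict(obs)                # predicted probability of success
        if p >= threshold:
            return lift(obs), attempt   # commit to the lift
        # otherwise release and try a new grasp pose
    return False, max_attempts

# Stub run mimicking the video: two rejected grasps, then a confident lift.
probs = iter([0.4, 0.6, 0.95])
lifted, n_attempts = grasp_until_confident(
    attempt_grasp=lambda: None,
    predict=lambda obs: next(probs),
    lift=lambda obs: True,
)
```

With the stubbed probabilities above, the loop rejects the first two grasps and lifts on the third, matching the sequence shown in the video.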


The full dataset collected and used to train the models is available here, together with a demo Python notebook:

EDIT 22/12/2017: we updated the dataset from pickle to HDF5 format, and we also included a demo file to explore the data.
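Reading the HDF5 files follows the usual `h5py` pattern: open the file, list its keys, and slice the datasets you need. The snippet below writes a tiny stand-in file so it runs anywhere; the key names (`gelsight_before`, `is_gripping`) and array shapes are placeholders, not the actual layout of the released dataset, so check the included demo file for the real keys.

```python
import h5py
import numpy as np

# Create a tiny stand-in file with a plausible layout (keys are assumptions).
with h5py.File("demo_grasps.h5", "w") as f:
    f.create_dataset("gelsight_before",
                     data=np.zeros((2, 64, 64, 3), dtype=np.uint8))
    f.create_dataset("is_gripping", data=np.array([1, 0], dtype=np.int8))

# Open for reading, inspect the keys, and slice a dataset into memory.
with h5py.File("demo_grasps.h5", "r") as f:
    keys = list(f.keys())
    labels = f["is_gripping"][:]   # slicing loads the data as a NumPy array
```

Datasets are only read when sliced, so large tactile image arrays can be indexed lazily without loading the whole file.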


  • The Feeling of Success: Does Touch Sensing Help Predict Grasp Outcomes? Roberto Calandra, Andrew Owens, Manu Upadhyaya, Wenzhen Yuan, Justin Lin, Edward H. Adelson, Sergey Levine. PMLR 78:314-323 [PDF]

Contact Us

For any questions, you can contact us at