Learning to Singulate Layers of Cloth using Tactile Feedback

Sashank Tirumala*, Thomas Weng*, Daniel Seita*, Oliver Kroemer, Zeynep Temel, David Held

IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), October 2022. Kyoto, Japan.

This paper won the Best Paper Award at the ROMADO-SI workshop.

[arXiv] [kNN data and training code] [robot system code]

Abstract

Robotic manipulation of cloth has applications ranging from fabrics manufacturing to handling blankets and laundry. Cloth manipulation is challenging for robots largely due to cloth's high degrees of freedom, complex dynamics, and severe self-occlusions when in folded or crumpled configurations. Prior work on robotic manipulation of cloth relies primarily on vision sensors alone, which may pose challenges for fine-grained manipulation tasks such as grasping a desired number of cloth layers from a stack of cloth. In this paper, we propose to use tactile sensing for cloth manipulation; we attach a tactile sensor (ReSkin) to one of the two fingertips of a Franka robot and train a classifier to determine whether the robot is grasping a specific number of cloth layers. During test-time experiments, the robot uses this classifier as part of its policy to grasp one or two cloth layers, using tactile feedback to determine suitable grasping points. Experimental results over 180 physical trials suggest that the proposed method outperforms baselines that do not use tactile feedback and generalizes better to unseen fabrics than methods that use image classifiers.
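The layer classifier described above can be illustrated with a small k-nearest-neighbors sketch, in the spirit of the linked kNN training code. The feature dimensionality, label set, and toy data below are illustrative assumptions, not the authors' actual pipeline:

```python
# Hypothetical sketch: classify the number of grasped cloth layers (0, 1, or 2)
# from a tactile feature vector using k-nearest neighbors. Dimensions and data
# are invented for illustration, not taken from the paper's dataset.
import numpy as np

def knn_predict(train_X, train_y, query, k=3):
    """Return the majority label among the k training samples closest to query."""
    dists = np.linalg.norm(train_X - query, axis=1)  # Euclidean distances
    nearest = np.argsort(dists)[:k]                  # indices of the k closest
    votes = train_y[nearest]
    labels, counts = np.unique(votes, return_counts=True)
    return labels[np.argmax(counts)]                 # majority vote

# Toy data: 15-D readings (e.g., 5 magnetometers x 3 axes), one cluster per layer count
rng = np.random.default_rng(0)
train_X = rng.normal(size=(60, 15)) + np.repeat([0.0, 2.0, 4.0], 20)[:, None]
train_y = np.repeat([0, 1, 2], 20)

query = np.full(15, 2.1)                 # reading near the "1 layer" cluster
print(knn_predict(train_X, train_y, query, k=5))  # -> 1
```

A non-parametric classifier like kNN is a natural fit when labeled tactile data is modest in size, since it requires no training beyond storing the dataset.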

Selected Press Coverage

Video Submission

IROS22_0623_VI_i_submitted_video_10MB.mp4

Videos of Trials

These are representative videos of trials reported in the paper, all for our method, referred to as Feedback-Tactile in the paper. These videos are not cut or sped up.

In these videos, the objective is to grasp the top 1 layer of the cloth stack. The robot successfully achieves this in these two examples.

DSC_0143.MOV
DSC_0131_grasp_1_layer_success.MOV

In these videos, the objective is to grasp the top 2 layers of the cloth stack. The robot successfully achieves this in these two examples.

DSC_0238.MOV
DSC_0244_grasp_2_layers_success.MOV

Here are two representative failure cases. In these videos, the objective is to grasp the top 2 layers of the cloth stack. In the left video below, the classifier incorrectly predicts that the robot has grasped 2 layers (when it has actually grasped just 1) and thus improperly terminates the trial early. In the right video below, the robot grasps two layers after several attempts, but the second layer slips out of the gripper when the robot lifts to mark the end of the trial.

DSC_0242_grasp_2_layers_failure_thinks_1.MOV
DSC_0241_grasp_2_layers_failure_slippage.MOV
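The trial structure seen in these videos, where the robot adjusts its grasp until the tactile classifier reports the target layer count and then lifts, can be sketched as a simple feedback loop. All names below (classify_layers, adjust_grasp, lift) are hypothetical placeholders, not the authors' actual API:

```python
# Hypothetical sketch of the tactile-feedback grasping loop: re-grasp until the
# layer classifier reports the target count (or a retry budget is exhausted),
# then lift to end the trial. Function names are illustrative assumptions.
def grasp_layers(target_layers, classify_layers, adjust_grasp, lift, max_attempts=10):
    for _ in range(max_attempts):
        predicted = classify_layers()            # layer count from tactile readings
        if predicted == target_layers:
            lift()                               # terminate the trial
            return True                          # classifier says grasp succeeded
        adjust_grasp(predicted, target_layers)   # re-grasp deeper or shallower
    lift()
    return False                                 # retry budget exhausted
```

Note that the loop succeeds or fails according to the classifier's prediction; as the left failure video shows, a misclassification can end a trial early even when the physical grasp is wrong.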

Acknowledgments

We thank LG Electronics and the NSF CAREER grant IIS-2046491 for funding. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the sponsors. We thank our colleagues for helpful hardware support, in particular Sarvesh Patil, Raunaq Bhirangi, Tess Hellebrekers, and Pragna Mannam. We thank our colleagues for helpful paper writing feedback, in particular Zixuan Huang, Chuer Pan, and Sarthak Shetty.

BibTeX

@inproceedings{tirumala2022,
    title = {{Learning to Singulate Layers of Cloth using Tactile Feedback}},
    author = {Sashank Tirumala and Thomas Weng and Daniel Seita and Oliver Kroemer and Zeynep Temel and David Held},
    booktitle = {IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
    year = {2022},
}