Learning to Grasp Clothing Structural Regions for Garment Manipulation Tasks

Wei Chen, Dongmyoung Lee, Digby Chappell and Nicolas Rojas

 REDS Lab, Dyson School of Design Engineering, Imperial College London

Accepted to 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2023)

[ArXiv]




Abstract

When performing cloth-related tasks, such as garment hanging, it is often important to identify and grasp certain structural regions---a shirt's collar as opposed to its sleeve, for instance. However, due to cloth deformability, these manipulation activities, which are essential in domestic, health care, and industrial contexts, remain challenging for robots. In this paper, we focus on how to segment and grasp structural regions of clothes to enable manipulation tasks, using hanging tasks as a case study. To this end, a neural network-based perception system is proposed to segment a shirt's collar from areas that represent the rest of the scene in a depth image. Trained on a 10-minute video of a human manipulating shirts, our perception system is capable of generalizing to other shirts regardless of texture, as well as to other types of collared garments. A novel grasping strategy is then proposed that determines the grasping pose based on this segmentation. Experiments demonstrate that our proposed grasping strategy achieves 92%, 80%, and 50% grasping success rates with one folded garment, one crumpled garment, and three crumpled garments, respectively. Our grasping strategy performs considerably better than tested baselines that do not take into account the structural nature of the garments. With the proposed region segmentation and grasping strategy, challenging garment hanging tasks are successfully implemented using an open-loop control policy.

The Garment Hanging Task

In this project, we investigate one of the daily tasks that most people perform: hanging garments. While hanging is easy for a human, it is rather difficult for a robot, which has to perform several steps: find the collar of the crumpled garment, grasp it, and transport it to the hook.
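At a high level, these steps can be summarized in the short sketch below. The names used here (segment_collar, estimate_grasp_pose, the robot interface) are hypothetical placeholders for the components described in the following sections, not the project's actual code.

```python
# Hypothetical high-level hanging loop; all function names are illustrative
# placeholders for the perception, grasping, and hanging components described
# on this page, not the project's actual API.

def hang_garment(robot, camera, hook_pose):
    depth = camera.capture_depth()            # view from the fixed RGB-D camera
    collar_mask = segment_collar(depth)       # neural collar segmentation
    if collar_mask.sum() == 0:
        return False                          # collar not visible yet
    grasp_pose = estimate_grasp_pose(depth, collar_mask)
    robot.grasp(grasp_pose)                   # approach and close the gripper
    robot.transport(hook_pose)                # open-loop motion to the hook
    robot.release()
    return True
```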

Challenges of Garment Hanging

As a deformable object, a garment can deform in many ways, leading to self-occlusions. This is particularly challenging for robot perception and manipulation.

Since garments are highly deformable, it is difficult to grasp them reliably without robust grasp pose estimation.

Perception of the garment is key to the subsequent manipulation task: a failed perception step will easily cause the whole garment manipulation system to fail.

Framework

 

Perception System

Data Acquisition and Model Training
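As a rough, hedged illustration of this stage: the perception model is trained to segment the collar from the rest of the scene in depth images, using a 10-minute video of a human manipulating shirts. A minimal binary-segmentation training loop in that spirit could look like the sketch below; the tiny stand-in network and the synthetic tensors are assumptions for illustration only and are not the architecture or data used in the paper.

```python
import torch
import torch.nn as nn

# Illustrative stand-in network: any model producing a 1-channel logit map
# over the depth image could be slotted in here; this is NOT the paper's
# perception architecture.
model = nn.Sequential(
    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 1, 1),                       # per-pixel collar logit
)

# Synthetic stand-in data to keep the sketch runnable; in the project the
# depth images and collar masks come from the human-manipulation video.
depth = torch.rand(8, 1, 120, 160)
mask = (torch.rand(8, 1, 120, 160) > 0.9).float()

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.BCEWithLogitsLoss()             # collar vs. rest of the scene

for epoch in range(20):
    optimizer.zero_grad()
    loss = criterion(model(depth), mask)
    loss.backward()
    optimizer.step()
```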

Pose Estimation for Grasping and Manipulation

A novel grasping strategy is then proposed that outputs a suitable grasp pose based on the extracted skeleton and the local geometric structure of the segmented clothing region.

Skeleton-based Center Extraction
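One simple way to realize this step, assuming the perception system outputs a binary collar mask, is sketched below: skeletonize the mask and take the skeleton pixel closest to the skeleton's centroid as the region center. The exact procedure in the paper may differ; this is illustrative only.

```python
import numpy as np
from skimage.morphology import skeletonize

def skeleton_center(collar_mask: np.ndarray) -> tuple[int, int]:
    """Return a (row, col) center of a binary collar mask via its skeleton.

    Illustrative approximation: the center is the skeleton pixel nearest to
    the skeleton's centroid, which keeps the point on the segmented region
    even for curved, non-convex collar shapes.
    """
    skeleton = skeletonize(collar_mask.astype(bool))
    rows, cols = np.nonzero(skeleton)
    if rows.size == 0:
        raise ValueError("empty mask: nothing to skeletonize")
    centroid = np.array([rows.mean(), cols.mean()])
    pts = np.stack([rows, cols], axis=1)
    idx = np.argmin(np.linalg.norm(pts - centroid, axis=1))
    return int(rows[idx]), int(cols[idx])
```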

Grasping Pose Estimation
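Continuing the sketch above, a grasp orientation can be derived from the local geometric structure by running PCA on the mask pixels around the extracted center and closing the gripper across the local collar direction while approaching from above. Again, this is an assumed illustration rather than the paper's exact formulation.

```python
import numpy as np

def grasp_orientation(collar_mask: np.ndarray, center: tuple[int, int],
                      radius: int = 15) -> float:
    """Return a gripper yaw angle (radians, image frame) at `center`.

    Illustrative PCA-based estimate: fit the principal axis of the local
    patch of mask pixels and close the gripper across it (minor axis).
    """
    rows, cols = np.nonzero(collar_mask)
    pts = np.stack([rows, cols], axis=1).astype(float)
    # keep only mask pixels within `radius` of the skeleton center
    local = pts[np.linalg.norm(pts - np.array(center), axis=1) < radius]
    if local.shape[0] < 3:
        return 0.0                           # too few points; default yaw
    local -= local.mean(axis=0)
    # eigenvector with the largest eigenvalue = local collar direction
    _, eigvecs = np.linalg.eigh(np.cov(local.T))
    major = eigvecs[:, -1]
    # close the gripper perpendicular to the collar direction
    return float(np.arctan2(major[0], major[1]) + np.pi / 2)
```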

Action Primitives for Grasping and Garment Hanging
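To show how such primitives chain into the open-loop hanging policy mentioned in the abstract, the sketch below strings together approach, grasp, lift, transport, and release; the robot interface (move_to, close_gripper, open_gripper) is a hypothetical abstraction over the UR5 controller, not the actual control code.

```python
# Hypothetical action-primitive sequence for open-loop garment hanging.
# `robot` is an assumed abstraction over the UR5 + parallel gripper
# (move_to takes an [x, y, z, rx, ry, rz] tool pose as a list).

def grasp_and_hang(robot, grasp_pose, hook_pose, lift_height=0.25):
    pre_grasp = list(grasp_pose)
    pre_grasp[2] += 0.10                  # hover above the collar
    robot.open_gripper()
    robot.move_to(pre_grasp)              # approach from above
    robot.move_to(grasp_pose)             # descend to the grasp point
    robot.close_gripper()                 # pinch the collar
    lifted = list(grasp_pose)
    lifted[2] += lift_height
    robot.move_to(lifted)                 # lift the garment clear of the table
    robot.move_to(hook_pose)              # open-loop transport to the hook
    robot.open_gripper()                  # release the collar onto the hook
```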

Experiment Setup

  • We use different types of collared garments to test the robustness of our system, including:
        • Template shirts (TS, used for perception training, column 1)
        • Coats (C1, C2, column 2)
        • Polo shirts (P1, P2, column 3)
        • Denim jackets (D1, D2, column 4)
        • Short-sleeve shirts (S1, S2, column 5)
        • Long-sleeve shirts (L1, L2, column 6)

  • The real-world experiments are implemented on a UR5 robot with a customized parallel gripper.
  • All gripper parts are 3D-printed, with a soft silicone pad attached to each finger.
  • An Intel RealSense D435i RGB-D camera remains fixed in place during both data collection and the experiments (a minimal depth-capture sketch follows this list).
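For reference, grabbing a depth frame from a fixed D435i with the pyrealsense2 SDK can look like the sketch below; the resolution and frame rate are assumptions, and the actual data-collection code may differ.

```python
import numpy as np
import pyrealsense2 as rs

# Minimal depth capture from a fixed RealSense D435i (assumed 640x480 @ 30 fps).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
pipeline.start(config)
try:
    frames = pipeline.wait_for_frames()
    depth_frame = frames.get_depth_frame()
    depth = np.asanyarray(depth_frame.get_data())   # raw uint16 depth values
finally:
    pipeline.stop()
```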

Detailed Presentation Video and Experiment Trials

If you have any questions, please feel free to contact me via W.CHEN21@IMPERIAL.AC.UK