Learning to Fold Real Garments with One Arm:

A Case Study in Cloud-Based Robotics Research


Ryan Hoque*, Kaushik Shivakumar*, Shrey Aeron, Gabriel Deza, Aditya Ganapathi, Adrian Wong, Johnny Lee, Andy Zeng, Vincent Vanhoucke, Ken Goldberg

Supplemental Material

Paper: [Link]

Appendix: [Link]

All Datasets (8.4 GB): [Link]

All Models (418 MB): [Link]

All Experiment Logs (23.2 GB): [Link]

Google PyReach Code: [Link]

Project Fork Code: [Link]

Abstract

Autonomous fabric manipulation is a longstanding challenge in robotics, but evaluating progress is difficult due to the cost and diversity of robot hardware. Using Reach, a new cloud robotics platform that enables low-latency remote execution of control policies on physical robots, we present the first systematic benchmarking of fabric manipulation algorithms on physical hardware. We develop 4 novel learning-based algorithms that model expert actions, keypoints, reward functions, and dynamic motions, and we compare these against 4 learning-free and inverse-dynamics algorithms on the task of folding a crumpled T-shirt with a single robot arm. The entire lifecycle of data collection, model training, and policy evaluation is performed remotely, without physical access to the robot workcell. Results suggest that a new algorithm combining imitation learning with analytic methods achieves 84% of human-level performance on the folding task.
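To make the remote policy-evaluation workflow described above concrete, the sketch below shows a generic observe-predict-act loop of the kind a learned folding policy would run against a cloud workcell. This is a minimal, hypothetical illustration: the names here (StubWorkcellClient, predict_action, evaluate_remotely) are placeholders and are not the PyReach API or the paper's implementation.

```python
# Hypothetical sketch of a remote observe-predict-act loop (NOT the PyReach API).
import numpy as np


class StubWorkcellClient:
    """Stand-in for a low-latency cloud robot client (illustrative only)."""

    def get_image(self) -> np.ndarray:
        # A real client would stream the overhead camera; here we return a blank frame.
        return np.zeros((480, 640, 3), dtype=np.uint8)

    def pick_and_place(self, pick_px: np.ndarray, place_px: np.ndarray) -> None:
        # A real client would command the arm; here we just log the requested action.
        print(f"pick {pick_px.tolist()} -> place {place_px.tolist()}")

    def task_done(self) -> bool:
        # A real client would check a task-completion signal (e.g., a fold detector).
        return False


def predict_action(rgb: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Placeholder for a learned policy (e.g., keypoint or action prediction)."""
    h, w, _ = rgb.shape
    pick = np.array([w // 2, h // 2])
    return pick, pick + np.array([60, 0])


def evaluate_remotely(client, max_steps: int = 10) -> None:
    """Observe the workspace, predict a fold action, and execute it remotely."""
    for _ in range(max_steps):
        rgb = client.get_image()
        pick_px, place_px = predict_action(rgb)
        client.pick_and_place(pick_px, place_px)
        if client.task_done():
            break


if __name__ == "__main__":
    evaluate_remotely(StubWorkcellClient())
```

In practice, predict_action would be replaced by one of the trained models released above, and the client would connect to a physical workcell rather than a stub.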

1-Minute Supplemental Video

1min-reach-video.mp4