Soft Robotics

This week we will explore robot grasping. You will learn about some of the challenges of picking up objects of different sizes and shapes when a robot has not been specifically pre-programmed for this type of behavior.

Papers to Read

There will be a discussion of these papers on Saturday, August 12th at 8:00 PM PDT. Please join the discussion in the #robot_grasping channel.

Robotic Grasping and Contact: A Review

Antonio Bicchi et al.

Abstract - In this paper, we survey the work in robotic-grasping-related areas that has been done over the last two decades, with a bias toward the development of the theoretical framework and analytical results in this area. In addition, we assess the state of the art in this area and outline some of the important open problems.
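The review's theoretical core is contact and grasp analysis. As a concrete taste of that framework, here is a minimal numerical sketch (our illustration, not code from the paper) of a grasp map for a planar disk held by two antipodal point contacts with Coulomb friction. Full rank of the map is a necessary (though not sufficient) condition for force closure; the contact locations, normals, and friction coefficient below are assumptions made for the example.

```python
import numpy as np

# Grasp map for a planar object: each column is the object wrench
# (fx, fy, torque) produced by a unit force along one edge of a
# contact's linearized friction cone.

def contact_wrench_basis(p, n, mu):
    """Friction-cone edge forces at contact point p (inward normal n),
    mapped to wrenches about the object's center of mass."""
    t = np.array([-n[1], n[0]])          # tangent direction at contact
    cols = []
    for f in (n + mu * t, n - mu * t):   # the two cone edges
        tau = p[0] * f[1] - p[1] * f[0]  # 2D cross product -> torque
        cols.append([f[0], f[1], tau])
    return np.array(cols).T              # shape (3, 2)

mu = 0.5  # assumed friction coefficient
# Two antipodal contacts on a unit disk, normals pointing inward.
G = np.hstack([
    contact_wrench_basis(np.array([ 1.0, 0.0]), np.array([-1.0, 0.0]), mu),
    contact_wrench_basis(np.array([-1.0, 0.0]), np.array([ 1.0, 0.0]), mu),
])

# Necessary condition for force closure: the cone edges span the full
# 3D wrench space. (Sufficiency also requires the origin to lie strictly
# inside the convex hull of the columns.)
print("rank(G) =", np.linalg.matrix_rank(G))  # prints 3 for this grasp
```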



Universal Robotic Gripper Based on the Jamming of Granular Material

Eric Brown et al.

Abstract - Gripping and holding of objects are key tasks for robotic manipulators. The development of universal grippers able to pick up unfamiliar objects of widely varying shape and surface properties remains, however, challenging. Most current designs are based on the multifingered hand, but this approach introduces hardware and software complexities. These include large numbers of controllable joints, the need for force sensing if objects are to be handled securely without crushing them, and the computational overhead to decide how much stress each finger should apply and where. Here we demonstrate a completely different approach to a universal gripper. Individual fingers are replaced by a single mass of granular material that, when pressed onto a target object, flows around it and conforms to its shape. Upon application of a vacuum the granular material contracts and hardens quickly to pinch and hold the object without requiring sensory feedback. We find that volume changes of less than 0.5% suffice to grip objects reliably and hold them with forces exceeding many times their weight. We show that the operating principle is the ability of granular materials to transition between an unjammed, deformable state and a jammed state with solid-like rigidity. We delineate three separate mechanisms, friction, suction, and interlocking, that contribute to the gripping force. Using a simple model we relate each of them to the mechanical strength of the jammed state. This advance opens up new possibilities for the design of simple, yet highly adaptive systems that excel at fast gripping of complex objects.
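To make "forces exceeding many times their weight" tangible, here is a back-of-the-envelope calculation of just the friction contribution under a simple Coulomb model. All numbers (pressure differential, contact area, friction coefficient, object mass) are illustrative assumptions, not values from the paper, and the suction and interlocking mechanisms would add to this estimate.

```python
# Toy friction-only estimate of a jamming gripper's holding force:
# the vacuum pressure differential presses the jammed membrane onto
# the object, and Coulomb friction resists pull-out.

delta_p = 75e3        # vacuum pressure differential (Pa), ~0.75 atm (assumed)
contact_area = 8e-4   # membrane-object contact area (m^2) (assumed)
mu = 0.8              # rubber-on-object friction coefficient (assumed)

normal_force = delta_p * contact_area   # ~60 N pressing the membrane on
holding_force = mu * normal_force       # ~48 N from friction alone

object_mass = 0.5                       # kg (assumed)
weight = object_mass * 9.81
print(f"holding force ~{holding_force:.0f} N, "
      f"~{holding_force / weight:.0f}x the object's {weight:.1f} N weight")
```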

Dex-Net 2.0: Deep Learning to Plan Robust Grasps with Synthetic Point Clouds and Analytic Grasp Metrics

Jeffrey Mahler et al.

Abstract - To reduce data collection time for deep learning of robust robotic grasp plans, we explore training from a synthetic dataset of 6.7 million point clouds, grasps, and analytic grasp metrics generated from thousands of 3D models from Dex-Net 1.0 in randomized poses on a table. We use the resulting dataset, Dex-Net 2.0, to train a Grasp Quality Convolutional Neural Network (GQ-CNN) model that rapidly predicts the probability of success of grasps from depth images, where grasps are specified as the planar position, angle, and depth of a gripper relative to an RGB-D sensor. Experiments with over 1,000 trials on an ABB YuMi comparing grasp planning methods on singulated objects suggest that a GQ-CNN trained with only synthetic data from Dex-Net 2.0 can be used to plan grasps in 0.8s with a success rate of 93% on eight known objects with adversarial geometry and is 3× faster than registering point clouds to a precomputed dataset of objects and indexing grasps. The Dex-Net 2.0 grasp planner also has the highest success rate on a dataset of 10 novel rigid objects and achieves 99% precision (one false positive out of 69 grasps classified as robust) on a dataset of 40 novel household objects, some of which are articulated or deformable. Code, datasets, videos, and supplementary material are available at http://berkeleyautomation.github.io/dex-net.
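The planner's overall shape is simple to sketch: sample planar grasp candidates (position, angle, depth) from the depth image and keep the one the network scores highest. The runnable sketch below substitutes a dummy heuristic for the trained GQ-CNN and skips the rotation that aligns each crop with the grasp axis, so it shows the data flow only; it is not the actual gqcnn API.

```python
import numpy as np

rng = np.random.default_rng(0)

def dummy_gqcnn_score(crop, grasp_depth):
    """Stand-in for the trained GQ-CNN. Toy heuristic: prefer grasp
    depths slightly below the local surface (the real network learns
    its scoring from 6.7 million synthetic examples)."""
    surface = float(np.median(crop))
    return float(np.exp(-abs(grasp_depth - (surface + 0.01)) / 0.01))

def plan_grasp(depth_image, n_candidates=200, crop=32):
    """Sample planar grasps (x, y, angle, depth) and keep the best."""
    h, w = depth_image.shape
    best_grasp, best_q = None, -1.0
    for _ in range(n_candidates):
        x = int(rng.integers(crop, w - crop))
        y = int(rng.integers(crop, h - crop))
        angle = float(rng.uniform(0.0, np.pi))   # gripper axis angle
        window = depth_image[y - crop // 2:y + crop // 2,
                             x - crop // 2:x + crop // 2]
        depth = float(np.median(window)) + float(rng.uniform(0.0, 0.02))
        q = dummy_gqcnn_score(window, depth)
        if q > best_q:
            best_grasp, best_q = (x, y, angle, depth), q
    return best_grasp, best_q

# Fake depth image: a table 0.70 m from the camera with a 5 cm-tall box.
img = np.full((128, 128), 0.70)
img[40:80, 50:90] = 0.65
grasp, q = plan_grasp(img)
print("best grasp (x, y, angle, depth):", grasp, "score:", round(q, 3))
```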


Learning Hand-eye Coordination for Robotic Grasping with Deep Learning and Large-scale Data Collection

Sergey Levine et al.

Abstract - We describe a learning-based approach to hand-eye coordination for robotic grasping from monocular images. To learn hand-eye coordination for grasping, we trained a large convolutional neural network to predict the probability that task-space motion of the gripper will result in successful grasps, using only monocular camera images independent of camera calibration or the current robot pose. This requires the network to observe the spatial relationship between the gripper and objects in the scene, thus learning hand-eye coordination. We then use this network to servo the gripper in real time to achieve successful grasps. We describe two large-scale experiments that we conducted on two separate robotic platforms. In the first experiment, about 800,000 grasp attempts were collected over the course of two months, using between 6 and 14 robotic manipulators at any given time, with differences in camera placement and gripper wear and tear. In the second experiment, we used a different robotic platform and 8 robots to collect a dataset consisting of over 900,000 grasp attempts. The second robotic platform was used to test transfer between robots, and the degree to which data from a different set of robots can be used to aid learning. Our experimental results demonstrate that our approach achieves effective real-time control, can successfully grasp novel objects, and corrects mistakes by continuous servoing. Our transfer experiment also illustrates that data from different robots can be combined to learn more reliable and effective grasping.
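The servoing loop is worth seeing in miniature: at each control step, search for the task-space motion command that the success-prediction network rates highest, execute it, and repeat. The paper performs this search with the cross-entropy method (CEM); the sketch below keeps the CEM loop but replaces the network with a dummy scoring function over fake image features, so it illustrates the algorithm's shape rather than the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def dummy_success_net(image_features, motion):
    """Stand-in for the grasp-success CNN g(image, motion). Toy rule:
    score is highest when the motion matches a target offset that we
    pretend the image encodes."""
    return float(np.exp(-np.linalg.norm(motion - image_features) ** 2))

def choose_motion(image_features, iters=3, pop=64, elite=6):
    """Cross-entropy method over 3D task-space gripper motions."""
    mean, std = np.zeros(3), np.full(3, 0.05)           # meters
    for _ in range(iters):
        samples = rng.normal(mean, std, size=(pop, 3))  # candidate motions
        scores = np.array([dummy_success_net(image_features, v)
                           for v in samples])
        elites = samples[np.argsort(scores)[-elite:]]   # keep the best few
        mean, std = elites.mean(axis=0), elites.std(axis=0) + 1e-6
    return mean

fake_features = np.array([0.03, -0.02, -0.04])  # "move toward the object"
print("commanded motion (m):", np.round(choose_motion(fake_features), 3))
```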

A Slip Detection and Correction Strategy for Precision Robot Grasping

Daniela Rus et al.

Abstract - This paper presents a grasp force regulation strategy for precision grasps. The strategy makes no assumptions about object properties and surface characteristics and can be used with a wide range of grippers. It has two components: a slip signal detector that computes the magnitude of slip and a grasping force set point generator that acts on the detector’s output. The force set point generator is designed to ensure that slip is eliminated without using excessive force. This is particularly important in several situations like grasping fragile objects or in-hand manipulation of thin small objects. Several experiments were conducted to simulate various grasping scenarios with different objects. Results show that the strategy was very successful in dealing with uncertainty in object mass, surface characteristics, or rigidity. The strategy is also insensitive to robot motion.
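The strategy's core idea reduces to a small loop: while the slip detector reports nonzero slip, raise the grip-force set point; otherwise hold it steady, so fragile objects are never squeezed harder than necessary. Below is a minimal sketch of that idea; the gain, limits, and simulated slip signal are invented for illustration, and the paper's actual set-point generator is more sophisticated.

```python
def update_force_setpoint(f_set, slip_magnitude,
                          k_slip=5.0, f_min=0.5, f_max=20.0):
    """One control step: react to detected slip by raising the grip
    force, clamped to the gripper's force limits (all values assumed)."""
    if slip_magnitude > 0.0:
        f_set += k_slip * slip_magnitude
    return min(max(f_set, f_min), f_max)

# Simulated episode: the object starts to slip (heavier than expected),
# then stops slipping once the grip force has risen enough.
f = 1.0
for slip in [0.0, 0.4, 0.3, 0.1, 0.0, 0.0]:   # slip magnitude (mm/s)
    f = update_force_setpoint(f, slip)
    print(f"slip={slip:.1f} -> force set point {f:.1f} N")
```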

Videos to Watch

Joshua Lessing, Director of Research and Development

Josh joined Soft Robotics Inc. (SRI) after working as a postdoctoral fellow in the laboratory of Prof. Whitesides at Harvard University. As Senior Scientist at SRI, Josh is responsible for the design and fabrication of a fundamentally new class of chemically inspired robotic actuators. He holds an Sc.B. in Chemistry from Brown University and a Ph.D. in Physical Chemistry from the Massachusetts Institute of Technology. His unique perspective on robotics has allowed for the creation of soft, adaptive robotic actuators through a novel combination of materials and fabrication methods.


Projects to Work on

Project 1: Build your own gripper challenge

Introduction

We have already seen how difficult it can be for a robot to grasp unstructured objects, so now it's time to see if you can do better! For this task, you are challenged to construct a gripper capable of grasping a pen, a penny, and a raw egg. These three objects pose plenty of challenges, but with some "research" (i.e., YouTube) you may be able to find some grippers you can make very cheaply to get the job done. If you're successful, post a video of your gripper in action in the #robot_grasping channel. Be creative, but most of all have fun!

Project 2: Explore Dex-Net 2.0

Introduction

Now that you have read the paper on Dex-Net 2.0 and watched the lecture by Ken Goldberg, you may want to explore these concepts yourself. Luckily, the folks at UC Berkeley have open-sourced a large portion of their work for the general public. Below is a list of resources you may find useful if you would like to recreate (or experiment with) concepts in robot grasping. For more information on Dex-Net 2.0, please visit http://berkeleyautomation.github.io/dex-net.

Suggestions for Exploration

  • Try to find the optimal grasp location for five objects.
  • See if you can re-create the grasp robustness shown in Figure 9; a toy version of the robustness computation is sketched after this list.
  • Build a robotic arm, run Dex-Net 2.0 on it, and evaluate the performance with your robot. Are your results the same? Why or why not? Do you think having better materials (e.g., better servos, more feedback in the gripper, etc.) would improve results?
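For the Figure 9 suggestion, recall that Dex-Net's robustness is, roughly, the expected value of an analytic grasp-quality metric under uncertainty in object pose and friction. The Monte Carlo sketch below shows that estimation pattern with a toy quality function of our own invention, so the numbers themselves are meaningless; only the sampling structure carries over.

```python
import numpy as np

rng = np.random.default_rng(2)

def grasp_quality(grasp_center, object_center, mu):
    """Toy metric (NOT Dex-Net's): quality falls off with distance from
    the object's center of mass and rises with friction."""
    d = np.linalg.norm(grasp_center - object_center)
    return float(mu / (1.0 + 10.0 * d))

def robustness(grasp_center, object_center, n=1000):
    """Monte Carlo estimate of expected quality under perturbations."""
    total = 0.0
    for _ in range(n):
        pose_noise = rng.normal(0.0, 0.005, size=2)  # 5 mm pose std (assumed)
        mu = max(float(rng.normal(0.6, 0.1)), 0.05)  # friction sample (assumed)
        total += grasp_quality(grasp_center, object_center + pose_noise, mu)
    return total / n

obj = np.array([0.0, 0.0])
for g in (np.array([0.0, 0.0]), np.array([0.03, 0.0])):
    print(f"grasp at {g}: robustness ~ {robustness(g, obj):.3f}")
```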

Datasets, Code, and Documentation

All three are available from the Dex-Net project site: http://berkeleyautomation.github.io/dex-net


Announcing Udacity's First Workshop with Robot Garden, August 12th-13th!

Udacity is very excited to present our inaugural Robotics Nanodegree program workshop! Created in collaboration with Robot Garden, this workshop will be an opportunity to explore the field of Soft Robotics (in which robots have no rigid components). We'll work together as a group to create a soft robotic gripper that is capable of picking up a variety of different objects. All attendees will receive a complimentary one-month membership to Robot Garden, which will enable you to hone your skills with tools like 3D printers and laser cutters, as well as work with real robots like TurtleBots and Parrot drones!

Space is limited; sign up here! (There are two different events.)

https://www.meetup.com/Robot-Garden/events/242204060/

https://www.meetup.com/Robot-Garden/events/242362132/


Please sign the Robot Garden Waiver prior to the class: http://www.robotgarden.org/get-involved/waiver/

Soft Robotic Gripper: https://youtu.be/uPx8xwRpfFk

Contact Jim Berry for more details (@berryjam in Slack)