A Billion Ways to Grasp
An Evaluation of Grasp Sampling Schemes on a Dense, Physics-based Grasp Data Set
Clemens Eppner, Arsalan Mousavian, Dieter Fox
Robot grasping is often formulated as a learning problem. With the increasing speed and quality of physics simulations, generating large-scale grasping data sets that feed learning algorithms is becoming more and more popular. An often overlooked question is how to generate the grasps that make up these data sets. In this paper, we review, classify, and compare different grasp sampling strategies. Our evaluation is based on a fine-grained discretization of SE(3) and uses physics-based simulation to evaluate the quality and robustness of the corresponding parallel-jaw grasps. Specifically, we consider more than 1 billion grasps for each of the 21 objects from the YCB data set. This dense data set lets us evaluate existing sampling schemes w.r.t. their bias and efficiency. Our experiments show that some popular sampling schemes contain significant bias and do not cover all possible ways an object can be grasped.
Data
Download: https://zenodo.org/record/4713945
Structure: The data set consists of 21 h5 files, one per object, describing the grasps of 21 objects from the YCB object set. Each file contains the following key-value pairs:
object: string identifying the object mesh file name
object_class: string identifying the object category
object_com: list of 3 floats describing the object's center of mass
object_scale: float describing the scale of the mesh [m]
poses: N×7 array of floats describing the grasp poses: position (x, y, z) followed by a unit quaternion (qx, qy, qz, qw)
quality_flex_success: N booleans, describing grasp success (object still between fingers after shaking motion)
quality_flex_pregrasp_linear: N floats, accumulated linear motion during finger closing for each grasp
quality_flex_pregrasp_angular: N floats, accumulated angular motion during finger closing for each grasp
quality_flex_postgrasp_linear: N floats, accumulated linear motion during shaking motion for each grasp
quality_flex_postgrasp_angular: N floats, accumulated angular motion during shaking motion for each grasp
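The keys above can be read with any HDF5 library. As a minimal sketch using h5py, the snippet below first writes a tiny synthetic file that mimics the documented structure (the file name and all values are made up for illustration) and then reads the poses and success flags back:

```python
import h5py
import numpy as np

# Write a small synthetic file mimicking the documented structure.
# Key names follow the list above; the object name, N=4 grasps,
# and all values are hypothetical.
with h5py.File("example_grasps.h5", "w") as f:
    f["object"] = "003_cracker_box.obj"
    f["object_class"] = "cracker_box"
    f["object_com"] = [0.0, 0.0, 0.05]
    f["object_scale"] = 1.0
    f["poses"] = np.zeros((4, 7), dtype=np.float32)
    f["poses"][:, 6] = 1.0  # identity quaternions (qw = 1)
    f["quality_flex_success"] = np.array([True, False, True, True])

# Read it back the same way the real files would be read.
with h5py.File("example_grasps.h5", "r") as f:
    poses = f["poses"][:]                     # N×7: position + quaternion (xyzw)
    success = f["quality_flex_success"][:]    # N booleans

print(poses.shape)       # (4, 7)
print(success.mean())    # fraction of successful grasps, here 0.75
```

For the real files, the fraction of successful grasps per object can be computed the same way from `quality_flex_success`.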
If you use the data please cite:
@inproceedings{EppnerISRR2019,
title = {A Billion Ways to Grasp - An Evaluation of Grasp Sampling Schemes on a Dense, Physics-based Grasp Data Set},
author = {Clemens Eppner and Arsalan Mousavian and Dieter Fox},
year = {2019},
booktitle = {Proceedings of the International Symposium on Robotics Research ({ISRR})},
address = {Hanoi, Vietnam}
}
Real-World Executions