REPLAB: A Reproducible Low-Cost Arm Benchmark Platform For Robotic Learning
Brian Yang, Jesse Zhang, Vitchyr Pong, Sergey Levine, and Dinesh Jayaraman
Abstract
Standardized evaluation measures have aided in the progress of machine learning approaches in disciplines such as computer vision and machine translation. In this paper, we make the case that robotic learning would also benefit from benchmarking, and present the "REPLAB" platform for benchmarking vision-based manipulation tasks. REPLAB is a reproducible and self-contained hardware stack (robot arm, camera, and workspace) that costs about 2000 USD, occupies a cuboid of size 70x40x60 cm, and permits full assembly within a few hours. Through this low-cost, compact design, REPLAB aims to drive wide participation by lowering the barrier to entry into robotics and to enable easy scaling to many robots. We envision REPLAB as a framework for reproducible research across manipulation tasks, and as a step in this direction, we define a template for a grasping benchmark consisting of a task definition, evaluation protocol, performance measures, and a dataset of 92k grasp attempts. We implement, evaluate, and analyze several previously proposed grasping approaches to establish baselines for this benchmark. Finally, we also implement and evaluate a deep reinforcement learning approach for 3D reaching tasks on our REPLAB platform.
Full Technical Report: https://arxiv.org/abs/1905.07447
A short version of this report was published at ICRA 2019 as: Brian Yang, Dinesh Jayaraman, and Sergey Levine, "REPLAB: A Reproducible Low-Cost Arm Benchmark for Robotic Learning".
Follow us!
To hear news about major updates from us, please sign up at: https://forms.gle/yy7RLa2U6ioxWcJKA
Hardware
Shopping list
- 1 x WidowX arm (link)
- 40 series T-slotted profiles (link)
- 2 x 30 cm
- 1 x 38 cm
- 4 x 50 cm
- 4 x 70 cm
- 18 x 40 series corner brackets (link)
- 36 x M8 bolt assemblies (link)
- 1 x 40 series slotted inside corner bracket (link)
- 1 x 1/4-20 x 9/16" screw (link)
- Intel RealSense SR300 camera (link)
- Toy objects
Laser cutting templates
Templates (link) include:
- Arm mounting plate
- Arena floor
- Arena fencing
If you do not have access to a laser cutter, we recommend using one of the many online laser cutting services.
Assembly Instructions
Instructions for cell assembly can be found on the Assembly page.
Software
Camera Setup
Before running the Docker image, set up the drivers for the Intel RealSense SR300 camera. Detailed instructions for installing the drivers can be found here.
Docker Image
Pull our Docker image for operating a cell directly with:
docker pull bhyang12/replab
To run the image:
docker run -it --rm --privileged bhyang12/replab
If you are operating multiple rigs on one machine, you may need to manually specify which ports and devices the Docker container can access. For example:
docker run -it --rm --device=/dev/video0 --device=/dev/video1 --device=/dev/ttyUSB0 bhyang12/replab
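The device paths above are only examples and will differ across machines and rigs. As a minimal sketch (the device paths and image name are taken from the commands above, but verify the actual nodes on your machine with `ls /dev`), the `--device` flags can be assembled from whichever devices are actually present before launching:

```shell
# Build a `docker run` command that passes through only the devices
# present on this host. The listed paths are examples; check `ls /dev`
# to find the camera and serial nodes belonging to each rig.
CMD="docker run -it --rm"
for dev in /dev/video0 /dev/video1 /dev/ttyUSB0; do
  if [ -e "$dev" ]; then
    CMD="$CMD --device=$dev"
  fi
done
CMD="$CMD bhyang12/replab"
echo "$CMD"   # inspect the assembled command before running it
```

On a machine where none of these devices are present, this prints the plain base command with no `--device` flags; otherwise it adds one flag per detected device.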
To run with GPU and display port access (requires nvidia-docker 2), use:
docker run --runtime=nvidia -e NVIDIA_DRIVER_CAPABILITIES=compute,utility -e NVIDIA_VISIBLE_DEVICES=all -it --net=host --env="DISPLAY" --volume="$HOME/.Xauthority:/root/.Xauthority:rw" --rm --privileged bhyang12/replab
Operating the Cell
Instructions for using our provided scripts can be found in our GitHub repository here.
Downloads
Grasp Dataset
Shared dataset of grasps (download link) [92k grasps, 73GB zip file, 226GB unzipped]
Need Help?
Contact replabbenchmark@gmail.com for any questions or concerns.