Evetac

Event-based Optical Tactile Sensing for Robotic Manipulation

Optical tactile sensors, in which an RGB camera captures an elastomer’s deformation, have recently become popular. They provide high spatial resolution but struggle to offer human-like temporal resolution in the millisecond range. To overcome this shortcoming, we study the idea of replacing the RGB camera with an event-based camera and introduce a new high-resolution event-based optical tactile sensor called Evetac. Alongside the hardware design, we develop touch processing algorithms to obtain and process its measurements online at 1000 Hz. We devise an efficient algorithm to keep track of the elastomer’s deformation through the imprinted markers despite the sensor’s sparse output. Benchmarking experiments demonstrate Evetac's capabilities of sensing vibrations up to 498 Hz, reconstructing shear forces, and offering significantly reduced data rates compared to RGB optical tactile sensors. Moreover, Evetac’s measurements and the marker tracking provide meaningful features for learning data-driven slip detection and prediction models. The learned models form the basis for a robust, reliable, and adaptive closed-loop grasp controller capable of handling a wide range of objects. We believe that fast and efficient event-based tactile sensors like Evetac will be essential for bringing human-like manipulation capabilities to robotics.

Paper Summary Video

This summary video covers most aspects of the paper. It focuses on Evetac's raw output and touch processing, comparison with an RGB optical tactile sensor, data collection for learning the slip detection & prediction models, and the performance of the closed-loop grasp controller.

More & longer videos of the experiments can be found below it.

Supplementary Material

Evetac Assembly Video

This video describes the process of assembling & setting up Evetac, including the calibration procedure for the dot tracker. For the list of parts, please click on this link.

Benchmarking Evetac - Evetac's output in comparison with an RGB Optical Tactile Sensor

This video illustrates the procedure for evaluating Evetac's data rate compared to that of a standard RGB optical tactile sensor (GelSight Mini). The maneuver consists of grasping the object, perturbing it, and finally causing object slippage by opening the gripper.

Despite Evetac's increased sensing frequency of 1000 Hz, compared to 25 Hz for the RGB optical tactile sensor, its sparse output generates only 1.7% of the data produced by the RGB optical tactile sensor over the entire maneuver.
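The intuition behind this saving can be illustrated with a back-of-the-envelope calculation. Note that the resolution, bytes-per-event, and event rate below are illustrative assumptions for the sketch, not the values measured in the paper:

```python
# Illustrative data-rate comparison: dense frames vs. sparse events.
# All numeric values below are assumptions for this sketch.

RGB_FPS = 25              # frames per second of the RGB tactile sensor
RGB_FRAME_BYTES = 320 * 240 * 3   # assumed resolution, 3 bytes per pixel

EVENT_BYTES = 8           # assumed packing: x, y, timestamp, polarity
EVENTS_PER_SECOND = 50_000  # assumed average event rate during contact

rgb_rate = RGB_FPS * RGB_FRAME_BYTES          # bytes/s, dense stream
event_rate = EVENTS_PER_SECOND * EVENT_BYTES  # bytes/s, sparse stream

print(f"RGB stream:   {rgb_rate / 1e6:.2f} MB/s")
print(f"Event stream: {event_rate / 1e6:.2f} MB/s")
```

Because the event camera only reports pixels whose brightness changes, the data rate scales with contact activity rather than with frame rate, which is why the event stream stays small even at a 1000 Hz processing rate.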

eb_vs_rgb-optical_x1.mp4

Evaluating the Different Versions of the Dot Tracking Algorithm

These videos illustrate the experiments comparing the two versions of the dot tracking algorithm. On the left, we show the tactile interaction with the sensor; on the right, Evetac's output signal and the dot tracking results. Blue shows the result of the regularized dot tracking, red the result of the unregularized version. For both versions, the visualization displays the estimated dot centers as crosses, and arrows indicate how the dots have moved w.r.t. their initial positions. Note that in the initial frames, only blue crosses are visible, as they occlude the red ones located directly behind them.

For both cases below, i.e., interacting with a fingertip or the tail of a pair of scissors, the unregularized version loses track of some of the dots. In contrast, the regularized dot tracker is robust w.r.t. the additional events triggered by the external moving object and does not lose track.
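Conceptually, the regularization couples neighboring dots so that spurious events cannot drag a single dot away from the grid. The following is a toy sketch of this idea under our own simplifications; the update rule, `radius`, and `lam` are assumptions, not the paper's exact algorithm:

```python
import numpy as np

def update_dots(dots, rest, events, neighbors, radius=5.0, lam=0.5):
    """One toy regularized tracking update.
    dots:      (N, 2) current dot-center estimates
    rest:      (N, 2) dot centers in the undeformed gel
    events:    (M, 2) pixel coordinates of recent events
    neighbors: list of neighbor-index lists, one per dot
    lam:       weight of the neighbor regularizer
    """
    new_dots = dots.copy()
    for i, c in enumerate(dots):
        # Data term: centroid of the events close to the current estimate.
        near = events[np.linalg.norm(events - c, axis=1) < radius]
        data_target = near.mean(axis=0) if len(near) else c
        # Regularizer: where the neighbors "expect" this dot to be,
        # assuming the rest-state offsets between dots are preserved.
        if neighbors[i]:
            reg_target = np.mean(
                [dots[j] + (rest[i] - rest[j]) for j in neighbors[i]], axis=0)
        else:
            reg_target = c
        new_dots[i] = (1 - lam) * data_target + lam * reg_target
    return new_dots
```

With `lam = 0`, each dot follows the nearby events alone (the unregularized behavior); with `lam > 0`, events triggered by an external moving object can only pull a dot partway before its neighbors hold it in place.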

Tactile Interaction with a Fingertip

no_reg_1_20_x1.mp4

Speed: x1.0

no_reg_1_20_x01.mp4

Speed: x0.1

Tactile Interaction with the Tail of Scissors

no_reg_5_x1.mp4

Speed: x1.0

no_reg_5_x01.mp4

Speed: x0.1

Regularized Tracking during Real Robot Grasping

These additional videos show the performance of the proposed regularized tracking algorithm during the robot grasping experiments presented in the paper. Despite the dots deforming during manipulation, the tracker still tracks them reliably.

slip_rr_bottle_heavy_11_2_x1.mp4
slip_rr_porridge_100_g_7_2_x1.mp4

For more extreme gel deformations during manipulation, the tracking performance starts to decrease.

diff_cam_diff_gel_elephant_full_control_2_2_x1.mp4

Slip Detection using Evetac

Reliable slip detection is a crucial task in robotics and has received considerable attention. The task is especially important as any slippage indicates unstable contact between the finger, i.e., the sensor, and the object. To achieve stable grasping, any slippage requires quick corrective action to prevent dropping the grasped object. Given the importance of robust and reliable slip detection for robotics, in this section we investigate Evetac's effectiveness for this task. Contrary to prior work, herein, we focus on learning efficient slip detectors from labelled data only.
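To make the learning setup concrete, here is a minimal toy sketch of fitting a binary slip classifier to labelled data. The feature choice (per-window event count and mean marker displacement), the synthetic data, and the logistic-regression model are our own assumptions, not the architecture used in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic labelled windows: slip windows (label 1) are assumed to show
# more events and larger marker motion than stable windows (label 0).
n = 200
X_no_slip = rng.normal([50.0, 0.1], [10.0, 0.05], size=(n, 2))
X_slip = rng.normal([300.0, 1.0], [50.0, 0.2], size=(n, 2))
X = np.vstack([X_no_slip, X_slip])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Standardize features and fit logistic regression by gradient descent.
X = (X - X.mean(axis=0)) / X.std(axis=0)
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted slip probability
    g = p - y                                # logistic-loss gradient signal
    w -= 0.1 * (X.T @ g) / len(y)
    b -= 0.1 * g.mean()

acc = (((1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5) == y).mean()
```

The same recipe carries over to richer models: replace the two handcrafted features with Evetac's raw event stream or tracked marker displacements, and the linear classifier with a learned network.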

Slip Detection using Evetac - Objects

The table below lists the objects investigated in this work. They differ in size, weight, and material. In the material column of the table, G denotes glass, M metal, P plastic, and Pa paper.

The training objects are used for training the data-driven slip detectors, while the testing objects are used for evaluation.

Slip Detection using Evetac - Training Data Collection

For the data-driven slip detectors, training data has to be generated, as shown in the following videos. Starting from a stable initial grasp, the parallel gripper is opened, causing the object to slip. To the left and right of the external camera view, we show the readings from the left and right Evetac sensors. Between the external view and the right Evetac sensor's readings, there is a black bar. When the bar's color changes from black to red, the optical-flow-based slip classifier detects slip in the right Evetac sensor's readings.
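The automatic labeling can be thought of as a threshold rule on the tracked marker motion. The sketch below is our own simplification with an assumed threshold; the paper's optical-flow-based classifier may differ in detail:

```python
import numpy as np

def label_slip(prev_dots, cur_dots, thresh=0.5):
    """Label a time window as slip when the mean marker displacement
    (in pixels) since the previous window exceeds a threshold.
    prev_dots, cur_dots: (N, 2) tracked dot centers."""
    displacement = np.linalg.norm(cur_dots - prev_dots, axis=1)
    return bool(displacement.mean() > thresh)
```

Applying such a rule to recorded sequences yields binary slip labels for every time window without manual annotation, which is what makes collecting large training sets practical.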

Obj 1

train_slip_data_obj1_10_x1.mp4

Speed: x1.0

train_slip_data_obj1_10_x005_crop.mp4

Speed: x0.05

Obj 3

train_slip_data_obj3_31_x1.mp4

Speed: x1.0

train_slip_data_obj3_31_x005_crop.mp4

Speed: x0.05

Slip Detection - Data Validation Through OptiTrack

Robot Grasp Control using Evetac

Leveraging one of the trained slip detection and prediction models, we implement a real-time grasp control loop for picking up and balancing the testing objects. The controller is reactive and adaptively adjusts the grasping force to realize stable lifting and balancing, counteracting any sign of slip with corrective actions.
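In spirit, one step of such a reactive loop can be sketched as follows. This is our own simplification; the gains, force limits, and interface are assumptions, not the controller from the paper:

```python
def grasp_control_step(force, slip_detected, f_min=1.0, f_max=40.0,
                       gain=2.0, decay=0.02):
    """Return the next grip-force command (all parameters are assumed
    values for illustration). Tighten quickly whenever the learned model
    signals slip; otherwise relax slowly toward a light grasp."""
    if slip_detected:
        force += gain    # fast corrective action on any sign of slip
    else:
        force -= decay   # slow relaxation to avoid over-squeezing
    return min(max(force, f_min), f_max)
```

Running this at the sensor's rate makes the grip force settle just above the minimum needed to hold the object, which is why the same loop handles objects of very different weights.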

Note that to the right of the visualization of Evetac's measurements, we show a bar that changes its color from black to blue whenever the model detects slip.

Experiment Validation with Object 17 - No Grasp Control. As explained in the paper, without any grasp control, the gripper simply slides over the object. This validates the soft initialization of the controller, i.e., initially the gripper only makes light contact with the object, applying insufficient force for stable grasping.

bottle_slide.mp4

Online Grasp Control experiment with Object 17 - This video illustrates successful grasping and stabilization of Object 17. In contrast to the previous video, we deploy an online grasp controller which constantly adapts the grasping force to counteract object slippage. As shown, the controller achieves stable grasping and object lifting.

slip_rr_bottle_light_4_x1.mp4

Online Grasp Control with Object 17

slip_rr_bottle_light_6_x1.mp4

Online Grasp Control with Object 18

slip_rr_bottle_heavy_10_x1.mp4

Online Grasp Control with Object 16

slip_rr_porridge_normal_12_x1.mp4

Online Grasp Control with Object 15

slip_rr_sanetizer_1_x1.mp4

Online Grasp Control with Object 13

slip_rr_meridol_2_x1.mp4

Online Grasp Control with Object 12

slip_rr_pen_6_x1.mp4

Online Grasp Control with Object 11

slip_rr_axe_1_x1.mp4

Online Grasp Control with Object 10

slip_rr_balsamico_6_x1_extended.mp4

Online Grasp Control with Object 14 - failed balance

slip_rr_pearls_normal_3_x1.mp4

Online Grasp Control with Object 14 - success

slip_rr_pearls_normal_7_x1.mp4

Online Grasp Control with Object 19

mandarine_full_control_6_x1.mp4

Online Grasp Control with Object 20

elephant_full_control_5_x1.mp4

Robustness Evaluation - Different Sensor

We evaluate the robustness of the slip detection models when using a different Evetac sensor during the grasp control experiments. The sensor is mounted on the other side of the parallel gripper.

Online Grasp Control with Different Evetac - Object 19

diff_cam_mandarine_full_control_5_x1.mp4

Online Grasp Control with Different Evetac - Object 20

diff_cam_elephant_full_control_2_x1.mp4

Online Grasp Control with Different Evetac - Object 16

diff_cam_porridge_full_control_1_x1.mp4

Online Grasp Control with Different Evetac - Object 17

diff_cam_light_bottle_full_control_3_x1.mp4

Robustness Evaluation - Different Sensor & Different (Closed) Gel

We also showcase that the slip detection models transfer to the closed, original gel while still using the different Evetac sensor. These experiments underline that our data collection procedure indeed yields slip detection models that generalize across sensors and gels during the grasp control experiments. The sensor is mounted on the other side of the parallel gripper.

Online Grasp Control with Different Evetac & Closed Gel - Object 19

diff_cam_closed_gel_mandarine_full_control_5_x1.mp4

Online Grasp Control with Different Evetac & Closed Gel - Object 20

diff_cam_diff_gel_elephant_full_control_2_x1.mp4

Online Grasp Control with Different Evetac & Closed Gel - Object 16

diff_cam_closed_gel_porridge_full_control_1_x1.mp4

Online Grasp Control with Different Evetac & Closed Gel - Object 17

diff_cam_closed_gel_light_bottle_full_control_1_x1.mp4

Robustness Evaluation - Sideways Grasps & Active Grasp Perturbation

Lastly, we evaluate the robustness of the previously introduced grasp controller in three specific scenarios. First, we show that the controller remains functional if the object is grasped sideways. Second, we showcase reactivity w.r.t. grasp perturbations by dropping a 20 g weight onto the grasped object. Third, we increase the perturbation and drop a 100 g weight onto the grasped object.

Online Grasp Control with Object 16 - Sideways Grasp

slip_rr_porridge_sideways_2_x1.mp4

Online Grasp Control with Object 16 - Sideways Grasp

slip_rr_porridge_sideways_3_x1.mp4

Online Grasp Control with Object 16 - Sideways Grasp & Drop 20g

porridge_sideways_20g_full_control_1_x1.mp4

Online Grasp Control with Object 16 - Sideways Grasp & Drop 100g

porridge_sideways_100g_full_control_1_x1.mp4