Evetac
Event-based Optical Tactile Sensing for Robotic Manipulation
Optical tactile sensors, in which an RGB camera captures an elastomer’s deformation, have recently become popular. While they provide high spatial resolution, they struggle to offer human-like temporal resolution in the millisecond range. To overcome this shortcoming, we study the idea of replacing the RGB camera with an event-based camera and introduce a new high-resolution event-based optical tactile sensor called Evetac. Alongside the hardware design, we develop touch processing algorithms to obtain and process its measurements online at 1000 Hz. We devise an efficient algorithm to track the elastomer’s deformation through the imprinted markers despite the sensor’s sparse output. Benchmarking experiments demonstrate Evetac's capabilities of sensing vibrations up to 498 Hz, reconstructing shear forces, and offering significantly reduced data rates compared to RGB optical tactile sensors. Moreover, Evetac’s measurements and the marker tracking provide meaningful features for learning data-driven slip detection and prediction models. The learned models form the basis for a robust, reliable, and adaptive closed-loop grasp controller capable of handling a wide range of objects. We believe that fast and efficient event-based tactile sensors like Evetac will be essential for bringing human-like manipulation capabilities to robotics.
Paper Summary Video
This summary video covers most aspects of the paper: Evetac's raw output and touch processing, a comparison with an RGB optical tactile sensor, data collection for learning the slip detection & prediction models, and the performance of the closed-loop grasp controller.
Additional and longer videos of the experiments can be found further below.
Supplementary Material
Evetac Assembly Video
This video describes the process of assembling & setting up Evetac, including the calibration procedure for the dot tracker. For the list of parts, please click on this link.
Benchmarking Evetac - Evetac's output in comparison with an RGB Optical Tactile Sensor
This video illustrates the procedure for evaluating Evetac's data rate compared to that of a standard RGB optical tactile sensor (GelSight Mini). The maneuver includes grasping the object, perturbing it, and finally causing object slippage by opening the gripper.
Despite Evetac's increased sensing frequency of 1000 Hz, compared to the RGB optical tactile sensor's 25 Hz, its sparse output generates only 1.7% of the data produced by the RGB optical tactile sensor over the entire maneuver.
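The data-rate comparison boils down to simple arithmetic: total event bytes versus total frame bytes over the same maneuver. The sketch below illustrates this computation with hypothetical event counts, event size, and frame resolution — these numbers are placeholders, not the paper's measured figures.

```python
def data_ratio(num_events, bytes_per_event, num_frames, frame_bytes):
    """Fraction of data an event stream produces relative to a frame stream."""
    return (num_events * bytes_per_event) / (num_frames * frame_bytes)

# Hypothetical numbers for a 10 s maneuver (placeholders, not measured values):
# events at 8 bytes each vs. 25 Hz RGB frames at 320x240 pixels x 3 bytes.
frame_bytes = 320 * 240 * 3
ratio = data_ratio(num_events=100_000, bytes_per_event=8,
                   num_frames=250, frame_bytes=frame_bytes)
print(f"event stream produces {100 * ratio:.2f}% of the RGB data volume")
```

With a sparse contact signal, the event count — and hence the ratio — stays small even though the event stream is sampled 40x more often.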
Evaluating the Different Versions of the Dot Tracking Algorithm
These videos illustrate the experiments comparing the two versions of the dot tracking algorithm. On the left, we show the tactile interaction with the sensor; on the right, we show Evetac's output signal and the results from the dot tracking. In blue, we show the result of the regularized dot tracking; in red, the result of the unregularized version. The visualization displays the estimated dot centers as crosses for both algorithm versions. Additionally, arrows indicate how the dots have moved w.r.t. their initial positions. Note that in the initial frames, only blue crosses are visible, as they occlude the red ones located just behind them.
For both cases below, i.e., interacting with a fingertip or the tail of a pair of scissors, the unregularized version loses track of some of the dots. In contrast, the regularized dot tracker is robust w.r.t. the additional events triggered by the external moving object and does not lose track.
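The intuition behind regularization can be illustrated with a minimal sketch: each dot center is pulled toward the centroid of nearby events, while an additional term pulls it back toward a reference position, so spurious events from external motion cannot drag a tracker far away. This is a simplified stand-in, not the paper's exact algorithm; the radius and gain values are assumptions.

```python
import numpy as np

def update_dots(dots, rest, events, radius=5.0, alpha=0.3, lam=0.1):
    """One regularized tracking step (simplified sketch).

    dots:   (N, 2) current dot-center estimates [px]
    rest:   (N, 2) reference (initial) dot positions [px]
    events: (M, 2) pixel coordinates of events in the current time slice
    Each dot moves toward the centroid of events within `radius`; the
    regularization term (gain `lam`) pulls it back toward its reference,
    which keeps distant spurious events from hijacking the track.
    """
    new = dots.copy()
    for i, c in enumerate(dots):
        d = np.linalg.norm(events - c, axis=1)
        near = events[d < radius]
        if len(near):
            new[i] = c + alpha * (near.mean(axis=0) - c)
        new[i] += lam * (rest[i] - new[i])
    return new
```

Without the `lam` term, a dot that once latches onto events from an external object has no force restoring it — which mirrors the failure of the unregularized version in the videos.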
Tactile Interaction with a Fingertip
Speed: x1.0
Speed: x0.1
Tactile Interaction with the tail of Scissors
Speed: x1.0
Speed: x0.1
Regularized Tracking during Real Robot Grasping
These additional videos show the performance of the proposed regularized tracking algorithm during the robot grasping experiments presented in the paper. Despite the dots deforming during the manipulation, the tracker still tracks them well.
For more extreme gel deformations during manipulation, the tracking performance starts to degrade.
Slip Detection using Evetac
Reliable slip detection is a crucial task in robotics and has received a lot of attention. The task is especially important as any slippage indicates unstable contact between the finger, i.e., the sensor, and the object. To achieve stable grasping, any slippage requires a quick corrective action to prevent dropping the grasped object. Given the importance of robust and reliable slip detection for robotics, in this section we investigate Evetac's effectiveness for this task. Contrary to prior work, herein we focus on learning efficient slip detectors from labelled data only.
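To make the learning setup concrete, the sketch below pairs a hand-crafted feature vector (marker speeds and event count — an illustrative choice, not the paper's exact feature set) with a minimal logistic-regression training loop. The paper's learned models are more elaborate; this only shows the general shape of learning a slip detector from labelled windows.

```python
import numpy as np

def slip_features(dot_disp, event_count, dt=1e-3):
    """Features for one 1 ms window (illustrative, not the paper's exact set).

    dot_disp:    (N, 2) marker displacements since the last step [px]
    event_count: number of events in the window
    """
    speed = np.linalg.norm(dot_disp, axis=1) / dt
    return np.array([speed.mean(), speed.max(), float(event_count)])

def train_logreg(X, y, lr=0.1, steps=500):
    """Minimal logistic-regression training via gradient descent."""
    Xb = np.hstack([X, np.ones((len(X), 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))      # predicted slip probability
        w -= lr * Xb.T @ (p - y) / len(y)      # mean cross-entropy gradient
    return w

def predict_slip(w, x):
    """Binary slip decision for one feature vector."""
    return 1.0 / (1.0 + np.exp(-(np.append(x, 1.0) @ w))) > 0.5
```

Training amounts to collecting windows labelled slip/no-slip (as in the data collection videos below) and fitting the classifier on the extracted features.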
Slip Detection using Evetac - Objects
The table below lists the objects investigated in this work. They differ in size, weight, and material. In the material column of the table, G represents glass, M metal, P plastic, and Pa paper.
The training objects are used for training the data-driven slip detectors, while the testing objects are used for evaluation.
Slip Detection using Evetac - Training Data Collection
For the data-driven slip detectors, training data has to be generated, as shown in the following videos. Starting from a stable initial grasp, the parallel gripper is opened, causing the objects to slip. To the left and right of the external camera view, we show the readings from the left and right Evetac sensors. Between the external view and the images of the right Evetac sensor, a black bar is displayed. When the bar's color changes from black to red, the optical-flow based slip classifier detects slip in the readings of the right Evetac sensor.
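A simple way to realize such an optical-flow based labeler is to threshold the mean flow magnitude over the sensing area. The sketch below assumes a dense per-pixel flow field and a hand-tuned threshold — both are assumptions for illustration, not the paper's exact labeling pipeline.

```python
import numpy as np

def label_slip(flow, thresh=1.0):
    """Label a time window as slip if the mean flow magnitude exceeds `thresh`.

    flow:   (H, W, 2) per-pixel optical-flow vectors over the sensing area
    thresh: flow-magnitude threshold [px/window], a tuning assumption
    """
    mag = np.linalg.norm(flow, axis=-1)  # per-pixel flow magnitude
    return bool(mag.mean() > thresh)
```

Such automatically generated labels then serve as the supervision signal for the data-driven slip detection and prediction models.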
Obj 1
Speed: x1.0
Speed: x0.05
Obj 3
Speed: x1.0
Speed: x0.05
Slip Detection - Data Validation Through OptiTrack
Robot Grasp Control using Evetac
Leveraging one of the trained slip detection and prediction models, we implement a real-time grasp control loop for picking up and balancing the testing objects. The controllers are reactive and adaptively adjust the grasping force to achieve stable lifting and balancing, counteracting any sign of slip through corrective actions.
Note that to the right of the visualization of Evetac's measurements, we show a bar that changes its color from black to blue whenever the model detects slip.
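The reactive scheme can be sketched as a single high-rate control update: starting from a light grasp, the gripper tightens by a small step whenever the model signals slip, and otherwise holds its width. This is a simplified stand-in for the paper's controller; the step size and rate are assumptions.

```python
def grasp_control_step(width, slip_detected, close_step=0.0002, w_min=0.0):
    """One control update of a reactive grasp loop (simplified sketch).

    width:         current commanded gripper width [m]; closing = smaller width
    slip_detected: output of the slip detection/prediction model
    On slip, tighten the grasp by `close_step` (an assumed step size),
    clamped at `w_min`; otherwise hold the current width.
    """
    if slip_detected:
        width = max(w_min, width - close_step)
    return width
```

Starting from a soft initial grasp, repeated slip-triggered tightening converges to the minimal force that stabilizes the object, which is the behavior visible in the videos below.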
Experiment Validation with Object 17 - No Grasp Control. As explained in the paper, without any grasp control, the gripper simply slides over the object. This validates the soft initialization of the controller, i.e., initially the gripper only makes light contact with the object, applying insufficient force for stable grasping.
Online Grasp Control Experiment with Object 17 - This video illustrates successful grasping and stabilization of Object 17. In contrast to the left video, we deploy an online grasp controller that constantly adapts the grasping force to counteract object slippage. As shown, the controller achieves stable grasping and object lifting.
Online Grasp Control with Object 17
Online Grasp Control with Object 18
Online Grasp Control with Object 16
Online Grasp Control with Object 15
Online Grasp Control with Object 13
Online Grasp Control with Object 12
Online Grasp Control with Object 11
Online Grasp Control with Object 10
Online Grasp Control with Object 14 - failed balance
Online Grasp Control with Object 14 - success
Online Grasp Control with Object 19
Online Grasp Control with Object 20
Robustness Evaluation - Different Sensor
We evaluate the robustness of the slip detection models w.r.t. using a different Evetac sensor during the grasp control experiments. The sensor is mounted on the other side of the parallel gripper.
Online Grasp Control with Different Evetac - Object 19
Online Grasp Control with Different Evetac - Object 20
Online Grasp Control with Different Evetac - Object 16
Online Grasp Control with Different Evetac - Object 17
Robustness Evaluation - Different Sensor & Different (Closed) Gel
We also showcase that the slip detection models transfer to using the closed, original gel while still using the different Evetac sensor. These experiments therefore underline that our data collection procedure indeed yields slip detection models that generalize across sensors and gels. The sensor is mounted on the other side of the parallel gripper.
Online Grasp Control with Different Evetac & Closed Gel - Object 19
Online Grasp Control with Different Evetac & Closed Gel - Object 20
Online Grasp Control with Different Evetac & Closed Gel - Object 16
Online Grasp Control with Different Evetac & Closed Gel - Object 17
Robustness Evaluation - Sideways Grasps & Active Grasp Perturbation
Lastly, we evaluate the robustness of the previously introduced grasp controller in three specific scenarios. First, we show that the controller remains functional when the object is grasped sideways. Second, we showcase its reactivity to grasp perturbations by dropping a 20 g weight onto the grasped object. Third, we increase the perturbation and drop a 100 g weight onto the grasped object.
Online Grasp Control with Object 16 - Sideways Grasp
Online Grasp Control with Object 16 - Sideways Grasp
Online Grasp Control with Object 16 - Sideways Grasp & Drop 20g
Online Grasp Control with Object 16 - Sideways Grasp & Drop 100g