Evetac: An Event-based Optical Tactile Sensor for Robotic Manipulation

Niklas Funk, Erik Helmut, Georgia Chalvatzaki, Roberto Calandra, Jan Peters

Accepted at the IEEE Transactions on Robotics (T-RO)

Optical tactile sensors, in which an RGB camera captures an elastomer’s deformation, have recently become popular. They provide high spatial resolution; however, they struggle to offer human-like temporal resolution in the millisecond range. To overcome this shortcoming, we study the idea of replacing the RGB camera with an event-based camera and introduce a new high-resolution event-based optical tactile sensor called Evetac. Alongside the hardware design, we develop touch processing algorithms to obtain and process its measurements online at 1000 Hz. We devise an efficient algorithm that tracks the elastomer’s deformation through the imprinted markers despite the sensor’s sparse output. Benchmarking experiments demonstrate Evetac’s ability to sense vibrations of up to 498 Hz, reconstruct shear forces, and operate at significantly reduced data rates compared to RGB optical tactile sensors. Moreover, Evetac’s measurements and the marker tracking provide meaningful features for learning data-driven slip detection and prediction models. The learned models form the basis for a robust, reliable, and adaptive closed-loop grasp controller capable of handling a wide range of objects. We believe that fast and efficient event-based tactile sensors like Evetac will be essential for bringing human-like manipulation capabilities to robotics.
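To give a feel for the marker-tracking idea the abstract mentions, below is a minimal, hypothetical sketch of how sparse events could update marker estimates: each incoming event is assigned to the nearest tracked marker (within a radius) and nudges that marker's position via an exponential moving average. The function name, parameters, and update rule are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def update_markers(markers, events, radius=5.0, alpha=0.2):
    """Hypothetical sketch of event-driven marker tracking (NOT Evetac's
    actual algorithm): assign each sparse event (x, y) to the nearest
    marker within `radius` pixels, then shift that marker toward the
    event with an exponential moving average of weight `alpha`."""
    markers = np.asarray(markers, dtype=float).copy()
    for ex, ey in events:
        # Distance from this event to every current marker estimate.
        d = np.hypot(markers[:, 0] - ex, markers[:, 1] - ey)
        i = int(np.argmin(d))
        if d[i] <= radius:
            # EMA update keeps markers stable despite sparse, noisy events.
            markers[i] = (1 - alpha) * markers[i] + alpha * np.array([ex, ey])
    return markers

# Example: two markers; events near (12, 10) pull only the first marker.
markers = np.array([[10.0, 10.0], [30.0, 30.0]])
tracked = update_markers(markers, [(12.0, 10.0), (12.0, 10.0)])
```

Because only events near a marker contribute, such a scheme naturally exploits the event camera's sparse output and can run at kilohertz rates.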