Tactile Servoing Framework

Introduction:

The advent of sensor arrays providing tactile feedback with high spatial and temporal resolution calls for new control strategies that exploit this valuable sensory channel for grasping and manipulation tasks.

In this work, we introduce a control framework to realize a whole set of tactile servoing tasks, i.e. control tasks that aim to realize a specific tactile interaction pattern. This includes simple tasks such as tracking a touched object or maintaining both contact location and contact force, as well as more elaborate tasks such as tracking an object's pose or tactile object exploration. Exploiting methods known from image processing, we introduce robust feature extraction methods to estimate the 2D contact position, the contact force, and the orientation of an object edge in contact with the sensor. The flexible control framework allows us to adapt the PID-type controller to a wide range of tasks by specifying a projection matrix that toggles individual control components on and off.
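To make the idea concrete, the following minimal sketch illustrates, under our own assumptions, how such tactile features might be extracted from a pressure image via image moments, and how a diagonal selection/projection matrix could gate the components of a PID-type tactile servoing law. Function names, gains, and the feature ordering (x, y, force, orientation) are illustrative and not taken from the referenced paper.

```python
import numpy as np

def extract_tactile_features(pressure):
    """Estimate contact position, force, and edge orientation from a tactile
    pressure image (e.g. a 16x16 array) using image moments.
    The formulas are a plausible illustration, not the paper's implementation."""
    total = pressure.sum()
    if total < 1e-9:
        return None  # no contact detected
    ys, xs = np.mgrid[0:pressure.shape[0], 0:pressure.shape[1]]
    # Contact position: pressure-weighted centroid (zeroth and first moments).
    cx = (xs * pressure).sum() / total
    cy = (ys * pressure).sum() / total
    # Contact force: assumed proportional to the summed cell pressures.
    force = total
    # Edge orientation: principal axis from central second moments.
    mu20 = ((xs - cx) ** 2 * pressure).sum() / total
    mu02 = ((ys - cy) ** 2 * pressure).sum() / total
    mu11 = ((xs - cx) * (ys - cy) * pressure).sum() / total
    theta = 0.5 * np.arctan2(2.0 * mu11, mu20 - mu02)
    return np.array([cx, cy, force, theta])

def tactile_servo_step(features, target, selection, kp, ki, kd, state, dt):
    """One PID-type tactile servoing update: the diagonal selection (projection)
    matrix toggles individual error components (x, y, force, orientation)
    on or off, so the same controller structure serves different tasks."""
    error = selection @ (target - features)
    state["integral"] += error * dt
    derivative = (error - state["prev_error"]) / dt
    state["prev_error"] = error
    # Output lives in tactile feature space; a task-dependent mapping would
    # convert it into a Cartesian twist of the sensor/fingertip.
    return kp @ error + ki @ state["integral"] + kd @ derivative
```

For instance, `selection = np.diag([0, 0, 1, 0])` would reduce the controller to pure contact-force regulation, whereas the identity matrix combines position, force, and orientation control in a single loop.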

We demonstrate and evaluate the capabilities of the proposed control framework in a series of experiments employing a 16×16 tactile sensor array, attached as a large fingertip to a Kuka LWR arm, as well as on the iCub platform.

Experiment:

Video 2: Demonstration of tactile servoing and admittance control on the iCub platform

Reference paper:

Qiang Li, Carsten Schürmann, Robert Haschke, Helge Ritter, "A Control Framework for Tactile Servoing", oral presentation at Robotics: Science and Systems (RSS) 2013

External cooperation:

Herke van Hoof - Autonomous Systems Lab, TU Darmstadt