Tactile Object Identification

Classical robotic approaches to tactile object identification often involve rigid mechanical grippers, dense sensor arrays, and exploratory procedures (EPs). Though EPs are a natural method for humans to acquire object information, there is also evidence that meaningful tactile properties can be inferred from brief, non-exploratory motions (a 'haptic glance'). In this work, we implement tactile object identification and feature extraction techniques on data acquired during a single, unplanned grasp with a simple, underactuated robot hand equipped with inexpensive barometric pressure sensors. Our methodology uses two cooperating schemes: one based on a machine learning technique (random forests) and one based on parametric methods that estimate object properties. The available data are limited to actuator positions (one per two-link finger) and force sensor values (eight per finger). The schemes can operate either independently or collaboratively, depending on the task scenario. When collaborating, the results of each method contribute to the other, improving the overall result synergistically. Unlike prior work, the proposed approach does not require object exploration, re-grasping, grasp release, or force modulation, and it works for arbitrary object start positions and orientations. These factors allow the technique to be integrated into practical robotic grasping scenarios without adding time or manipulation overheads.
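As a rough illustration of the learning-based scheme, the sketch below trains a random-forest classifier on flattened single-grasp feature vectors. The hand configuration (three fingers, each with one actuator position and eight pressure readings), the number of object classes, and the synthetic data are all assumptions for the example; they are not taken from the abstract, which only specifies the per-finger sensor counts.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Assumed hand configuration (illustrative only): 3 fingers, each
# contributing 1 actuator position and 8 barometric pressure readings,
# i.e. a 27-dimensional feature vector per grasp.
N_FINGERS = 3
FEATURES_PER_FINGER = 1 + 8
N_FEATURES = N_FINGERS * FEATURES_PER_FINGER
N_OBJECTS = 5  # hypothetical number of object classes

# Synthetic stand-in for a training set: one row per recorded grasp.
X_train = rng.normal(size=(200, N_FEATURES))
y_train = rng.integers(0, N_OBJECTS, size=200)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

# Classify one new grasp from its raw sensor snapshot alone:
# no exploration, re-grasping, or force modulation is modeled.
x_new = rng.normal(size=(1, N_FEATURES))
pred = int(clf.predict(x_new)[0])
print(pred)
```

In practice the feature vector would be built from the hand's actual actuator encoders and pressure sensors at the moment of grasp, and the labels from the known training objects.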

Single-Grasp Object Classification and Feature Extraction with Simple Robot Hands and Tactile Sensors