NTE22: Neuromorphic tactile exploration

Topic Leaders

Staff

  • S. Mueller-Cleve, Istituto Italiano di Tecnologia

  • L. Khacef, University of Groningen

  • M. Cartiglia, University of Zürich/ETH Zürich

Goal

We will exploit neuromorphic event-driven encoding, biologically inspired computation and learning, and spiking neural networks with their supporting hardware processors for the physical interaction of artificial agents with the external world. Tactile and proprioceptive sensing are paramount for perceiving the physical characteristics of objects, such as softness, weight, shape, and texture. They help the robot build solid knowledge about the objects it has to interact with and plan adequate control strategies for grip control, manipulation, and tactile exploration.

Tactile exploration of objects

Leaders: C. Bartolozzi, J. Triesch, E. Donati

This project focuses on the exploration of tactile stimuli (surfaces and objects) to learn their properties. We will use the Omega robot (6DoF) and the iCub robot (both in simulation and remotely, with support from the IIT group) to perform exploratory actions of increasing complexity. The robot will slide its fingertips over different objects, starting from planar shapes up to complex 3D objects. Tactile information will be used to guide the exploratory movements of the robot: contact detection and the pattern and orientation of the stimulus will be used to find and follow the contour of objects. The object at hand will be recognised by integrating information from multiple sensory modalities over time: tactile sensors for texture and orientation, and fingertip trajectories and movements from proprioception (motor encoders, position sensors in the fingers, F/T sensors in the joints, etc.). Eye movements can complement the tactile and proprioceptive information, allowing us to study the effect of sensor fusion on classification.
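To make the exploratory control loop concrete, here is a minimal, self-contained sketch (plain Python/NumPy, not the workshop code base) of how fingertip taxel activations could drive contour following: the contact centroid and the local edge orientation are estimated from the active taxels, and the fingertip is commanded to move tangentially to the edge while a small correction keeps the contact centred. The taxel layout, threshold, and gains are illustrative assumptions.

```python
import numpy as np

# Hypothetical 4x4 layout of fingertip taxels (metres) and a contact threshold.
TAXEL_XY = np.array([[x, y] for x in np.linspace(-5e-3, 5e-3, 4)
                            for y in np.linspace(-5e-3, 5e-3, 4)])
CONTACT_THRESHOLD = 0.1   # normalised taxel activation treated as "in contact"

def contour_step(taxel_activations, speed=2e-3):
    """Return a fingertip velocity (vx, vy) that follows the local edge.

    taxel_activations: array of shape (16,), normalised in [0, 1].
    The edge tangent is estimated as the principal axis of the active taxels.
    """
    active = taxel_activations > CONTACT_THRESHOLD
    if not np.any(active):
        return np.array([0.0, -speed])          # no contact: move to re-acquire it
    pts = TAXEL_XY[active]
    if pts.shape[0] < 2:
        return np.array([speed, 0.0])           # too few taxels for an edge estimate
    w = taxel_activations[active]
    centroid = np.average(pts, axis=0, weights=w)
    # Principal axis of the contact patch approximates the edge tangent.
    cov = np.cov((pts - centroid).T, aweights=w)
    eigvals, eigvecs = np.linalg.eigh(cov)
    tangent = eigvecs[:, np.argmax(eigvals)]
    # Small corrective term pulls the contact centroid back under the fingertip centre.
    v = speed * tangent - 0.5 * centroid
    return speed * v / (np.linalg.norm(v) + 1e-9)

# Example: a fake activation pattern with an edge running along the grid's x axis.
fake = np.array([1.0 if abs(xy[1]) < 2e-3 else 0.0 for xy in TAXEL_XY])
print(contour_step(fake))
```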

Tactile exploration for object recognition will be the context in which to study the use of event-driven sensors and spiking neural networks for decision making to guide the exploratory actions of the robot, multisensory fusion, and spatio-temporal pattern recognition.
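As an illustration of the kind of spatio-temporal processing involved, the sketch below (plain Python/NumPy, independent of the Loihi/SpiNNaker tooling) feeds a stream of address-event tactile spikes into a small layer of leaky integrate-and-fire neurons whose output spike counts could serve as features for classification. The neuron counts, time constants, and random input weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N_TAXELS, N_NEURONS = 16, 8          # illustrative sizes
DT, TAU, V_TH = 1e-3, 20e-3, 1.0     # time step, membrane time constant, threshold
W = rng.normal(0.0, 0.5, size=(N_NEURONS, N_TAXELS))  # random input weights

def lif_layer(events, n_steps):
    """Run a layer of leaky integrate-and-fire neurons on tactile events.

    events: list of (time_step, taxel_index) tuples (address-event style).
    Returns the per-neuron output spike counts over the window.
    """
    v = np.zeros(N_NEURONS)                  # membrane potentials
    spike_counts = np.zeros(N_NEURONS, dtype=int)
    # Bin the address events into a (n_steps, N_TAXELS) input spike raster.
    raster = np.zeros((n_steps, N_TAXELS))
    for t, taxel in events:
        raster[t, taxel] += 1.0
    decay = np.exp(-DT / TAU)
    for t in range(n_steps):
        v = decay * v + W @ raster[t]        # leak plus weighted input current
        fired = v >= V_TH
        spike_counts += fired
        v[fired] = 0.0                       # reset after a spike
    return spike_counts

# Example: a burst of events on taxels 0-3 early in the window.
demo_events = [(t, taxel) for t in range(5, 20) for taxel in range(4)]
print(lif_layer(demo_events, n_steps=100))
```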

Check out the template code we wrote for a quick start:

Contour following: https://github.com/event-driven-robotics/telluride_tac21_contour

Docker image: https://hub.docker.com/repository/docker/eventdrivenrobotics/telluride

A video tutorial on how to install and run the code is below!

Tactile exploration of tissue for tumor detection

Leaders: C. Bartolozzi, E. Donati, N. Thakor

This project focuses on the tactile inspection of soft tissue to detect the size and depth of small, hard inclusions. This is instrumental for the early detection of tumors and can support clinicians in diagnosis. The robot will first perform stereotyped inspection movements (pressing in the z direction at random x,y positions) and, when a possible tumor has been detected, will optimize its control strategy to gather information about depth, size, roughness, and shape. We will use the Omega robot and tissue-like samples made of silicone rubber with 3D-printed inclusions.
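The two-stage strategy described above could look like the following minimal sketch (plain Python/NumPy, not tied to the Omega robot API): the probe is pressed at random (x, y) positions, a stiffness estimate is obtained from the force-indentation ratio, and positions whose stiffness exceeds a threshold are flagged as candidate inclusions for a denser scan around them. The simulated tissue model, thresholds, and grid parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulated_stiffness(x, y, inclusion_xy=(0.02, 0.03), base=200.0, bump=800.0):
    """Toy tissue model (N/m): soft background plus a stiffer buried inclusion."""
    d2 = (x - inclusion_xy[0])**2 + (y - inclusion_xy[1])**2
    return base + bump * np.exp(-d2 / (2 * 0.005**2))

def probe(x, y, depth=2e-3, noise=10.0):
    """Press the probe by `depth` at (x, y); estimate stiffness as force/depth."""
    force = simulated_stiffness(x, y) * depth + rng.normal(0.0, noise * depth)
    return force / depth

def coarse_scan(n_probes=40, workspace=0.05, threshold=400.0):
    """Stage 1: random (x, y) presses; return positions flagged as candidates."""
    candidates = []
    for _ in range(n_probes):
        x, y = rng.uniform(0.0, workspace, size=2)
        if probe(x, y) > threshold:
            candidates.append((x, y))
    return candidates

def fine_scan(center, half_width=0.005, n=5):
    """Stage 2: dense grid around a candidate to estimate the inclusion's extent."""
    xs = np.linspace(center[0] - half_width, center[0] + half_width, n)
    ys = np.linspace(center[1] - half_width, center[1] + half_width, n)
    return np.array([[probe(x, y) for x in xs] for y in ys])

candidates = coarse_scan()
print(f"{len(candidates)} candidate inclusion(s) found")
if candidates:
    print(np.round(fine_scan(candidates[0]), 1))
```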

  • How to use TouchSim

Loihi Resources

Note: only participants who have signed the agreement with Intel will get access to the INRC cloud material, from the moment they sign the agreement until the end of Telluride.

Available resources

  • General-purpose spike-based neuromorphic processors: Loihi, SpiNNaker

  • Neuromorphic sensors: tactile sensors from IIT, Johns Hopkins, and NUS; ATIS cameras; and RGB cameras for ground truth.

  • iCub, accessible remotely with support, equipped with ATIS and tactile sensors (plus encoders, IMU, and force-torque sensors), with an online connection to Loihi (Kapoho Bay) and/or SpiNNaker (48-node board). The robot can output both synchronous absolute readouts from vision and tactile sensors (for ground truth) and event-driven readouts.

  • Software: NxSDK (Intel Loihi API), YARP (robot middleware), the event-driven library, and the iCub Gazebo simulator equipped with tactile sensors and event generators for vision and touch, all in a Docker container (a minimal YARP read sketch follows this list).

  • iCub skin patch on a desk with a 6DoF robot with force sensors to apply controlled stimuli.

  • Example subthreshold mixed-mode circuits and simulations.

  • Pre-recorded tactile datasets, e.g. MNIST-like data, a tactile textures dataset, and a vision+tactile dataset.
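For remote work with the iCub setup, the sketch below shows how tactile data could be read through the YARP Python bindings, assuming the bindings are installed (e.g. inside the provided Docker container) and a yarpserver is reachable. The port name /icub/skin/left_hand is an illustrative assumption; actual port names depend on the robot or simulator configuration.

```python
import yarp

# Initialise the YARP network (a yarpserver must be reachable).
yarp.Network.init()

# Open a local input port and connect it to a (hypothetical) skin output port.
reader = yarp.BufferedPortBottle()
reader.open("/tactile_reader:i")
yarp.Network.connect("/icub/skin/left_hand", "/tactile_reader:i")

try:
    for _ in range(100):
        bottle = reader.read()               # blocking read of one skin sample
        if bottle is None or bottle.size() == 0:
            continue
        # Each element of the bottle is one taxel's pressure reading.
        taxels = [bottle.get(i).asFloat64() for i in range(bottle.size())]
        print(f"{len(taxels)} taxels, max pressure {max(taxels):.1f}")
finally:
    reader.close()
    yarp.Network.fini()
```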