NTE22: Neuromorphic tactile exploration
Topic Leaders
Chiara Bartolozzi, Istituto Italiano di Tecnologia
Elisa Donati, University of Zürich/ETH Zürich
Staff
S. Mueller-Cleve, Istituto Italiano di Tecnologia
L. Khacef, University of Groningen
M. Cartiglia, University of Zürich/ETH Zürich
Invitees
Veronica Santos, UCLA
Goal
We will exploit neuromorphic event-driven encoding, biologically inspired computation and learning, spiking neural networks, and their supporting hardware processors for the physical interaction of artificial agents with the external world. Tactile and proprioceptive sensing are paramount for perceiving the physical characteristics of objects, such as softness, weight, shape, and texture. This information helps the robot build solid knowledge about the objects it has to interact with and plan adequate control strategies for grip control, manipulation, and tactile exploration.
Tactile exploration of objects
Leaders: C. Bartolozzi, J. Triesch, E. Donati
This project focuses on the exploration of tactile stimuli (surfaces and objects) to learn their properties. We will use the Omega robot (6 DoF) and the iCub robot (both in simulation and remotely, with support from the IIT group) to perform exploratory actions of increasing complexity. The robot will slide its fingertips over different objects, starting from planar shapes up to complex 3D objects. Tactile information will be used to guide the exploratory movements of the robot: contact detection and the pattern and orientation of the stimuli will be used to find and follow the contour of objects. Recognition of the object at hand will be performed by integrating information from multiple sensory modalities over time: tactile sensing for texture and orientation, and fingertip trajectories and movements from proprioception (motor encoders, position sensors in the fingers, F/T sensors in the joints, etc.). Eye movements can complement the tactile and proprioceptive information, allowing us to study the effect of sensor fusion on classification.
Tactile exploration for object recognition will be the context in which to study the use of event-driven sensors and spiking neural networks for decision making to guide the robot's exploratory actions, multisensory fusion, and spatio-temporal pattern recognition.
Check out the template code we wrote for a quick start:
Contour following: https://github.com/event-driven-robotics/telluride_tac21_contour
Docker image: https://hub.docker.com/repository/docker/eventdrivenrobotics/telluride
A video tutorial on how to install and run the code is below!
Tactile exploration of tissue for tumor detection
Leaders: C. Bartolozzi, E. Donati, N. Thakor
This project focuses on the tactile inspection of soft tissue to detect the size and depth of small, hard inclusions. This is instrumental for the early detection of tumors and can support clinicians in diagnosis. The robot will first perform stereotyped inspection movements (pressing in the z direction at random x, y positions) and, once a possible tumor has been detected, will optimize its control strategy to gather information about depth, size, roughness, and shape. We will use the Omega robot and tissue-like samples made of silicone rubber with 3D-printed inclusions.
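During the z-direction pressing described above, a simple cue for a hard inclusion is the local stiffness, i.e. the slope of the force-vs-indentation curve at each (x, y) position. The sketch below illustrates this with a least-squares slope estimate; the function name and the numbers are illustrative, not measured data from the silicone samples.

```python
# Sketch: estimate local tissue stiffness as the slope of the
# force-vs-indentation curve recorded while pressing in z at a fixed
# (x, y) position. A hard inclusion under the surface shows up as a
# noticeably higher slope than the surrounding soft material.
# All values are illustrative, not measured data.

def stiffness(depths_mm, forces_n):
    """Least-squares slope of force against indentation depth (N/mm)."""
    n = len(depths_mm)
    mean_x = sum(depths_mm) / n
    mean_y = sum(forces_n) / n
    num = sum((x - mean_x) * (y - mean_y)
              for x, y in zip(depths_mm, forces_n))
    den = sum((x - mean_x) ** 2 for x in depths_mm)
    return num / den

soft = stiffness([0, 1, 2, 3], [0.0, 0.2, 0.4, 0.6])   # plain tissue
hard = stiffness([0, 1, 2, 3], [0.0, 0.5, 1.1, 1.6])   # over an inclusion
print(soft, hard, hard > soft)
```

A stiffness map over the sampled (x, y) grid can then flag candidate positions, after which the robot switches to the finer, adaptive palpation strategy described above.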
How to use TouchSim
How to use SpyTorch for learning SNNs
Notebooks: https://github.com/fzenke/spytorch
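The core trick in SpyTorch is the surrogate gradient: the spike nonlinearity is a Heaviside step in the forward pass, but its gradient is replaced by the derivative of a fast sigmoid (SuperSpike) so that backpropagation can flow through spikes. The pure-Python sketch below shows just the two functions involved; SpyTorch itself implements this as a custom torch.autograd.Function, and the function names here are illustrative.

```python
# Sketch of the surrogate-gradient idea used by SpyTorch for training
# spiking neural networks. Forward pass: a hard threshold (Heaviside).
# Backward pass: the derivative of a fast sigmoid (SuperSpike), which
# is smooth and nonzero near threshold, so gradients can propagate.
# Function names are illustrative, not SpyTorch's API.

def spike_forward(u):
    """Heaviside step on the membrane potential u (threshold at 0)."""
    return 1.0 if u > 0.0 else 0.0

def spike_surrogate_grad(u, beta=10.0):
    """SuperSpike surrogate gradient: 1 / (beta * |u| + 1)**2."""
    return 1.0 / (beta * abs(u) + 1.0) ** 2

# The step is all-or-nothing, but the surrogate gradient peaks at the
# threshold and decays smoothly on both sides:
for u in (-0.5, 0.0, 0.5):
    print(u, spike_forward(u), spike_surrogate_grad(u))
```

In the SpyTorch notebooks this pair is wrapped in a torch.autograd.Function subclass, so the rest of the network is trained with standard PyTorch optimizers.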
Docker installation: https://youtu.be/yiy1f03kbsA?t=178
Nvidia runtime installation: https://youtu.be/yiy1f03kbsA?t=470
Running the image: https://youtu.be/yiy1f03kbsA?t=615
Getting helper functions to run the image: https://youtu.be/yiy1f03kbsA?t=807
Using the container: https://youtu.be/yiy1f03kbsA?t=1152
Running YARP and Gazebo: https://youtu.be/yiy1f03kbsA?t=1239
Starting development and code explanation: https://youtu.be/yiy1f03kbsA?t=1505
Running the code: https://youtu.be/yiy1f03kbsA?t=2220
Docker image: https://hub.docker.com/repository/docker/eventdrivenrobotics/telluride
Code: https://github.com/event-driven-robotics/telluride_tac21_contour
Loihi Resources
Introduction: https://ieeexplore.ieee.org/document/8259423
Architecture overview: https://www.youtube.com/watch?v=3HRyb7Bmp5U&list=PLJ506hQ4g3Th3sDNaHiqmK6Pr0YTH_aZo&index=50
NxSDK architecture and tutorials: https://www.youtube.com/watch?v=Bf4CskHBTOQ&list=PLJ506hQ4g3Th3sDNaHiqmK6Pr0YTH_aZo&index=52&t=0s
Other Loihi tutorials and presentations: https://intel-ncl.atlassian.net/wiki/spaces/INRC/pages/1080328193/Tutorials+and+Related+Presentations
SLAYER-auto tutorial: https://intel-ncl.atlassian.net/wiki/spaces/INRC/pages/1087734558/SLAYER+office+hours
SLAYER code: https://github.com/bamsumit/slayerPytorch
SpyTorch code: https://github.com/fzenke/spytorch
Note: only participants who have signed the agreement with Intel will get access to the INRC cloud material, from the moment they sign the agreement until the end of Telluride.
Available resources
General-purpose spike-based neuromorphic processors: Loihi, SpiNNaker
Neuromorphic sensors: tactile sensors from IIT, Johns Hopkins, and NUS; ATIS cameras and RGB cameras for ground truth.
iCub, accessible remotely with support, equipped with ATIS and tactile sensors (plus encoders, IMU, and force-torque sensors), with an online connection to Loihi (Kapoho Bay) and/or SpiNNaker (48-node board). The robot can output both synchronous absolute readouts from the vision and tactile sensors (for ground truth) and event-driven readouts.
Software: NxSDK (Intel Loihi API), YARP (robot middleware), the event-driven library, and the iCub Gazebo simulator equipped with tactile sensors and event generators for vision and touch, all in a Docker container.
iCub skin patch on a desk, with a 6-DoF robot with force sensors to apply controlled stimuli.
Example subthreshold mixed-mode circuits and simulations.
Pre-recorded tactile datasets, e.g. MNIST-like data, a tactile textures dataset, and a vision+tactile dataset.