Tutorial on Sustainable Deep Learning for Time-series: Reservoir Computing

Tutorial of the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML-PKDD 2023

🗓️ September 22, 2023 - 🕘 9:00 am - 🏛️ Room R3

This event is held jointly with the 2nd International Workshop on Machine Learning for Irregular Time Series.

This tutorial is aimed at researchers and practitioners interested in setting up quickly trained recurrent neural networks for structured data. It introduces a computing paradigm different from the most well-known ones, one that offers state-of-the-art performance in several applications. This paradigm deserves consideration in its own right, and in combination with other approaches, in order to seriously tackle today's environmental challenges. It is also recognized in neuroscience as an important principle for interpreting and modeling brain computations.
Basic concepts of Machine Learning and Deep Neural Networks are suggested prerequisites.

Abstract


Reservoir Computing (RC) is a promising approach for time-series forecasting that offers efficient and sustainable deep learning alternatives to traditional neural networks. This tutorial provides an introduction to sustainable deep learning with RC, including its theoretical foundations, practical implementation, and advantages over other approaches.

The tutorial starts with an overview of the RC framework, including the Echo State Network (ESN) and Liquid State Machine (LSM) models. The fundamental principles of these models are explored, including their distinctive network structure, their learning mechanisms, and the essentials of the mathematical theory behind RC. The tutorial then proceeds to explore the strong links with computational neuroscience, and the exciting recent advances in neuromorphic hardware implementations of reservoirs. Finally, the tutorial covers practical implementation aspects, existing software libraries, as well as successful application examples in several domains including automotive, avionics, health, and robotics.

Outline

This tutorial has a duration of 90 minutes. It is structured into the following chapters.

Chapter 1: Introduction

Sustainable AI and motivations for RC. Basic RC concepts and formulation. Basic models: Echo State Networks, Deep Echo State Networks, Simple Cycle Reservoirs, Euler State Networks, Graph Echo State Networks, etc.
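To make the basic ESN formulation concrete, here is a minimal NumPy sketch of the leaky-integrator reservoir state update. All numbers (reservoir size, leak rate, spectral radius, input scaling) are arbitrary illustrative choices, not values prescribed by the tutorial; the key point is that the recurrent and input weights are random and left untrained.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical dimensions and hyperparameters, for illustration only.
n_inputs, n_units = 1, 100
leak_rate = 0.3  # "a" in the leaky-integrator formulation

# Random, untrained weights: the defining trait of Reservoir Computing.
W_in = rng.uniform(-1.0, 1.0, size=(n_units, n_inputs))
W = rng.uniform(-1.0, 1.0, size=(n_units, n_units))
# Rescale the recurrent matrix to a chosen spectral radius (here 0.9).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def update(x, u):
    """Leaky-integrator ESN state update:
    x(t+1) = (1 - a) * x(t) + a * tanh(W_in u(t+1) + W x(t))."""
    return (1 - leak_rate) * x + leak_rate * np.tanh(W_in @ u + W @ x)

# Drive the reservoir with a short input sequence.
x = np.zeros(n_units)
for t in range(10):
    u = np.array([np.sin(0.1 * t)])
    x = update(x, u)
```

Only a linear readout on top of the states x is trained, which is what makes these models so cheap compared with backpropagation through time.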

Chapter 2: Mathematical foundations

Recurrent Neural Networks as input-driven dynamical systems; stability of recurrent neural networks; memory-nonlinearity trade-off; edge-of-chaos dynamics.
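The stability notion at the heart of ESNs (the echo state property) can be illustrated empirically: when the recurrent weights are scaled to a spectral radius below 1 (a common heuristic, though not a sufficient condition in general), two different initial states driven by the same input sequence converge to the same trajectory, i.e., the reservoir "forgets" its initial condition. A small sketch, with arbitrary illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

W = rng.normal(size=(n, n)) / np.sqrt(n)
W_in = rng.uniform(-0.5, 0.5, size=(n, 1))
# Contractive regime: rescale to spectral radius 0.9 (< 1).
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def step(x, u):
    return np.tanh(W_in @ u + W @ x)

# Two different random initial states, driven by the SAME input.
x_a = rng.uniform(-1, 1, size=n)
x_b = rng.uniform(-1, 1, size=n)
for t in range(500):
    u = np.array([np.sin(0.2 * t)])
    x_a, x_b = step(x_a, u), step(x_b, u)

# Distance between the two trajectories after the transient.
gap = np.linalg.norm(x_a - x_b)
```

With a spectral radius above 1 the same experiment typically shows the gap persisting or growing, which connects to the edge-of-chaos discussion in this chapter.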

Chapter 3: Computational Neuroscience of Reservoir Computing

Reservoir computing in spiking neural networks; advantages and disadvantages of RC as a model of cortical computation; influence of biological network features on RC performance.

Chapter 4: Neuromorphic Hardware implementations

Principles of neural networks in unconventional and neuromorphic hardware; Physical substrates for neuromorphic reservoirs: photonics, memristors, spintronics, and mechanical systems.

Chapter 5: Software libraries and applications

Python frameworks for Reservoir Computing; hyperparameter exploration; building complex architectures; example applications (e.g., sound classification with implicit segmentation) and tricks of the trade. Short hands-on lab demonstration (ReservoirPy): Implementing Echo State Networks in Python for time-series classification.
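As a taste of the hands-on part, the full ESN pipeline (drive the reservoir, discard a washout transient, fit a ridge-regression readout, predict) can be sketched in plain NumPy. This is not the ReservoirPy API, just an illustration of the same workflow; the task (one-step-ahead prediction of a sine wave) and all hyperparameters are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(1)
n, leak, sr = 100, 0.5, 0.9  # illustrative hyperparameters

W_in = rng.uniform(-1, 1, size=(n, 1))
W = rng.normal(size=(n, n))
W *= sr / np.max(np.abs(np.linalg.eigvals(W)))  # set spectral radius

def run(inputs):
    """Collect reservoir states for a 1-D input sequence."""
    x = np.zeros(n)
    states = []
    for u in inputs:
        x = (1 - leak) * x + leak * np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x)
    return np.array(states)

# Task: one-step-ahead prediction of a sine wave.
series = np.sin(0.25 * np.arange(600))
X_train, y_train = series[:399], series[1:400]
X_test, y_test = series[400:599], series[401:600]

S = run(X_train)
washout = 50                       # discard the initial transient
S_w, y_w = S[washout:], y_train[washout:]

# Ridge-regression readout: the ONLY trained part of the model.
ridge = 1e-8
W_out = np.linalg.solve(S_w.T @ S_w + ridge * np.eye(n), S_w.T @ y_w)

pred = run(X_test) @ W_out
mse = np.mean((pred[washout:] - y_test[washout:]) ** 2)
```

Libraries such as ReservoirPy package exactly this kind of pipeline (reservoir node, ridge readout, hyperparameter search) behind a higher-level interface, which the demo in this chapter walks through.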

Chapter 6: Quo vadis? 

Conclusions and outlook. 

Speakers

Claudio Gallicchio

University of Pisa, Italy

claudio.gallicchio@unipi.it

Claudio Gallicchio is an Assistant Professor of Machine Learning at the Department of Computer Science of the University of Pisa, Italy. He received his PhD from the University of Pisa, where he focused on Reservoir Computing models and theory for structured data. 

His research is based on the fusion of concepts from Deep Learning, Recurrent Neural Networks, and Randomized Neural Systems. He is the founder and former chair of the IEEE CIS Task Force on Reservoir Computing.

Andrea Ceni

University of Pisa, Italy

andrea.ceni@di.unipi.it 

Andrea Ceni received the M.Sc. degree in Mathematics cum laude from Università degli Studi di Firenze, Italy, in 2017, and the Ph.D. degree from the Department of Computer Science of the University of Exeter, UK, in 2021. He has been a Postdoctoral Research Associate at CEMPS, University of Exeter. Currently, he is a Research Fellow at the Department of Computer Science of the University of Pisa, Italy. 

His research interests include recurrent neural networks, deep learning, computational neuroscience, chaos theory, and complex systems.

Abigail Morrison

Institute of Neuroscience and Medicine (INM-6), Research Centre Jülich, and Computer Science 3 - Software Engineering, RWTH Aachen, Germany

morrison@fz-juelich.de 

Abigail Morrison has a background in physics and AI, which led her ineluctably to the field of computational neuroscience.  

Her primary research focus is uncovering the brain's computational principles, with particular emphasis on representation and learning, through the development of spiking neural network models.

Gianluca Milano

INRiM, Torino, Italy

g.milano@inrim.it 

Gianluca Milano is currently a permanent researcher at the Italian National Institute of Metrological Research (INRiM). He received a Ph.D. in Physics cum laude from Politecnico di Torino, Italy, in collaboration with the Italian Institute of Technology (IIT). 

His main research interests and activities focus on: i) the investigation of electronic and ionic transport properties and physicochemical phenomena in nanodevices and low dimensional systems; and ii) memristive devices and architectures for memory and neuromorphic applications, from material synthesis to device characterization, modeling, and implementation of unconventional and brain-inspired computing paradigms in neuromorphic architectures.

Xavier Hinaut

INRIA, Bordeaux, France

xavier.hinaut@inria.fr 

Xavier Hinaut is a Research Scientist in Computational Neuroscience at Inria, Bordeaux, France.

His work sits at the frontier of machine learning, neuroscience, linguistics, and robotics, from the modelling of human sentence processing to the analysis of birdsong. He leads the development of the ReservoirPy library. His current focus is on creating complex reservoir architectures (Deep Reservoirs, Reservoir Transformers, ...) to imitate brain processing.