Reservoir Computing (RC) is a promising approach to time-series forecasting that offers an efficient and sustainable alternative to traditional deep neural networks. This tutorial provides an introduction to sustainable deep learning with RC, covering its theoretical foundations, practical implementation, and advantages over other approaches. The tutorial starts with an overview of the RC framework, including the Echo State Network (ESN) and Liquid State Machine (LSM) models. The fundamental principles of these models are explored, including their distinctive network structure, their learning mechanisms, and the mathematical foundations of RC. The tutorial then explores the strong links with computational neuroscience and recent advances in neuromorphic hardware implementations of reservoirs. Finally, the tutorial covers practical implementation aspects and existing software libraries, as well as successful application examples in several domains, including automotive, avionics, health, and robotics.
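To make the ESN learning mechanism concrete, the sketch below shows the core idea in plain Python/numpy: a fixed, randomly initialized recurrent reservoir drives a leaky-integrator state update, and only a linear ridge-regression readout is trained. This is a minimal illustrative example, not the tutorial's reference implementation; the hyperparameter values, the sine-wave toy task, and the spectral-radius rescaling heuristic are placeholder assumptions.

```python
# Minimal Echo State Network sketch (illustrative only; hyperparameters are
# placeholder assumptions, not values prescribed by the tutorial).
import numpy as np

rng = np.random.default_rng(42)

n_inputs, n_reservoir = 1, 300
spectral_radius, leak_rate, ridge = 0.9, 0.3, 1e-6

# Fixed random weights: in RC, only the linear readout is trained.
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
# Rescale the recurrent matrix (common heuristic linked to the echo state property).
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Collect reservoir states for an input sequence of shape (T, n_inputs)."""
    x = np.zeros(n_reservoir)
    states = []
    for u in u_seq:
        # Leaky-integrator ESN state update.
        x = (1 - leak_rate) * x + leak_rate * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t).reshape(-1, 1)
U, Y = series[:-1], series[1:]

X = run_reservoir(U)
washout = 100                      # discard the initial transient
X, Y = X[washout:], Y[washout:]

# Ridge-regression readout: W_out = Y^T X (X^T X + ridge * I)^{-1}
W_out = Y.T @ X @ np.linalg.inv(X.T @ X + ridge * np.eye(n_reservoir))
pred = X @ W_out.T
print("train MSE:", np.mean((pred - Y) ** 2))
```

Because the reservoir weights stay fixed, training reduces to a single linear solve over the collected states, which is the source of the efficiency and energy savings discussed below.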
In addition to providing practical guidance, this tutorial also emphasizes the importance of sustainability in deep learning. The environmental impact of training deep neural networks is substantial, and RC offers a more energy-efficient approach that is particularly well-suited for time-series forecasting. By using RC, researchers can reduce the carbon footprint of their research while achieving state-of-the-art results.
Overall, this tutorial provides a comprehensive introduction to sustainable deep learning with RC for time-series forecasting, and it is suitable for both beginners and advanced researchers looking to explore this exciting area of machine learning.
Slides: [here]
Claudio Gallicchio is an Assistant Professor of Machine Learning at the Department of Computer Science of the University of Pisa, Italy. He received his PhD from the University of Pisa, where he focused on Reservoir Computing models and theory for structured data.
His research is based on the fusion of concepts from Deep Learning, Recurrent Neural Networks, and Randomized Neural Systems. He is the founder and former chair of the IEEE CIS Task Force on Reservoir Computing.
Andrea Ceni received the M.Sc. degree in Mathematics cum laude from Università degli Studi di Firenze, Italy, in 2017, and the Ph.D. degree from the Department of Computer Science of the University of Exeter, UK, in 2021. He has been a Postdoctoral Research Associate at CEMPS, University of Exeter. Currently, he is a Research Fellow at the Department of Computer Science of the University of Pisa, Italy.
His research interests include recurrent neural networks, deep learning, computational neuroscience, chaos theory, and complex systems.
Peter Ford Dominey is a CNRS Research Director at INSERM U1093 in Dijon, France. He obtained the M.Sc. and Ph.D. degrees in computer science from the University of Southern California in 1989 and 1993, respectively, developing neural network models of sensorimotor sequence learning, including the first instances of reservoir computing. In 1997 he joined the CNRS at the Institut des Sciences Cognitives in Lyon, France, and initiated the Robot Cognition Laboratory.
His research interests include understanding and simulating the neurophysiology of cognitive sequence processing, action, and language, as well as their application to robot cognition, language processing, and human-robot cooperation.
Gianluca Milano is currently a permanent researcher at the Italian National Institute of Metrological Research (INRiM). He received a Ph.D. in Physics cum laude from Politecnico di Torino, Italy, in collaboration with the Italian Institute of Technology (IIT).
His main research interests and activities focus on: i) the investigation of electronic and ionic transport properties and physicochemical phenomena in nanodevices and low dimensional systems; and ii) memristive devices and architectures for memory and neuromorphic applications, from material synthesis to device characterization, modeling, and implementation of unconventional and brain-inspired computing paradigms in neuromorphic architectures.
Xavier Hinaut is a Research Scientist in Computational Neuroscience at Inria, Bordeaux, France.
His work is at the frontier of machine learning, neuroscience, linguistics, and robotics: from the modelling of human sentence processing to the analysis of birdsongs. He leads the development of the ReservoirPy library. His current focus is on creating complex reservoir architectures (Deep Reservoirs, Reservoir Transformers, ...) to imitate brain processing.
Paul Bernard is a research engineer in the Mnemosyne team at Inria, Bordeaux, France. He is currently the main developer of the ReservoirPy library. He obtained an engineering Master's degree from ENSC in 2023.