The tutorial addresses the timely and critical topic of sustainable deep learning, focusing on energy-efficient alternatives to traditional neural networks for time-series processing. The emphasis on Reservoir Computing (RC) offers a unique perspective that aligns with the growing importance of Green AI and environmentally conscious machine learning practices.
This tutorial is aimed at both researchers and practitioners interested in setting up efficiently trained recurrent neural networks for structured data. It introduces a computing paradigm that differs from the most widely known ones and offers state-of-the-art performance in several applications. This paradigm deserves consideration in its own right, and can be combined with other approaches to seriously tackle environmental challenges. It is also recognized in neuroscience as an important principle for interpreting and modeling brain computations.
Basic concepts of Machine Learning and Deep Neural Networks are suggested prerequisites.
Reservoir Computing (RC) is a promising approach for time-series forecasting that offers an efficient and sustainable deep learning alternative to traditional neural networks. This tutorial provides an introduction to sustainable deep learning with RC, including its theoretical foundations, practical implementation, and advantages over other approaches. The tutorial starts with an overview of the RC framework, including the Echo State Network (ESN) and Liquid State Machine (LSM) models. The fundamental principles of these models are explored, including their distinctive network structure, their learning mechanisms, and the mathematical foundations of RC. The tutorial then proceeds to explore the strong links with computational neuroscience, and the exciting recent advances in neuromorphic hardware implementations of reservoirs. Finally, the tutorial covers practical implementation aspects and existing software libraries, as well as successful application examples in several domains, including automotive, avionics, health, and robotics.
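As a brief preview of that mathematical background, the core RC idea can be summarized by the standard leaky-integrator ESN equations, in which the input and recurrent weights are fixed at random and only the linear readout is trained (a minimal sketch in common notation; the symbols below are illustrative and not taken from the tutorial material):

```latex
% Leaky-integrator ESN state update: W_in and W are random and left untrained,
% alpha is the leak rate, u(t) the input, x(t) the reservoir state.
x(t) = (1 - \alpha)\, x(t-1) + \alpha \tanh\big( W_{\mathrm{in}}\, u(t) + W\, x(t-1) \big)

% Linear readout: only W_out is learned, typically in closed form by ridge regression.
y(t) = W_{\mathrm{out}}\, x(t)
```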
In addition to providing practical guidance, this tutorial also emphasizes the importance of sustainability in deep learning. The environmental impact of training deep neural networks is substantial, and RC offers a more energy-efficient approach that is particularly well-suited for time-series forecasting. By using RC, researchers can reduce the carbon footprint of their research while achieving state-of-the-art results.
Overall, this tutorial provides a comprehensive introduction to sustainable deep learning with RC for time-series forecasting, and it is suitable for both beginners and advanced researchers looking to explore this exciting area of machine learning.
Speaker: Claudio Gallicchio
Preliminaries on Deep Learning for sequential data and Recurrent Neural Networks (RNNs); Green AI and the problem of designing sustainable Deep Learning algorithms.
Slides: [here]
Speaker: Claudio Gallicchio
Randomization in Deep Neural Networks; Echo State Networks, Liquid State Machines, and deep architectures; quality of dynamics and network topologies; training algorithms; Reservoir Computing for graphs. A minimal sketch of the ESN training pipeline is given below.
Slides: [here]
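To complement the training-algorithms topic of this session, here is a minimal NumPy sketch of the ESN training pipeline: reservoir weights are generated at random and left untrained, states are collected over the input sequence, and only the linear readout is fitted in closed form via ridge regression. Dimensions, hyperparameter values, and the toy task are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative dimensions and hyperparameters (not prescriptive).
n_inputs, n_reservoir = 1, 100
leak_rate, spectral_radius, ridge = 0.3, 0.9, 1e-6

# Fixed random input and recurrent weights: never trained in RC.
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # rescale to the target spectral radius

def run_reservoir(U):
    """Collect reservoir states for an input sequence U of shape (T, n_inputs)."""
    x = np.zeros(n_reservoir)
    states = []
    for u in U:
        x = (1 - leak_rate) * x + leak_rate * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
series = np.sin(t).reshape(-1, 1)
U, Y = series[:-1], series[1:]

X = run_reservoir(U)

# Train only the linear readout, in closed form (ridge regression).
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ Y).T

Y_pred = X @ W_out.T
print("train MSE:", float(np.mean((Y_pred - Y) ** 2)))
```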
Speaker: Xavier Hinaut
Python frameworks for Reservoir Computing; hyperparameter exploration; building complex architectures; example applications (e.g., sound classification with implicit segmentation) and tricks of the trade. Short hands-on lab demonstration (ReservoirPy): Implementing Echo State Networks in Python for time-series prediction and classification. A minimal ReservoirPy sketch is given below.
Slides: [here]
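To give a flavor of the hands-on part, the sketch below follows the typical ReservoirPy usage pattern: a reservoir node chained to a ridge readout for one-step-ahead prediction. Hyperparameter values are illustrative, and the exact API may vary slightly across ReservoirPy versions.

```python
import numpy as np
from reservoirpy.nodes import Reservoir, Ridge
from reservoirpy.datasets import mackey_glass

# Chaotic benchmark series, framed as one-step-ahead prediction.
X = mackey_glass(n_timesteps=2000)
U, Y = X[:-1], X[1:]

# Reservoir with illustrative hyperparameters: leak rate (lr) and spectral radius (sr).
reservoir = Reservoir(units=100, lr=0.3, sr=1.25)
readout = Ridge(ridge=1e-7)  # linear readout trained by ridge regression

esn = reservoir >> readout                      # chain the nodes into an ESN model
esn = esn.fit(U[:1500], Y[:1500], warmup=100)   # only the readout is trained

Y_pred = esn.run(U[1500:])
print("test MSE:", float(np.mean((Y_pred - Y[1500:]) ** 2)))
```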
Speaker: Andrea Ceni
Recurrent Neural Networks as input-driven dynamical systems; stability of recurrent neural networks; memory-nonlinearity trade-off; edge-of-chaos dynamics. Classical stability conditions are sketched below.
Slides: [here]
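As a pointer to the stability topic of this session, two classical conditions on the reservoir recurrent matrix W from the ESN literature are worth keeping in mind (stated here as a rough sketch for tanh reservoirs, not as the session's exact formulation); reservoirs are often initialized close to these bounds, in the so-called edge-of-chaos regime.

```latex
% Sufficient condition for the echo state property (largest singular value of W):
\sigma_{\max}(W) < 1

% Commonly cited necessary condition (spectral radius of W), for input sets containing zero:
\rho(W) < 1
```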
Speaker: Gianluca Milano
Principles of neural networks in unconventional and neuromorphic hardware; physical substrates for neuromorphic reservoirs: photonics, memristors, spintronics, and mechanical systems.
Slides: [here]
Speaker: Claudio Gallicchio
Conclusions and outlook.
Slides: [here]