Reservoir Computing (RC) denotes a popular approach to designing and training Recurrent Neural Networks (RNNs) that is distinctively based on leaving the hidden dynamical layer of the architecture - i.e., the reservoir - fixed after an initialization informed by a stability analysis of the resulting dynamics. Since the untrained hidden dynamics of the neural network already provide a rich temporal encoding of the driving input signal, only the (typically linear) dense output layer needs to be trained, leading to a clear advantage in computational efficiency over fully trained RNN architectures. Major instances of the approach are Echo State Networks and Liquid State Machines.
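To illustrate this training scheme, the following is a minimal Echo State Network sketch in Python/NumPy: a fixed random reservoir, rescaled to a spectral radius below one as a common stability heuristic, feeds a ridge-regression readout that is the only trained component. All dimensions, the toy sine-prediction task, and hyper-parameter values are illustrative assumptions, not part of the session description.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative dimensions (not prescribed by the session description).
n_in, n_res = 1, 100

# Fixed random input and recurrent weights; the reservoir is never trained.
W_in = rng.uniform(-0.1, 0.1, size=(n_res, n_in))
W = rng.uniform(-1.0, 1.0, size=(n_res, n_res))

# Rescale the recurrent matrix so its spectral radius is below 1,
# a common heuristic linked to the echo state property.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the fixed reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x)
    return np.array(states)

# Toy task (assumed for illustration): one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t).reshape(-1, 1)
X = run_reservoir(u[:-1])   # reservoir states as temporal features
y = u[1:]                   # next-step targets

# Only the linear readout is trained, in closed form via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

print("train MSE:", np.mean((X @ W_out - y) ** 2))
```

Note how the entire training step reduces to one linear solve over the collected reservoir states, which is the source of the efficiency advantage mentioned above.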
Overall, the field of RC has attracted considerable research attention, for several reasons. Besides the striking efficiency of its training algorithms, RC neural networks are distinctively amenable to hardware implementations (including unconventional neuromorphic substrates, such as photonics), enable clean mathematical analysis (rooted, e.g., in random matrix theory), and find natural applications in resource-constrained contexts, such as edge AI systems. Moreover, in the broader picture of Deep Learning development, RC is a breeding ground for testing innovative ideas, e.g., biologically plausible training algorithms beyond gradient back-propagation. Notably, although established in the Machine Learning field, RC lends itself naturally to interdisciplinary research, where ideas from areas as diverse as computational neuroscience, complex systems, and non-linear physics can lead to further developments and new applications.
This session aims to foster RC research within the international neural networks community, bringing in ideas from connected interdisciplinary fields. We therefore invite researchers to submit papers on all aspects of RC research, targeting contributions on new algorithms, implementations, and applications.
Relevant topics for this session include, but are not limited to, the following:
* New Reservoir Computing models and architectures, including Echo State Networks and Liquid State Machines
* Hardware, physical and neuromorphic implementations, including spintronics, photonics, and quantum Reservoir Computing
* Reservoir Computing in Computational Neuroscience
* Reservoir Computing on edge systems
* Novel learning algorithms rooted in Reservoir Computing concepts
* Novel applications of Reservoir Computing, e.g., to images, video and structured data
* Federated and Continual Learning in Reservoir Computing
* Deep Reservoir Computing neural networks
* Theory of complex and dynamical systems in Reservoir Computing
* Extensions of the Reservoir Computing framework, such as Conceptors
* Reservoir dimensionality reduction
* Efficient reservoir hyper-parameter search