Reservoir Computing (RC) denotes a class of recurrent neural models whose internal dynamics are left untrained after initialization: only a readout layer is adapted. The approach is appealing for several reasons, among which are fast training, amenability to neuromorphic hardware implementations, and a natural suitability for edge computing.
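As a concrete illustration of the idea, the following is a minimal sketch of an Echo State Network (one of the RC models mentioned below), assuming only NumPy. The reservoir weights `W_in` and `W` are random and fixed; only the linear readout `W_out` is trained, here via ridge regression on a toy one-step-ahead prediction task. The sizes, scaling constants, and regularization value are illustrative choices, not prescribed by any particular RC paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed random reservoir: these weights are never trained.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Rescale so the spectral radius is below 1 (a common heuristic
# for stable reservoir dynamics).
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 20 * np.pi, 2000)
u = np.sin(t)[:, None]
X = run_reservoir(u[:-1])   # reservoir states
y = u[1:, 0]                # next-step targets

# Training touches only the readout: closed-form ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = X @ W_out
print("train MSE:", np.mean((pred - y) ** 2))
```

Because training reduces to a single linear solve, it is orders of magnitude cheaper than backpropagation through time, which is the "fast training" advantage referred to above.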
Building on the success of the past editions, the 3rd International Workshop on Reservoir Computing (RC 2025) aims to once more bring together researchers to discuss the state of the art and open challenges in the field of RC, in all its forms. These include, among others, new models of Echo State Networks and Liquid State Machines, non-conventional hardware (e.g., photonic) implementations of RC systems, applications to large-scale AI problems with human-level performance, emerging paradigms (e.g., conceptors), RC for structured data, deep RC, hybrid models combining RC and fully trained RNNs, and many more. The workshop provides an open forum for researchers to meet and present recent contributions and ideas in a lively and highly interdisciplinary environment. Industrial contributions are welcome.
Potential topics of interest for the workshop include, but are not limited to, the following:
Echo State Networks, Liquid State Machines
Hybrids of fully-trained and RC models
Deep Reservoir Computing
Reservoir Computing for structured data (trees, graphs, networks, …)
Ensemble learning and Reservoir Computing
Trustworthy AI concepts for Reservoir Computing
Reservoir dimensionality reduction, efficient reservoir hyper-parameter search and learning
Reservoir Computing in Neuroscience
Theoretical analysis of Reservoir Computing
Statistical Learning Theory of Reservoir Computing networks
Reservoir Computing for AI applications (e.g., vision, natural language processing, health, bioinformatics, etc.)