Call for Papers
Reservoir Computing (RC) denotes a class of recurrent neural models whose recurrent dynamics are left untrained after initialization, with learning confined to a readout layer. The approach is appealing for several reasons: fast training, a natural suitability for edge computing, and strong theoretical foundations with implications for the fundamental properties of recurrent neural networks in general.
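The core idea can be illustrated with a minimal Echo State Network sketch in NumPy; all dimensions, scalings, and the toy task below are hypothetical choices for illustration, not a prescribed setup:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 1 input, 100 reservoir units, 1 output
n_in, n_res = 1, 100

# Fixed random input and reservoir weights -- never adapted after initialization
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # rescale spectral radius below 1

def run_reservoir(u):
    """Drive the reservoir with input sequence u (T x n_in); return the states."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave
T = 500
u = np.sin(0.1 * np.arange(T)).reshape(-1, 1)
y = u[1:]                   # targets: next input value
X = run_reservoir(u[:-1])   # reservoir states, shape (T-1, n_res)

# Only the linear readout is trained, here via ridge regression
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
mse = np.mean((pred - y) ** 2)
```

Because training reduces to a single linear regression on the collected states, the whole procedure runs in a fraction of the time required by gradient-based training of the recurrent weights.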
After the success of the first edition, the 2nd International Workshop on Reservoir Computing (RC 2024) aims to bring researchers together again to update the discussion on the state of the art and the cutting-edge challenges in the field of RC, in all its variants. These include, among others, new models of Echo State Networks and Liquid State Machines, applications to AI problems including from a human-centric perspective, emerging paradigms, RC for structured data, deep RC, hybrid RC/fully trained RNN models, and many more. The workshop provides an open forum for researchers to meet and present recent contributions and ideas in a lively and highly interdisciplinary environment. Industrial contributions are welcome.
Potential topics of interest for the workshop include (without being limited to) the following:
Echo State Networks, Liquid State Machines
Hybrids of fully-trained and RC models
Deep Reservoir Computing
Reservoir Computing for structured data (trees, graphs, networks, …)
Ensemble learning and Reservoir Computing
Trustworthy AI concepts for Reservoir Computing
Reservoir dimensionality reduction, efficient reservoir hyper-parameter search and learning
Reservoir Computing in Neuroscience
Theoretical analysis of Reservoir Computing
Statistical Learning Theory of Reservoir Computing networks
Reservoir Computing for AI applications (e.g., vision, natural language processing, health, bioinformatics)