Reservoir Computing:
theory, models, and applications
Special Session of the International Joint Conference on Neural Networks (IJCNN 2023)
18 - 23 June 2023, Gold Coast (Queensland, Australia)
Key Dates
Paper submission deadline: extended to February 7, 2023 (11:59 pm AoE)
Decision notification: March 31, 2023
Description and Topics
Reservoir Computing (RC) is a popular approach for efficiently training Recurrent Neural Networks (RNNs), based on (i) constraining the recurrent hidden layers to develop stable dynamics, and (ii) restricting the training algorithms to operate solely on an output (readout) layer.
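The two ingredients above can be illustrated with a minimal Echo State Network sketch. This is not a reference implementation from the session organizers, only a hedged example: all dimensions, the toy one-step-delay task, and the variable names are hypothetical, and the spectral-radius rescaling is one common (sufficient, not necessary) recipe for obtaining stable reservoir dynamics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (hypothetical, chosen only for this sketch)
n_in, n_res, n_out, T = 1, 100, 1, 500

# (i) Fixed, randomly initialized reservoir, rescaled so that its
# spectral radius is below 1 -- a common recipe for stable dynamics.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

# Drive the reservoir with a toy input sequence and collect the states.
u = rng.uniform(-1, 1, (T, n_in))      # toy input sequence
y = np.roll(u, 1, axis=0)              # toy target: one-step delay of u
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W_in @ u[t] + W @ x)   # recurrent layer is never trained
    states[t] = x

# (ii) Train only the linear readout, here via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(states.T @ states + ridge * np.eye(n_res),
                        states.T @ y).T

pred = states @ W_out.T                # readout predictions, shape (T, n_out)
```

Because only the closed-form readout regression involves training, the whole procedure runs in a fraction of the cost of back-propagation through time, which is the efficiency the session description refers to.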
Over the years, the field of RC has attracted considerable research attention, for several reasons. Besides the striking efficiency of its training algorithms, RC neural networks are distinctively amenable to hardware implementations (including unconventional neuromorphic substrates, such as those studied in photonics and materials science), enable clean mathematical analysis (rooted, e.g., in random matrix theory), and find natural engineering applications in resource-constrained contexts, such as edge AI systems. Moreover, in the broader picture of Deep Learning development, RC is a breeding ground for testing innovative ideas, e.g., biologically plausible training algorithms beyond gradient back-propagation. Notably, although established in the Machine Learning field, RC lends itself naturally to interdisciplinary work, where ideas and inspirations from diverse areas such as computational neuroscience, complex systems, and non-linear physics can lead to further developments and new applications.
This special session is intended to be a hub for discussion and collaboration within the Neural Networks community. We therefore invite researchers to submit papers on all aspects of RC research, with contributions spanning theory, models, and emerging applications.
A list of topics relevant to this session includes, but is not limited to, the following:
New Reservoir Computing models and architectures, including Echo State Networks and Liquid State Machines
Hardware, physical and neuromorphic implementations of Reservoir Computing systems
Learning algorithms in Reservoir Computing
Reservoir Computing in Computational Neuroscience
Reservoir Computing in edge computing systems
Novel learning algorithms rooted in Reservoir Computing concepts
Novel applications of Reservoir Computing, e.g., to images, video and structured data
Federated and Continual Learning in Reservoir Computing
Deep Reservoir Computing neural networks
Theory of complex and dynamical systems in Reservoir Computing
Extensions of the Reservoir Computing framework, such as Conceptors
Submission
Paper submission for this Special Session follows the same process as for the regular sessions of IJCNN 2023, which use EDAS as the submission system.
The review process for IJCNN 2023 will be double-blind. Prospective authors must therefore anonymize their manuscripts.
Each paper should be 6 to a maximum of 8 pages long, including figures, tables, and references. Please refer to the Submission Guidelines at https://2023.ijcnn.org/authors/paper-submission for full information.
Submit your paper at https://edas.info/N30081 and choose the track "Reservoir Computing: theory, models, and applications", or use the direct link: https://edas.info/newPaper.php?c=30081&track=116064.
Organizers
Andrea Ceni (University of Pisa, Italy), Claudio Gallicchio (University of Pisa, Italy), Gouhei Tanaka (University of Tokyo, Japan).