Reservoir Computing: Advances in Models, Applications, and Implementations

Special Session of the International Joint Conference on Neural Networks (IJCNN) 2021
18-22 July 2021, Shenzhen, China
https://www.ijcnn.org

- Paper submission deadline: February 10, 2021 (extended from January 15, 2021)
- Decision notification: April 10, 2021 (extended from March 15, 2021)
- Camera-ready paper due: April 25, 2021 (extended from March 30, 2021)
(Note: all deadlines are 11:59 pm US Pacific Time.)

Description and Topics

Reservoir Computing (RC) denotes a class of recurrent neural networks in which the dynamical hidden layer (the reservoir) is left untrained after an initialization that ensures asymptotic stability. Training is typically restricted to a simple (often linear) output layer, with a clear advantage in computational efficiency over fully trained recurrent architectures. Major instances of the approach are Echo State Networks and Liquid State Machines. The RC approach to designing recurrent neural networks has attracted considerable research attention in recent years, for several reasons. Besides the striking efficiency of its training algorithms, RC networks are distinctively amenable to hardware implementations (including unconventional neuromorphic substrates such as photonics) and enable clean mathematical analysis (rooted, e.g., in dynamical systems and random matrix theory). Notably, although established in the Machine Learning field, RC lends itself naturally to interdisciplinary research, where ideas from areas as diverse as computational neuroscience, complex systems, and non-linear physics can lead to further developments and new applications.
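To make the idea concrete, the following is a minimal sketch of an Echo State Network, one of the RC instances mentioned above. All names, sizes, and hyper-parameter values (reservoir size, spectral radius 0.9, washout length, ridge coefficient) are illustrative assumptions, not prescriptions from this call: the reservoir weights are drawn once and rescaled so that their spectral radius is below one (a common sufficient condition for stable, input-driven dynamics), and only the linear readout is trained, here by ridge regression on a toy sine-prediction task.

```python
import numpy as np

# Illustrative Echo State Network sketch (all sizes/values are assumptions):
# a fixed random reservoir plus a trained linear readout.
rng = np.random.default_rng(0)
n_in, n_res = 1, 100

# Input and reservoir weights are drawn once and never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
# Rescale so the spectral radius is 0.9, a common heuristic for stability.
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Collect reservoir states driven by an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W_in @ u_t + W @ x)  # untrained recurrent dynamics
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(1000)
u = np.sin(0.1 * t)[:, None]
y = np.sin(0.1 * (t + 1))

X = run_reservoir(u)[100:]   # discard an initial washout of 100 steps
y_target = y[100:]

# Ridge-regression readout: the only trained component of the network.
lam = 1e-6
W_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y_target)

pred = X @ W_out
mse = np.mean((pred - y_target) ** 2)
print(f"readout MSE: {mse:.2e}")
```

The efficiency advantage noted above shows up directly here: training reduces to a single linear solve over collected states, with no backpropagation through time.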

This session intends to give new impetus to RC research within the international neural networks community, also drawing on ideas from connected interdisciplinary fields. We therefore invite researchers to submit papers on all aspects of Reservoir Computing research, targeting contributions on new models, applications, and implementations.

A list of relevant topics for this session includes, without being limited to, the following:

  • New Reservoir Computing models and architectures, including Echo State Networks and Liquid State Machines

  • New applications of Reservoir Computing, e.g. to vision and structured data

  • Reservoir Computing in (unconventional) neuromorphic hardware

  • Deep Reservoir Computing neural networks

  • Theory of dynamical systems in neural networks, including stability of input-driven temporal embeddings

  • Statistical Learning Theory of Reservoir Computing networks

  • Ensemble learning and Reservoir Computing

  • Extensions of the Reservoir Computing concept, such as Conceptors

  • Reservoir dimensionality reduction, and efficient reservoir hyper-parameter search and learning

Submission

Paper submission for this Special Session follows the same process as for the regular sessions of IJCNN 2021.
Important note: when submitting your paper, choose "S14 Reservoir Computing: Advances in Models, Applications, and Implementations" as the (main) research topic.

For submissions, please visit the dedicated IJCNN 2021 website: https://www.ijcnn.org/paper-submission

Organizers

Claudio Gallicchio (University of Pisa, Italy), Azarakhsh Jalalvand (Ghent University-imec, Belgium), Kohei Nakajima (University of Tokyo, Japan)