Reservoir Computing:
Progress in Methods, Applications, and Implementations
Special Session of the IEEE World Congress on Computational Intelligence (WCCI 2024),
the International Joint Conference on Neural Networks (IJCNN 2024)
June 30 – July 5, 2024, Yokohama (Japan)
Key dates
- Paper submission deadline: January 29, 2024 (extended from January 15, 2024)
- Decision notification: March 15, 2024
- Final Paper Submission & Early Registration Deadline: May 1, 2024
- Conference dates: June 30 – July 5, 2024 (Yokohama, Japan)
Thank you for joining the special session
Sessions
- Session IJCNN S4_10: Special Session: Reservoir Computing: Progress in Methods, Applications, and Implementations [July 1st (08:20-09:40 JST) @ Room 315]
- #3126 (1570991881): Predictive Modeling in the Reservoir Kernel Motif Space
- #3335 (1570992157): Multi-Task Wavelength-Multiplexed Reservoir Computing Using a Silicon Microring Resonator
- #4165 (1570993222): Chaotic Time Series Prediction in Biological Neural Network Reservoirs on Microelectrode Arrays
- #5054 (1570994360): Comparing Connectivity-To-Reservoir Conversion Methods for Connectome-Based Reservoir Computing
- Session IJCNN S5_4: Special Sessions: Reservoir Computing and Representation Learning [July 2nd (08:20-10:00 JST) @ Room 411+412]
- #2242 (1570989144): Designing Network Topologies of Multiple Reservoir Echo State Networks: A Genetic Algorithm Based Approach
- #2554 (1570990742): Reservoir Computing Neural Networks for Estimating Mechanical Properties of Hot Steel Strips
- #2749 (1570991222): Decentralized Incremental Federated Learning with Echo State Networks
- Session Virtual: Machine Learning Algorithms 3 [July 2nd (21:00-22:40) in Zoom 21]
- #2446 (1570990302): A Connectivity Gradient in Structured Reservoir Computing Predicts a Hierarchy for Mixed Selectivity in Human Cortex
Description and topics
Reservoir Computing (RC) is a computational framework derived from efficient training methods for recurrent neural network models, which consists of a fixed hidden recurrent layer (called a reservoir) and a trainable output layer (called a readout).
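For readers new to the paradigm, the following is a minimal, illustrative Echo State Network sketch in Python/NumPy. It is not drawn from any of the session papers; the reservoir size, spectral radius, ridge parameter, and the toy sine-prediction task are arbitrary choices made for the example. It only shows the split described above: the recurrent reservoir is generated randomly and left untrained, and only the linear readout is fitted.

```python
# Minimal Echo State Network sketch (illustrative only; all hyperparameters
# and the toy task below are assumptions, not taken from the session text).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 200
washout = 50  # discard initial transient states before fitting the readout

# Fixed random weights: the reservoir itself is never trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep spectral radius < 1

def run_reservoir(u_seq):
    """Drive the reservoir with a 1-D input sequence and collect its states."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(1000)
u_seq = np.sin(0.1 * t)
y_target = np.sin(0.1 * (t + 1))

X = run_reservoir(u_seq)[washout:]  # reservoir states used as features
y = y_target[washout:]

# Only the linear readout is trained, here by ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

y_pred = X @ W_out
print("training MSE:", np.mean((y_pred - y) ** 2))
```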
The research field of RC has expanded broadly in recent years. First, many advanced RC models and learning methods have been developed to enhance computational performance in temporal pattern recognition while maintaining low computational cost; some RC models have been combined with other machine learning techniques, data-scientific approaches, and inspirations from neuroscience. Second, RC approaches have been applied to engineering applications with sensor data under resource- and power-constrained conditions, such as edge AI systems. Third, physical RC has been actively explored for realizing efficient AI hardware and devices in the context of neuromorphic and unconventional computing, e.g., based on photonics, material science, and spintronics. Fourth, mathematical theory and analysis for revealing RC properties have made much progress. In this way, RC is a rapidly growing research topic that is closely related to machine learning, dynamical systems theory, neuroscience, bio/natural computing, and AI hardware.
This special session is intended to be a hub for discussion and collaboration within the Neural Networks community, and therefore invites contributions on all aspects of RC, from theory to new models to emerging applications.
Main Topics:
A list of topics relevant to this session includes, but is not limited to, the following:
* New Reservoir Computing models and architectures, including Echo State Networks and Liquid State Machines
* Hardware, physical, and neuromorphic implementations of Reservoir Computing systems
* Learning algorithms in Reservoir Computing
* Reservoir Computing in Computational Neuroscience
* Reservoir Computing on edge systems
* Novel learning algorithms rooted in Reservoir Computing concepts
* New applications of Reservoir Computing, e.g., to images, video and structured data
* Federated and Continual Learning in Reservoir Computing
* Deep Reservoir Computing neural networks
* Theory of complex and dynamical systems in Reservoir Computing
* Extensions of the Reservoir Computing framework, such as Conceptors
Submission
Paper submission for this Special Session follows the same process as for the regular sessions of WCCI 2024 (https://2024.ieeewcci.org/), which uses the EDAS Conference and Journal Management System.
**NOTE**: Please first check the submission instructions (https://2024.ieeewcci.org/submission). The review process for WCCI 2024 will be double-blind. Anonymizing your paper is mandatory, and papers that explicitly or implicitly reveal the authors’ identities may be rejected. Each paper is limited to 8 pages, including figures, tables, and references.
1) In the Paper Submission site (https://edas.info/N31614), please choose the following track:
IJCNN 2024 | Special Session Papers
2) In the Topics section, please choose “Special Session: Reservoir Computing: Progress in Methods, Applications, and Implementations.”
Organizers
Andrea Ceni (University of Pisa, Italy), Claudio Gallicchio (University of Pisa, Italy), Ryosho Nakane (The University of Tokyo, Japan), Gouhei Tanaka (Nagoya Institute of Technology, Japan)* [*Contact]