Thursday, 30 April 2026, 15:00-16:00
Prof. Michelle Girvan
Register via: https://cam-ac-uk.zoom.us/meeting/register/zWrv83PcT267A-sWjk_CHA#/registration
Harnessing Network Complexity and Local Adaptation for Robust Computation in Reservoir Computers
Reservoir computers (RCs) leverage the inherent complexity of dynamical networks to process information with dramatically lower training costs than deep learning architectures. In an RC, a fixed, randomly connected network—the reservoir—projects input signals into a high-dimensional state space, where only the output weights are trained. This framework not only offers computational efficiency but also provides a means to explore how the intrinsic complexity of nonlinear dynamical networks—governed by relatively simple equations—can be utilized to uncover brain-inspired computational principles. We show that the statistical structure of the reservoir, particularly the balance between excitatory and inhibitory (E–I) connections, plays a critical role in determining computational performance. However, achieving optimal E–I balance can require careful tuning of hyperparameters. To address this, we introduce a local adaptation mechanism that dynamically adjusts E–I balance to achieve desirable neuronal firing rates, enhancing performance and reducing sensitivity to hyperparameter choice. Incorporating heterogeneity in target firing rates further increases robustness across both linear and nonlinear tasks. These results highlight how the interplay between network topology, dynamics, and adaptive mechanisms can be exploited to improve machine learning efficiency and deepen our understanding of computation in complex systems.
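
For readers unfamiliar with the reservoir computing framework described in the abstract, below is a minimal echo-state-network sketch in Python: a fixed, randomly connected reservoir whose columns carry excitatory or inhibitory signs, driven by an input sequence, with only a linear readout trained by ridge regression. All names, sizes, and hyperparameters (frac_excitatory, spectral_radius, leak_rate, ridge_reg, the toy sine task) are illustrative assumptions, not details of the speaker's model, and the local E–I adaptation mechanism discussed in the talk is not implemented here.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative sizes and hyperparameters (assumptions, not from the talk) ---
n_inputs, n_reservoir = 1, 300
frac_excitatory = 0.8        # assumed excitatory/inhibitory split
spectral_radius = 0.9
leak_rate = 0.3
ridge_reg = 1e-6

# Fixed random reservoir with signed columns: excitatory units send positive
# weights, inhibitory units send negative weights (presynaptic sign per column).
W = np.abs(rng.standard_normal((n_reservoir, n_reservoir)))
sign = np.where(rng.random(n_reservoir) < frac_excitatory, 1.0, -1.0)
W *= sign[np.newaxis, :]
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))   # rescale dynamics
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))

def run_reservoir(u_seq):
    """Drive the fixed reservoir with an input sequence and collect its states."""
    x = np.zeros(n_reservoir)
    states = []
    for u in u_seq:
        pre = W @ x + W_in @ np.atleast_1d(u)
        x = (1 - leak_rate) * x + leak_rate * np.tanh(pre)
        states.append(x.copy())
    return np.array(states)

# Toy task (placeholder data): predict the next value of a sine wave.
t = np.linspace(0, 60, 3000)
u = np.sin(t)
target = np.roll(u, -1)

X = run_reservoir(u[:-1])          # reservoir states, shape (T, n_reservoir)
Y = target[:-1, np.newaxis]        # one-step-ahead targets

# Only the output weights are trained (ridge regression); the reservoir is untouched.
W_out = np.linalg.solve(X.T @ X + ridge_reg * np.eye(n_reservoir), X.T @ Y)

pred = X @ W_out
print("train MSE:", np.mean((pred - Y) ** 2))
```

In this sketch the E–I balance is set once by frac_excitatory and left fixed; the adaptation mechanism in the talk instead adjusts that balance locally and dynamically to reach target neuronal firing rates, which is what reduces the sensitivity to hyperparameter choice described in the abstract.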