Efficient Learning in Spiking Neural Networks
(ELSpiNN 2023)
Spiking neural networks have seen a recent resurgence of interest, owing in particular to their greater representational capacity and their much lower computational cost. Reliable learning in such networks, however, has historically been difficult to achieve, in contrast to conventional (non-spiking) networks that can use efficient gradient-descent algorithms, exemplified by backpropagation.
The challenges of adopting key aspects of gradient-descent methods in spiking networks include, for example, the credit assignment problem, the absence of any known mechanism for propagating signals backwards across synapses, and the non-differentiability of the spiking activation function.
There is therefore a need for learning algorithms for spiking models that compete favourably with conventional neural networks in terms of both training time and computational cost.
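To make the non-differentiability challenge concrete, the following sketch shows the basic idea behind surrogate gradients, one approach discussed in the spiking-network literature: the spike function is a Heaviside step whose true derivative is zero almost everywhere, so a smooth stand-in is substituted on the backward pass. The function names, threshold, and the sigmoid-derivative surrogate are illustrative choices, not a method prescribed by this session.

```python
import numpy as np

def spike(v, threshold=1.0):
    # Forward pass: Heaviside step, non-differentiable at the threshold
    # and with zero gradient everywhere else.
    return (v >= threshold).astype(float)

def surrogate_grad(v, threshold=1.0, beta=10.0):
    # Backward pass: smooth sigmoid-derivative pseudo-gradient used in
    # place of the true (zero-almost-everywhere) spike derivative.
    s = 1.0 / (1.0 + np.exp(-beta * (v - threshold)))
    return beta * s * (1.0 - s)

v = np.array([0.2, 0.9, 1.1, 2.0])   # membrane potentials (illustrative)
print(spike(v))                      # binary spike outputs
print(surrogate_grad(v))             # smooth gradients for learning
```

Approaches of this kind let standard gradient-based optimisers be applied to spiking models, which is one of the directions this session invites.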
Organized by Alex Rast and Nigel Crook (Oxford Brookes University, UK)
Paper submission for this special session is through the ESANN 2023 website: https://www.esann.org/