Spiking neural networks (SNNs) stand at the forefront of efforts to model the brain's intricate dynamics, aiming to replicate how neurons process and transmit information through electrical pulses:
Neurons integrate input signals from other neurons over time;
When a specific threshold is reached, the neuron generates a sharp electrical pulse (or "spike"), which is projected to other neurons through synapses;
After spiking, the neuron undergoes a refractory period (during which it is unresponsive to inputs) before returning to its resting state.
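This integrate-spike-reset cycle can be sketched with a leaky integrate-and-fire (LIF) neuron, one of the simplest spiking models. All parameter values below (membrane time constant, threshold, refractory period, input drive) are illustrative assumptions, not taken from the text:

```python
import numpy as np

def simulate_lif(input_current, dt=0.1, tau_m=10.0, v_rest=0.0,
                 v_th=1.0, v_reset=0.0, t_ref=2.0):
    """Euler integration of the LIF equation dv/dt = (v_rest - v)/tau_m + I(t).
    Times are in ms; returns the list of output spike times."""
    v = v_rest
    refractory_left = 0.0
    spike_times = []
    for step, i_ext in enumerate(input_current):
        if refractory_left > 0:                    # refractory: inputs are ignored
            refractory_left -= dt
            continue
        v += dt * ((v_rest - v) / tau_m + i_ext)   # step 1: integrate inputs
        if v >= v_th:                              # step 2: threshold crossed -> spike
            spike_times.append(step * dt)
            v = v_reset                            # irreversible reset...
            refractory_left = t_ref                # ...then step 3: refractory period
    return spike_times

# A constant suprathreshold drive produces periodic spiking.
spike_times = simulate_lif(np.full(1000, 0.2))
```

Here the three list items map directly onto the three branches of the loop; richer models (e.g., Hodgkin-Huxley) replace the hard threshold and reset with detailed channel dynamics but keep the same qualitative cycle.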
These models are pivotal for both the neuroscience community and machine learning advancements, offering insights into the physiological underpinnings of neural activity and computation. An ideal, comprehensive mathematical theory would map the network architecture, parameters, and stimuli of SNNs to their dynamics. However, the journey to such a theory is fraught with significant challenges. To name a few:
Dimensionality. The exponential growth of complexity with the neuron number N presents a formidable barrier to analysis. Theories from thermodynamics and statistical physics suggest simplifications by assuming N→∞ and then reducing the SNN system to a few differential equations. However, N does not approach infinity in real brains, making these approaches less directly applicable. For example, the mouse cortex contains only O(10^7) neurons. Even in primate brains, while the whole cortex contains as many as O(10^10) neurons, fundamental structures such as minicolumns and hypercolumns contain only O(10^2)-O(10^4) neurons.
Dynamical heterogeneity. SNNs may contain many parameters to capture biological detail. With different parameter choices, SNNs yield a diverse array of dynamics, ranging from stable to chaotic. A crucial phenomenon within this spectrum is neural oscillation, manifesting as rhythmic, synchronized activity of neurons across the cortex. Neural oscillations occur on a continuous spectrum of timescales, from milliseconds to seconds, and are viewed as crucial substrates of many cognitive functions such as attention, learning, and memory. While SNN simulations can emulate neural oscillations as correlated spiking events across neurons, a comprehensive mathematical framework that accurately predicts these dynamics across different timescales is notably absent.
Singularity and irreversibility of spikes. The unique characteristics of neural spikes highlight the nonlinear dynamics and precision timing in neural communication. Unlike the mutual interactions between objects in a Newtonian system, spike-mediated interactions are asymmetric, and spikes are usually modeled as singular signals such as delta functions. Furthermore, the process of a neuron generating a spike, i.e., surpassing a threshold and then resetting, is irreversible. (That is, a neuron at the rest state cannot spontaneously return to the threshold state.) In many regimes, this makes SNN dynamics sensitive to the precise timing of even a single spike.
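This sensitivity can be illustrated with a minimal sketch: a leaky integrate-and-fire neuron driven by delta-function inputs, modeled as instantaneous jumps of the membrane potential. Because the threshold-and-reset is an all-or-none event, shifting a single input spike by a fraction of a millisecond can move the output spike by several milliseconds. All numbers below (coupling weight, time constant, input times) are illustrative assumptions chosen for the demonstration, not values from the text:

```python
import numpy as np

def lif_delta_inputs(input_times, weight=0.456, T=40.0, dt=0.01,
                     tau_m=10.0, v_th=1.0, v_reset=0.0):
    """LIF neuron receiving delta-function inputs: each input spike
    instantaneously increments the membrane potential by `weight`.
    Times are in ms; returns the list of output spike times."""
    n_steps = int(T / dt)
    jumps = np.zeros(n_steps)
    for t in input_times:
        jumps[int(round(t / dt))] += weight
    v, out_spikes = 0.0, []
    for k in range(n_steps):
        v += dt * (-v / tau_m)             # leak (Euler step)
        v += jumps[k]                      # singular (delta-like) input
        if v >= v_th:                      # threshold crossing: spike...
            out_spikes.append(round(k * dt, 2))
            v = v_reset                    # ...and irreversible reset
    return out_spikes

base      = lif_delta_inputs([5.0, 10.0, 15.0, 20.0, 25.0])
perturbed = lif_delta_inputs([5.0, 10.0, 14.8, 20.0, 25.0])  # one input 0.2 ms earlier
```

With these parameters the baseline train makes the neuron fire at t = 20 ms, while the perturbed train, differing by a 0.2 ms shift of one input, leaves the potential just below threshold at t = 20 ms and delays the output spike to t = 25 ms. In a recurrent network such a shift would propagate to downstream neurons and can alter the subsequent trajectory entirely.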
Efforts to address these challenges include model-reduction methods that reduce high-dimensional SNNs to a few simpler differential equations. However, making the analysis tractable usually requires multiple assumptions that may be biologically unrealistic, such as N→∞, coupling weights between neurons tending to 0, or a homogeneous network architecture. Furthermore, most of these studies concentrate on average firing rates, largely overlooking the temporal correlations essential for understanding high-frequency neural oscillations and information transfer between cortical areas.