We propose a mathematical framework for finite-size SNNs, aiming to quantitatively map model parameters to network dynamics. We first plan to apply our framework to the following two types of questions:
The interplay between neural oscillations on different timescales, including transient synchrony taking place within 1-4 milliseconds.
Metastability. Both real brains and SNNs can exhibit multiple stable attractors, induced by external stimuli but also arising in the absence of external stimulation. The spontaneous transitions between these attractors are viewed as biological substrates of decision-making and working memory.
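As a minimal illustration of such spontaneous transitions (a toy sketch with made-up transition probabilities, not tied to any particular SNN), consider a four-state Markov chain with two metastable groups of states, where within-group transitions are frequent and between-group transitions are rare:

```python
import numpy as np

# Toy 4-state chain with two metastable groups ({0,1} and {2,3}).
# Within-group moves are fast; between-group moves occur with small
# probability eps, mimicking spontaneous switches between attractors.
eps = 0.01
P = np.array([
    [0.5 - eps, 0.5,       eps,       0.0],
    [0.5,       0.5 - eps, 0.0,       eps],
    [eps,       0.0,       0.5 - eps, 0.5],
    [0.0,       eps,       0.5,       0.5 - eps],
])

rng = np.random.default_rng(0)
group = lambda s: s // 2  # which metastable group a state belongs to
state, dwell, dwells = 0, 0, []
for _ in range(100_000):
    new = rng.choice(4, p=P[state])
    if group(new) != group(state):
        dwells.append(dwell)  # record time spent in the group just left
        dwell = 0
    else:
        dwell += 1
    state = new

# Rare eps-transitions produce long dwell times, on the order of 1/eps steps.
print(f"mean dwell time between switches: {np.mean(dwells):.1f} steps")
```

The long dwell times separated by abrupt switches are the discrete analogue of the metastable attractor dynamics described above.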
Compared to traditional theories for SNNs based on differential equations, the Markov framework naturally offers several analytical advantages. To name a few:
Spike production is no longer special compared with other network activity, since everything is described by transitions between states;
The long-term average activity of an SNN can be computed directly from the invariant probability measure. By incorporating SNN parameters into the probability transition matrix, we can systematically map parameters to statistics of the dynamics, including firing rates, synchrony, and more;
The Markovian framework admits many classical tools from stochastic analysis. For example, large deviation theory can be used to analyze the stochastic transitions between multiple attractors;
Model reduction can be carried out directly by deleting or merging states associated with rare events; this amounts to deleting and merging rows and columns of the probability transition matrix;
The adjoint flow of the Markov chain can significantly facilitate parameter exploration of SNNs.
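The invariant-measure point above can be made concrete. Given a transition matrix P over network states, the invariant measure pi solves pi P = pi, and any observable averages against pi. Below is a minimal sketch with an illustrative 3x3 matrix and a hypothetical "spikes per state" observable (both are placeholders, not derived from any specific SNN):

```python
import numpy as np

# Illustrative transition matrix over three coarse network states
# (placeholder numbers, not fitted to any SNN).
P = np.array([
    [0.7, 0.2, 0.1],
    [0.3, 0.4, 0.3],
    [0.2, 0.3, 0.5],
])

# Solve pi P = pi, sum(pi) = 1, via the left eigenvector of eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

# Long-run average of a hypothetical observable: spikes emitted per state.
spikes_per_state = np.array([0.0, 1.0, 2.0])
mean_rate = pi @ spikes_per_state
print("invariant measure:", pi, "mean firing statistic:", mean_rate)
```

Once parameters enter P, the map from parameters to long-run statistics is just the map from P to pi, which is what makes the systematic parameter-to-dynamics mapping tractable.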
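The deleting-and-merging reduction above can be sketched as state lumping: columns within a merged block are summed, and rows are averaged with weights from the invariant measure. This is a standard lumping construction shown here with placeholder numbers; it is an illustrative reduction, not a claim that an arbitrary chain is exactly lumpable:

```python
import numpy as np

def lump(P, pi, blocks):
    """Merge states of a Markov chain according to a partition (blocks).
    Columns inside a block are summed; rows are averaged with weights
    proportional to the stationary distribution pi."""
    k = len(blocks)
    Q = np.zeros((k, k))
    for i, bi in enumerate(blocks):
        w = pi[bi] / pi[bi].sum()  # weights of states inside block i
        for j, bj in enumerate(blocks):
            Q[i, j] = w @ P[np.ix_(bi, bj)].sum(axis=1)
    return Q

# Illustrative 3-state chain whose first two states behave similarly.
P = np.array([
    [0.6, 0.3, 0.1],
    [0.3, 0.6, 0.1],
    [0.2, 0.2, 0.6],
])
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

# Merge the two similar states into one coarse state.
Q = lump(P, pi, [np.array([0, 1]), np.array([2])])
print(Q)  # 2x2 reduced transition matrix; rows still sum to 1
```

Deleting a rare state is the same operation with that state's row and column dropped and the remaining rows renormalized.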
(TBC)