Energy function and emergent spiking dynamics:

We introduced a simple model for the net extrinsic power that a spiking neural network consumes during spike generation, transmission, and stimulation. The network shows emergent spiking activity when it minimizes this metric using growth transforms, an iterative algorithm for minimizing a polynomial objective function under bound constraints.
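The growth-transform update can be illustrated with the classical Baum-Eagon form that the bound-constrained variant builds on. The sketch below is our own toy example, not the paper's model: it is shown in its maximizing form over the probability simplex (minimization under bound constraints uses a modified update), and the polynomial is an arbitrary illustration.

```python
import numpy as np

# Classical Baum-Eagon growth transform (illustrative sketch): for a polynomial
# P(x) with non-negative coefficients over the simplex (x_i >= 0, sum_i x_i = 1),
# the multiplicative update below increases P monotonically at every step while
# automatically keeping x on the simplex.

def growth_transform_step(x, grad_P):
    """One update: x_i <- x_i * (dP/dx_i) / sum_j x_j * (dP/dx_j)."""
    g = grad_P(x)
    return x * g / np.dot(x, g)

# Toy objective P(x) = x0^2 * x1, maximized on the simplex at x = (2/3, 1/3).
def grad_P(x):
    return np.array([2.0 * x[0] * x[1], x[0] ** 2])

x = np.array([0.9, 0.1])
for _ in range(50):
    x = growth_transform_step(x, grad_P)

print(np.round(x, 3))  # converges to the maximizer (2/3, 1/3)
```

Note that the update needs no step size and no explicit projection, which is what makes it attractive for bound-constrained network objectives.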

Single-neuron responses:

Single neurons show a vast repertoire of response characteristics and dynamical properties that enrich their computational capabilities at the network level. Our model combines a dominant sub-threshold dynamics, which determines the optimal solution, with a modulating super-threshold dynamics, which determines the trajectory the neurons follow to converge to that optimum. The latter can be tuned in a multitude of ways to make a neuron exhibit many of these response characteristics without affecting the underlying optimization process.
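As a deliberately simplified analogy (not the paper's neuron equations), this split between an invariant optimum and a tunable trajectory can be seen even in plain gradient descent: changing the step size reshapes the path from a smooth approach into an oscillatory one, while the minimizer stays the same.

```python
import numpy as np

# Simplified analogy, not the neuron model itself: the fixed point of the
# optimization (the "optimal solution") is unchanged while the trajectory
# toward it (the "modulating dynamics") is reshaped by the step size.

def descend(step, x0=5.0, steps=200):
    """Minimize f(x) = x^2 by gradient descent; return the full trajectory."""
    x, path = x0, []
    for _ in range(steps):
        x -= step * 2.0 * x          # gradient of x^2 is 2x
        path.append(x)
    return np.array(path)

smooth = descend(0.1)    # monotone decay toward the minimum at 0
ringing = descend(0.9)   # overshooting step: sign-flipping (oscillatory) decay
```

Both trajectories converge to the same minimum at 0; only the dynamics along the way differ, which is the sense in which response characteristics can be programmed without disturbing the optimization.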

Coupled networks:

The model extends to a coupled network with excitatory and inhibitory connections, which exhibits distinct population-level dynamics depending on the network state.

For example, as the network starts to converge to the optimal solution, it can be made to globally adapt to reduce the firing rate across the entire network without affecting the steady-state solution. This can be used to design power-efficient spiking networks where power is dissipated only during transients.

The coupled spiking network can also be used as a short-term memory element, exhibiting history-dependent stimulus response.

Learning and plasticity:

The growth transform network derives its responses continuously by minimizing a system objective function for a fixed set of weights. We can simultaneously optimize the network with respect to its weights using a slower dynamical system. Optimizing under different constraints yields different plasticity curves commonly observed in neurobiology. The network can also learn in a supervised manner, adapting its weights in response to a teaching signal.
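A minimal sketch of this two-timescale idea follows. It is our own toy construction, not the paper's equations: the target weight 2.0, the learning rates, and the loop counts are arbitrary. A fast inner loop relaxes the neuron response for fixed weights, and a much slower outer loop adapts the weight toward a teaching signal, delta-rule style.

```python
import numpy as np

# Two-timescale sketch (illustrative only): the fast loop settles the response
# v for the current weight w, and the slow loop nudges w using a supervised
# teaching signal y generated by a hypothetical target weight of 2.0.

rng = np.random.default_rng(0)
w = 0.0                        # synaptic weight (slow variable)
eta_fast, eta_slow = 0.5, 0.1

for _ in range(200):           # slow timescale: weight updates
    s = rng.uniform(-1, 1)     # stimulus
    y = 2.0 * s                # teaching signal from the target weight
    v = 0.0
    for _ in range(50):        # fast timescale: response relaxation
        v -= eta_fast * (v - w * s)   # gradient descent on (v - w*s)^2 / 2
    w += eta_slow * (y - v) * s       # slow supervised weight update

print(round(w, 2))  # approaches the target weight 2.0
```

The separation of timescales is what lets the response dynamics stay near the optimum of the fixed-weight objective while the weights drift slowly underneath it.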

Spiking and bursting support vector machines:

We used the network to solve binary classification problems in a support vector machine framework that exhibits noise-shaping properties similar to sigma-delta modulation and learns to encode information using spikes and bursts. The emergent spiking and bursting dynamics of each neuron encode its margin of separation from the classification hyperplane, whose parameters are encoded by the network's population dynamics. Only the 'support vector' neurons, i.e., the neurons closest to the hyperplane, exhibit spiking, while neurons far from the classification hyperplane (and hence less important) remain silent.
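The margin-dependent spiking pattern can be illustrated with an ordinary soft-margin SVM in place of the growth-transform formulation. The sketch below is our own toy (data, rates, and thresholds are arbitrary choices): it trains a linear SVM by sub-gradient descent on the hinge loss, then flags as "spiking" only the training points whose margin sits in the boundary band, i.e. the support vectors.

```python
import numpy as np

# Illustrative sketch, not the paper's growth-transform SVM: sub-gradient
# descent on the regularized hinge loss, followed by a margin check that
# mimics the spiking pattern (only support-vector neurons are active).

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

w, b, lam, eta = np.zeros(2), 0.0, 0.05, 0.1
for _ in range(2000):
    viol = y * (X @ w + b) < 1                     # hinge-active points
    w -= eta * (lam * w - (y[viol][:, None] * X[viol]).sum(axis=0) / len(X))
    b -= eta * (-(y[viol]).sum() / len(X))

margins = y * (X @ w + b)
spiking = margins <= 1.05        # neurons at the margin boundary "spike"
print(f"{spiking.sum()} of {len(X)} neurons spike")
```

As in the network, the points far from the hyperplane end up with large margins and stay silent, while the few points that pin down the decision boundary remain active.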