Flexible information processing in complex networks: from brains to neuromorphic computing
Christoph Kirst, University of California San Francisco (UCSF)
Abstract:
Considerable evidence shows that the brain can reconfigure certain operations “on the fly”, at speeds seemingly incompatible with lasting plastic changes in the underlying neuroanatomy. Such dynamic reconfiguration is observed in the visual system, in attention-guided perception, and in context-based processing. Imaging studies of global brain activity have further uncovered that information may be exchanged between brain areas on an “as needed” basis. Loss of this flexibility has been implicated in neurological and psychiatric disorders. However, the network-dynamical mechanisms underlying flexible computational reconfiguration of neuronal networks are not well understood.
Here I identify [1,2] a generic mechanism to flexibly distribute information in complex networks. We propose that neuronal network activity has two separate components: a collective reference state, on top of which information is encoded and distributed in deviations from this reference, a networked analog of how radio signals broadcast information via frequency or amplitude modulation. Switching between dynamical reference states then enables fast and flexible rerouting of information through the network. In coupled oscillator networks we show analytically how the physical network structure and the dynamical reference state co-act to generate a specific information routing pattern [1].
We then discuss how this mechanism can be used to flexibly reconfigure computations [2]. We show numerically that such a mechanism can be employed for self-organized information processing, naturally enabling context-dependent pattern recognition in an oscillatory Hopfield network and an analog version of belief propagation.
We are currently exploring learning strategies within this approach [3] and developing novel data analysis tools combining dimension reduction and dynamic motif detection to identify possible reference dynamics in multi-site electrode recordings of neuronal brain activity.
If time permits, we will also discuss how we are currently using our brain-inspired approach to design novel neuromorphic hardware based on energy-efficient superconducting oscillators [4].
[1] Kirst, Timme, Battaglia, Nature Communications (2016)
[2] Kirst, Magnasco, Modes, Current Opinion in Systems Biology (2017)
[3] Zhang, Kirst (in prep)
[4] Cheng, Vasudevan*, Kirst*, IEEE Transactions on Applied Superconductivity (2023)
Bio:
Dr. Kirst studies how brains perform computations with a focus on how neuronal circuits achieve their flexible function and coordinate processing among different sub-networks.
He works at the interface of mathematics, physics, computer science, and neurobiology to build theoretical, mechanistic, and conceptual understanding of flexible brain function. He collaborates closely with experimentalists across a broad range of model systems, data sets, and experimental paradigms to investigate the structure, dynamics, modulation, and function of single neurons, neuronal circuits, large-scale neuronal networks, and whole brains.
Summary:
Focus: computational neuroscience
How do brains generate complex behaviors?
Interaction of complex areas:
Brain structure
Brain activity
Behavioral dynamics
Complexity spans
Single neurons:
Response to inputs
Dendritic integration
Action potential (‘decision’)
Synaptic transmission
May change over time
Neuronal network dynamics:
Synchronization, oscillation, activity waves
Irregular, chaotic dynamics
Network function can depend on dynamical state
Functions
Signal gating
Binding by synchrony
Generation of behavioural outputs
Focus of talk: how can we use dynamical state dependency to affect computations in brain circuits
Example: face made out of fruit can be seen holistically or as individual fruit
Attention can vary selectively
Can override default face detection neural circuitry to focus on fruits
Collective neuronal oscillations
Not clear whether they are an epiphenomenon or have a functional role in the brain
Hypothesis: communication through coherence
Suppose two different brain regions undergo oscillations in neural activity
Groups of neurons fire at some frequency
Individual neurons don’t fire on every cycle but, when they do, fire in phase with the group’s common rhythm
If two brain regions’ firing is aligned in phase (adjusted for speed of signal propagation between them) it's easier for them to communicate
This hypothesis is currently supported mostly by indirect evidence, but the evidence base is evolving
Analyzed the causal structure of brain firing
Delayed mutual information analysis
Take two signals X, Y, one delayed by a constant time relative to the other
Mutual information between X and Y: compare
Joint probability p(X,Y) to
Product of the marginals p(X)p(Y)
Evaluate their Kullback-Leibler divergence (the entropy of the dependence)
Generally not a symmetric relationship
Can be affected by auto-correlation within each time series
There are variants that condition the prediction of a signal’s future on that signal’s own past, removing self-predictability from the estimate (transfer entropy)
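The delayed mutual information analysis above can be sketched with a simple histogram estimator; the bin count, signal length, and toy signals below are illustrative choices, not the talk's actual pipeline:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """I(X;Y) from a 2D histogram: sum p(x,y) * log( p(x,y) / (p(x)p(y)) )."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)          # marginal of X
    py = pxy.sum(axis=0, keepdims=True)          # marginal of Y
    nz = pxy > 0                                 # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def delayed_mi(x, y, delay):
    """I(X_t ; Y_{t+delay}); generally asymmetric in the sign of the delay."""
    if delay > 0:
        return mutual_information(x[:-delay], y[delay:])
    if delay < 0:
        return mutual_information(x[-delay:], y[:delay])
    return mutual_information(x, y)

# Toy example: y is a noisy copy of x shifted by 5 samples,
# so the delayed MI peaks at delay = +5 and is near zero elsewhere.
rng = np.random.default_rng(0)
x = rng.normal(size=10000)
y = np.roll(x, 5) + 0.1 * rng.normal(size=10000)
mis = {d: delayed_mi(x, y, d) for d in (-5, 0, 5)}
```

Scanning the delay and locating the peak gives a directional, time-resolved picture of which signal predicts which.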
Neural population: Wilson-Cowan Neural Oscillator
Two populations of neurons that excite/inhibit each other in a feedback cycle
The two population time series can be reduced to a single time series of the system’s phase along its cycle (a more stable, less variable signal)
Can look at the dynamics of multiple such oscillators, accounting for the phase difference between their time series evolution
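A minimal numerical sketch of such an oscillator and its phase reduction, assuming the classic Wilson-Cowan coupling pattern (16, 12, 15, 3) with illustrative input offsets chosen so the E/I pair sits on an unstable fixed point and settles onto a limit cycle (not the talk's exact parameters):

```python
import numpy as np

def wilson_cowan(T=200.0, dt=0.005):
    """Euler integration of one Wilson-Cowan excitatory/inhibitory pair."""
    f = lambda u: 1.0 / (1.0 + np.exp(-u))      # sigmoidal population rate
    E, I = 0.4, 0.2
    traj = np.empty((int(T / dt), 2))
    for k in range(len(traj)):
        dE = -E + f(16 * E - 12 * I - 2.0)      # E excited by E, inhibited by I
        dI = -I + f(15 * E - 3 * I - 6.0)       # I driven by E
        E, I = E + dt * dE, I + dt * dI
        traj[k] = E, I
    return traj

traj = wilson_cowan()
E, I = traj[:, 0], traj[:, 1]
# Phase reduction: collapse the 2-D limit cycle to one angle around its
# center -- a simpler, less variable signal than the raw rates.
phase = np.unwrap(np.arctan2(I - I.mean(), E - E.mean()))
```

With these weights the fixed point at (0.5, 0.5) is an unstable spiral inside the invariant box [0,1]², so the trajectory winds onto a limit cycle and the unwrapped phase grows steadily.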
Analyze the behavior of two such oscillators
Circuit with biological parameters shows bistability between two phase-locked states with equal and opposite phase lags
Information flows into a specific direction between the two oscillators for these offset phases
Two phase delays correspond to two different directions of information flow
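This bistability can be illustrated in a phase-reduced toy model; the two-harmonic coupling function g(d) = sin(d) - sin(2d) below is a hypothetical choice (not the talk's derived coupling) that makes the phase difference bistable at ±π/3:

```python
import numpy as np

def simulate(phi0, K=0.5, omega=1.0, T=100.0, dt=0.01):
    """Two identical phase oscillators, symmetrically coupled through
    g(d) = sin(d) - sin(2d). The phase difference phi = th2 - th1 obeys
    dphi/dt = K * (g(-phi) - g(phi)) = 2K * (sin(2*phi) - sin(phi)),
    which has stable fixed points at phi = +pi/3 and phi = -pi/3."""
    g = lambda d: np.sin(d) - np.sin(2 * d)
    th1, th2 = 0.0, phi0
    for _ in range(int(T / dt)):
        d1 = omega + K * g(th2 - th1)
        d2 = omega + K * g(th1 - th2)
        th1, th2 = th1 + dt * d1, th2 + dt * d2
    # wrap the final phase difference into (-pi, pi]
    return float(np.angle(np.exp(1j * (th2 - th1))))

lead = simulate(+0.5)   # settles near +pi/3: oscillator 2 leads
lag  = simulate(-0.5)   # settles near -pi/3: oscillator 1 leads
```

Which oscillator leads fixes the sign of the phase lag, and hence which direction the delayed-MI analysis would identify as the dominant information flow.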
Can use these dynamics to build dynamic signal routing mechanisms
Can flexibly couple / decouple circuits
Can flexibly change direction of signal flow
Can propagate signals in a given information-flow pattern as long as the network stays within the basin of attraction of its reference oscillation
Modifying local oscillation frequency can dynamically change information flow of multiple remote oscillatory networks
Signals that modulate the amplitude on top of the phase reference can carry additional information (amplitude-modulation-like)
Can build large hierarchical circuits from networks of multiple oscillatory subnetworks (networks of networks)
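The frequency-based switching above can be sketched in the same spirit: two phase oscillators with a bistable two-harmonic coupling g(d) = sin(d) - sin(2d) (an illustrative choice, not the talk's exact model), where a small detuning of one unit's natural frequency selects which locked branch, and hence which direction of flow, the pair falls into:

```python
import numpy as np

def lock_state(domega, K=0.5, T=200.0, dt=0.01):
    """Phase difference phi = th2 - th1 of two coupled phase oscillators,
    with unit 2 detuned by domega. Without detuning the locked states
    phi = +pi/3 and phi = -pi/3 are both stable; starting from phi = 0,
    the sign of domega breaks the symmetry and picks one branch."""
    g = lambda d: np.sin(d) - np.sin(2 * d)
    phi = 0.0
    for _ in range(int(T / dt)):
        phi += dt * (domega + K * (g(-phi) - g(phi)))
    # wrap the final phase difference into (-pi, pi]
    return float(np.angle(np.exp(1j * phi)))

up   = lock_state(+0.05)    # unit 2 sped up  -> locks with unit 2 leading
down = lock_state(-0.05)    # unit 2 slowed   -> locks with unit 1 leading
```

A purely local parameter change (one unit's frequency) thus reroutes the effective direction of communication without touching any connection.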
Modeled the use of these circuits for signal recognition
Hopfield Network architecture
Different sub-circuits detect different image features (e.g. edges, stripes, colors)
Oscillator interactions dynamically couple these sub-circuits for image detection depending on each subnetwork’s performance
Only 'valuable' information is propagated through the network as context information
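A classical binary Hopfield network sketches the pattern-recognition substrate; the oscillatory, context-gated version discussed in the talk replaces the binary units with oscillators so that reference dynamics can gate which subnetwork's result propagates, which is beyond this minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

# Store P random +/-1 patterns in N units via the Hebbian rule.
N, P = 100, 3
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N              # symmetric Hebbian weights
np.fill_diagonal(W, 0.0)                     # no self-coupling

def recall(cue, steps=20):
    """Asynchronous sign updates, which descend the Hopfield energy."""
    s = cue.copy()
    for _ in range(steps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt 15% of pattern 0, then let the network complete it.
cue = patterns[0].copy()
flip = rng.choice(N, size=15, replace=False)
cue[flip] *= -1
out = recall(cue)
overlap = (out @ patterns[0]) / N            # 1.0 = perfect recall
```

At this low load (3 patterns in 100 units) the corrupted cue lies well inside the stored pattern's basin of attraction, so recall is essentially perfect.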
Currently researching learning mechanisms for these oscillator-based circuits