Schedule

Session I

9:30 - 9:45

Workshop Introduction

Samuele Virgili, Simone Azeglio, Gabriel Mahuas

9:45 - 10:15

Neural dynamics underlying active vision in the mouse

Cris Niell

Abstract - TBA

10:15 - 10:45

Towards a simplified model of primary visual cortex

Marius Pachitariu

Abstract - TBA

10:45 - 11:15

Coffee Break


11:15 - 11:45

How prospection skews the human representation of space during navigation

Christopher Summerfield

Abstract - TBA

11:45 - 12:15

On assigning semantic meaning to the preferred stimulus of a visual neuron

Stuart Trenholm

Recent advances in deep learning modeling of visual neurons provide a powerful toolset for discovering the visual stimuli that best activate neurons. However, this approach can generate stimuli that are difficult to semantically label or parameterize. Furthermore, if we want to use such tools not only to identify maximally activating stimuli but also to study tuning curves, trajectories through generated image space may also be difficult to describe semantically. Here, we will discuss our results on modeling the feature selectivity of neurons in mouse HVAs in the context of these issues.

12:15 - 12:45

Interacting with the environment through flexible sensory codes

Wiktor Młynarski

Sensory systems are the brain’s window to the world - they represent the organism's surroundings in order to enable successful action. To instantiate such representations efficiently and accurately, the brain must adapt to the structure of natural environments. Indeed, analysis of natural stimulus statistics, grounded in the theoretical framework known as the efficient coding hypothesis, has brought great progress in understanding the principles of sensory information processing - mainly under stable conditions. However, nothing in the natural world is completely static: environments change, animals' goals and demands fluctuate, and the two are coupled in a closed loop - the surroundings can be affected by the animal's actions. In this talk I will discuss our theoretical attempts to understand how and why sensory systems should adapt to changing environments, varying internal states and dynamic behavior in natural environments.

Session II

15:30 - 16:05

A Neuro-AI approach to decrypting neural representations

Andreas Tolias

Abstract - TBA

16:05 - 16:40

The impact of behavior and internal states on subcortical visual processing

Sylvia Schröder

The early visual system is thought to efficiently encode visual input, e.g., by adapting to the recent stimulus history. Based on similar arguments, early visual processing could benefit from adapting to the animal’s current behaviour, its goals and internal states, as these different contexts change the expected stimulus statistics and the importance of specific visual features. Indeed, we find that contextual modulation is evident as early as in the output of the retina and in neurons of the superficial superior colliculus receiving direct retinal input.


Using two-photon imaging of calcium signals, we have recorded responses of large populations of neurons in the superior colliculus (SC) and of retinal axons projecting to the SC in awake mice engaged in different behaviours. When mice were running, spontaneous activity in about half of the recorded retinal axons and SC neurons was either enhanced or suppressed compared to stationary periods. Interestingly, the effect of running on visually driven activity depended on the visual input. While running had purely linear effects on tuning to motion direction, i.e., changing the gain or offset of responses, it shifted neurons’ preferences towards higher temporal and lower spatial frequencies, indicating that neurons are better equipped to encode high visual motion speeds during running.


Independent of the effects of running and arousal, we found that receiving a water reward is an additional modulator of visual responses. Reward increased responses to successive visual stimuli in about 20% of recorded SC neurons. These reward effects could not be explained by pupil-linked arousal or by body movements such as licking, and they affected visual responses on a time scale of seconds. The increase in visual responses after receiving a water reward led to improved decoding of stimulus presence from the neural population activity.

These results show that behavioural and internal-state contexts affect visual processing at a very early stage, possibly to optimize the efficiency of stimulus encoding and to support motivationally driven encoding.

16:40 - 17:10

Coffee Break


17:10 - 17:50

Vision through CEBRA 

Jin Hwa Lee

In the last few years, the field has developed technologies that enable the recording of large neural and behavioural datasets.

To study the neural representation of complex sensory information and behaviour, we need methods to extract meaningful latent spaces from high-dimensional neural data.

CEBRA (Consistent EmBeddings of high-dimensional Recordings using Auxiliary variables) is a novel non-linear machine learning tool that discovers consistent and highly decodable latent neural embeddings by jointly leveraging behavioural and neural data through contrastive learning. In this tutorial, we will walk through the CEBRA algorithm and demonstrate how it can be applied to a visual neural dataset such as the Allen Brain Observatory mouse V1 dataset.
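For orientation, below is a minimal sketch of the typical CEBRA-Behaviour workflow using the library's scikit-learn-style Python API. The data arrays are random placeholders standing in for a recording such as the Allen Brain Observatory dataset, and the hyperparameter values are illustrative; this is not the tutorial's actual code.

```python
# Minimal CEBRA-Behaviour sketch (assumption: placeholder data, illustrative
# hyperparameters; not the tutorial materials).
import numpy as np
import cebra

# Placeholder "recording": neural activity (time x neurons) and a continuous
# auxiliary behavioural variable (time x features) sampled at the same times.
neural_data = np.random.randn(10000, 100).astype(np.float32)
behaviour = np.random.randn(10000, 2).astype(np.float32)

# Configure the model; hyperparameter values here are illustrative.
model = cebra.CEBRA(
    model_architecture="offset10-model",
    batch_size=512,
    learning_rate=3e-4,
    temperature=1.0,
    output_dimension=3,       # dimensionality of the latent embedding
    max_iterations=5000,
    distance="cosine",
    conditional="time_delta",
    device="cuda_if_available",
    verbose=True,
)

# Contrastive training with behaviour as the auxiliary variable, then
# projection of the neural data into the learned latent space.
model.fit(neural_data, behaviour)
embedding = model.transform(neural_data)  # shape: (time, output_dimension)
```

The resulting embedding can then be visualized or fed to a decoder to assess how well stimulus or behavioural variables can be read out from the latent space.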


17:50 - 18:30

Transition to social