Gaussian Information Bottleneck


Starting from jointly Gaussian variables (X, Y), the problem is to find a linear projection A that compresses X while preserving as much information as possible about Y. A tradeoff parameter β governs the balance between compression and information preservation. More details about the implementation, along with some worked examples, are available in the notebook tutorial here: https://github.com/michnard/GIB_tutorial

I refer the reader to the original paper by Gal Chechik, Amir Globerson, Naftali Tishby, and Yair Weiss (JMLR 2005) for mathematical proofs: https://www.jmlr.org/papers/volume6/chechik05a/chechik05a.pdf
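As a rough sketch of the analytic solution derived in that paper, the rows of the optimal A are scaled left eigenvectors of Σ_{x|y} Σ_x⁻¹, and each eigenvector only "switches on" once β exceeds its critical value 1/(1−λᵢ). In Python with NumPy this might look as follows (the function name and interface are mine, not the tutorial's):

```python
import numpy as np

def gib_projection(Sx, Sxy, Sy, beta):
    """Analytic Gaussian IB solution (after Chechik et al., 2005) - a minimal sketch.

    Sx, Sy: covariances of X and Y; Sxy: their cross-covariance.
    Returns the projection matrix A for tradeoff parameter beta.
    """
    # Conditional covariance of X given Y
    Sx_y = Sx - Sxy @ np.linalg.inv(Sy) @ Sxy.T
    M = Sx_y @ np.linalg.inv(Sx)
    # Left eigenvectors of M = (right) eigenvectors of M.T
    lam, V = np.linalg.eig(M.T)
    lam, V = lam.real, V.real
    order = np.argsort(lam)              # smaller lambda = more informative direction
    rows = []
    for i in order:
        lam_i, v = lam[i], V[:, i]
        # component i is active only past its critical beta, 1 / (1 - lam_i)
        if 0 < lam_i < 1 and beta > 1.0 / (1.0 - lam_i):
            alpha = np.sqrt((beta * (1 - lam_i) - 1) / (lam_i * (v @ Sx @ v)))
            rows.append(alpha * v)
        else:
            rows.append(np.zeros_like(v))
    return np.array(rows)
```

Below the smallest critical β the returned A is all zeros (everything is compressed away); as β grows, informative directions are added one by one.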


Mutual Information and Noise Correlations

Neural covariability can be partitioned into an explainable component, driven by external stimuli, oscillations, and other global signals, and an unexplained component, the "noise" correlations, which can lead to profound coding advantages or disadvantages.
Are noise correlations truly information-limiting? 

Try it out for yourself! The widgets run directly in your browser; no installation needed:

1) https://michnard.github.io/MI_max/lab?path=2cells_covariab_source.ipynb

2) https://michnard.github.io/MI_max/lab?path=2cells_MI_fixAvg.ipynb

All code on Github: https://github.com/michnard/MI_max
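For a quick intuition about why the sign of noise correlations matters, here is a minimal Gaussian sketch (my own illustration, not the notebooks' code): in a linear-Gaussian model r = w·s + η, the mutual information is I(s; r) = ½ log₂(det Σ_r / det Σ_η). With two identically tuned neurons, noise correlated along the signal direction limits information, while anticorrelated noise enhances it:

```python
import numpy as np

def mutual_info(w, sigma_s2, Sigma_noise):
    """I(s; r) in bits for r = w*s + eta, s ~ N(0, sigma_s2), eta ~ N(0, Sigma_noise)."""
    w = np.asarray(w, float).reshape(-1, 1)
    Sigma_r = sigma_s2 * (w @ w.T) + Sigma_noise   # total response covariance
    return 0.5 * np.log2(np.linalg.det(Sigma_r) / np.linalg.det(Sigma_noise))

# Two neurons with identical tuning (w = [1, 1]), varying noise correlation rho
w = [1.0, 1.0]
for rho in (-0.5, 0.0, 0.5):
    Sn = np.array([[1.0, rho], [rho, 1.0]])
    print(f"rho = {rho:+.1f}:  I = {mutual_info(w, 1.0, Sn):.3f} bits")
```

With rho = +0.5 the noise lies along the same axis as the signal and information drops; with rho = −0.5 it lies orthogonal to it and information rises.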

SAND Workshop 2023

Intro to spike sorting

A short tutorial on how to sort spikes from extracellular voltage recordings, followed by some basic analyses of the sorted data.
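To give a flavor of the first step of spike sorting, here is a minimal threshold-crossing spike detector (a generic sketch, not the tutorial's actual code; the function name and parameters are illustrative). It estimates the noise level robustly from the median absolute deviation, then extracts a short waveform around each negative-going threshold crossing:

```python
import numpy as np

def detect_spikes(trace, fs, thresh_sd=5.0, window_ms=2.0):
    """Minimal threshold-crossing spike detection on a band-pass-filtered trace.

    trace: extracellular voltage; fs: sampling rate (Hz).
    Returns indices of detected spike troughs and the extracted waveforms.
    """
    # robust noise estimate from the median absolute deviation
    sigma = np.median(np.abs(trace)) / 0.6745
    thresh = -thresh_sd * sigma                      # negative-going spikes
    half = int(window_ms / 1000 * fs / 2)            # half-window in samples
    crossings = np.where((trace[1:] < thresh) & (trace[:-1] >= thresh))[0]
    peaks, waveforms = [], []
    for c in crossings:
        seg = trace[c:c + 2 * half]
        p = c + np.argmin(seg)                       # align to the trough
        if p - half < 0 or p + half > len(trace):
            continue                                 # skip edge spikes
        if peaks and p - peaks[-1] < half:
            continue                                 # simple refractory check
        peaks.append(p)
        waveforms.append(trace[p - half:p + half])
    return np.array(peaks), np.array(waveforms)
```

The extracted waveforms would then typically be reduced (e.g. by PCA) and clustered to assign spikes to putative units.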


All code on Github: https://github.com/michnard/SpikeSortingTutorial

Analysis of e-phys data release