The goal of my research is to strengthen the theoretical basis of modern statistical methods, and in particular of Bayesian methods.
The key object in Bayesian inference is the posterior probability distribution, which captures the knowledge we have about the hidden parameters of a model given some data. In most cases, this posterior is not computable, and most research on Bayesian inference revolves around this issue, either by producing models for which the posterior is computable, or by designing algorithms that compute efficient approximations of the true posterior.
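As a minimal illustration of a model for which the posterior is computable, consider the classic Beta-Binomial conjugate pair (this example and its function name are mine, added for illustration, not part of my research per se):

```python
def beta_binomial_posterior(a, b, k, n):
    """Conjugate update: a Beta(a, b) prior combined with k successes
    in n Bernoulli trials yields a Beta(a + k, b + n - k) posterior."""
    return a + k, b + (n - k)

# A uniform Beta(1, 1) prior updated with 7 successes in 10 trials.
a_post, b_post = beta_binomial_posterior(1, 1, 7, 10)
posterior_mean = a_post / (a_post + b_post)  # mean of a Beta(8, 4)
```

In such conjugate models the update is exact and closed-form; the interesting (and typical) case is when no such closed form exists.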
One aspect of my research centers on a subclass of such approximation methods which iteratively compute a Gaussian approximation to the true posterior. The three best-known methods in this class are the Laplace approximation, Variational Bayes, and Expectation Propagation. I aim to understand why and when these methods work, and how fast they compute an answer. This knowledge will help us choose which method to apply to which problem, and will guide the design of more efficient variants of these methods.
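To give a concrete flavour of the simplest of these methods, here is a one-dimensional sketch of the Laplace approximation: Newton iterations locate the posterior mode, and the negative inverse Hessian of the log-density at the mode gives the variance of the Gaussian approximation. The target density (an unnormalised Beta(8, 4)) and all names are illustrative choices of mine:

```python
def laplace_approx(d_log_p, d2_log_p, x0, iters=50):
    """Laplace approximation in 1-D: find the mode of log p by Newton's
    method, then return (mode, variance) of the fitted Gaussian, where
    the variance is the negative inverse Hessian at the mode."""
    x = x0
    for _ in range(iters):
        x -= d_log_p(x) / d2_log_p(x)  # Newton step on the gradient
    return x, -1.0 / d2_log_p(x)

# Illustrative target: an unnormalised Beta(8, 4) log-density on (0, 1).
a, b = 8, 4
d_log_p = lambda x: (a - 1) / x - (b - 1) / (1 - x)
d2_log_p = lambda x: -(a - 1) / x**2 - (b - 1) / (1 - x)**2

mean, var = laplace_approx(d_log_p, d2_log_p, x0=0.5)
# mean is the Beta(8, 4) mode (a-1)/(a+b-2) = 0.7
```

Variational Bayes and Expectation Propagation fit a Gaussian by quite different criteria (minimising a divergence rather than matching local curvature), which is one reason their behaviour requires separate analysis.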
Another aspect of my research investigates the frequentist properties of Bayesian inference, in order to better bridge the gap between these two subdomains of statistics. This research serves two purposes. On the one hand, it can be used to prove that Bayesian inference is a valid frequentist method. On the other hand, it is interesting in its own right, as it enables us to understand the typical properties of the posterior distribution, and to adapt our methods so that they are efficient in this typical case.
I am currently a lecturer in probability and statistics at EPFL (Ecole Polytechnique Fédérale de Lausanne) in Switzerland. I started this position in September 2016, after successfully defending my PhD in computational neuroscience at the University of Geneva and Paris Descartes University.