Location: IHP, Amphi Darboux
14:00: Pierre Vandekerkhove (Université Gustave Eiffel)
Title: Two-sample contamination testing problem: the Inversion-Best Matching approach
Abstract: In this presentation we consider two-component mixture models with a single known component. This type of model is of particular interest when a known random phenomenon is contaminated by an unknown random effect. In this setup, we propose to test the equality in distribution of the unknown random sources involved in two separate samples generated from such a model. For this purpose, we introduce the IBM (Inversion-Best Matching) approach, which results in a tuning-free, relaxed semiparametric Cramér-von Mises-type two-sample test requiring minimal assumptions about the unknown distributions. The central achievement of our work is a functional central limit theorem for the proportion parameters, along with the unknown cumulative distribution functions of the model, established under natural and interpretable mutual-identifiability conditions specific to the two-sample case. An intensive numerical study, covering a large range of simulation setups, illustrates the asymptotic properties of our test. Finally, our testing procedure, implemented in the admix R package, is applied to a real-life situation: pairwise testing of post-COVID-19 excess mortality profiles across a panel of European countries.
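For intuition about the setup (not the IBM procedure itself, which is implemented in the admix R package), the Python sketch below simulates two samples from such contamination models; the standard normal known component, the Gamma unknown component, and all parameter values are illustrative assumptions. It also shows why a classical two-sample test is not the right tool here: applied to the raw samples, it compares the full contaminated distributions rather than the unknown sources.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def contaminated_sample(n, p, unknown, known=stats.norm(0, 1)):
    """Draw n points from the mixture p * F_unknown + (1 - p) * G_known."""
    from_unknown = rng.random(n) < p
    x = known.rvs(size=n, random_state=rng)
    x[from_unknown] = unknown.rvs(size=from_unknown.sum(), random_state=rng)
    return x

# Both samples share the same unknown component, Gamma(2), but mix it
# with the known N(0, 1) component in different proportions.
x1 = contaminated_sample(2000, p=0.3, unknown=stats.gamma(2))
x2 = contaminated_sample(3000, p=0.6, unknown=stats.gamma(2))

# A classical Cramer-von Mises test on the raw samples rejects equality,
# since it compares the contaminated distributions as a whole rather
# than the unknown sources -- the question the IBM test is built for.
print(stats.cramervonmises_2samp(x1, x2))
```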
15:00: Pierre Humbert (Université Paris-Saclay)
Title: Robust Kernel Density Estimation with Median-of-Means principle
Abstract: In this presentation, I introduce a robust nonparametric density estimator combining the popular Kernel Density Estimation method and the Median-of-Means principle (MoM-KDE). This estimator is shown to achieve robustness for a large class of anomalous data, potentially adversarial. While previous works only prove consistency results under very specific contamination models, this work provides finite-sample high-probability error bounds without any prior knowledge about the outliers. To highlight the robustness of this method, an influence function adapted to the considered O∪I (outliers ∪ inliers) framework is introduced. Finally, MoM-KDE is shown to achieve competitive results compared with other robust kernel estimators, while having significantly lower computational complexity.
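The construction itself is easy to sketch: randomly split the sample into disjoint blocks, compute an ordinary kernel density estimate on each block, and return the pointwise median of the block estimates. Below is a minimal NumPy version with a Gaussian kernel; the block count and bandwidth are fixed by hand here, not chosen by any of the paper's selection rules.

```python
import numpy as np

def mom_kde(data, grid, n_blocks=15, bandwidth=0.3, seed=0):
    """Median-of-Means KDE: pointwise median of per-block Gaussian KDEs."""
    rng = np.random.default_rng(seed)
    blocks = np.array_split(rng.permutation(data), n_blocks)
    estimates = []
    for block in blocks:
        # Gaussian kernel density estimate of this block on the grid.
        u = (grid[:, None] - block[None, :]) / bandwidth
        estimates.append(np.exp(-0.5 * u**2).mean(axis=1)
                         / (bandwidth * np.sqrt(2 * np.pi)))
    return np.median(estimates, axis=0)

# 997 inliers from N(0, 1) plus 3 adversarial points at x = 8: with 15
# blocks, only a minority of blocks can contain outliers, so the
# pointwise median ignores the spurious bump a vanilla KDE would create.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(0, 1, 997), np.full(3, 8.0)])
grid = np.linspace(-4, 10, 200)
density = mom_kde(data, grid)
```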
16:00: Thomas Bonis (Université Gustave Eiffel)
Title: Presentation of the Stein Variational Gradient Descent algorithm
Abstract: Approximating intractable integrals is a recurrent issue in Bayesian statistics. Two very different techniques have been developed to deal with it: variational inference methods and Monte Carlo algorithms. However, both approaches suffer from their own shortcomings. A third approach, called Stein Variational Gradient Descent, was introduced a few years ago. In this talk, I will briefly describe the variational and Monte Carlo approaches before focusing on this new technique. After giving some intuition regarding this algorithm, I will present existing theoretical guarantees for it, as well as open questions regarding its performance. Finally, I will provide a first answer to one of these questions.
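For reference, the SVGD update of Liu and Wang moves a set of particles x_1, ..., x_n along phi(x_i) = (1/n) sum_j [ k(x_j, x_i) grad log p(x_j) + grad_{x_j} k(x_j, x_i) ], where the first term drives particles toward high-density regions of the target p and the second term keeps them spread apart. A minimal NumPy sketch for a one-dimensional standard Gaussian target follows; the fixed RBF bandwidth and step size are arbitrary illustrative choices, not the median heuristic commonly used in practice.

```python
import numpy as np

def svgd_update(x, grad_log_p, h=0.5):
    """One SVGD direction: kernelized gradient plus repulsion (RBF kernel)."""
    diff = x[:, None] - x[None, :]         # diff[i, j] = x_i - x_j
    k = np.exp(-diff**2 / (2 * h**2))      # k(x_j, x_i), symmetric
    grad_k = k * diff / h**2               # d k(x_j, x_i) / d x_j
    return (k @ grad_log_p + grad_k.sum(axis=1)) / len(x)

# Target: standard Gaussian, so grad log p(x) = -x. Particles start far
# from the target and are transported toward it by repeated SVGD steps.
rng = np.random.default_rng(0)
x = rng.normal(-10, 1, size=50)
for _ in range(500):
    x = x + 0.1 * svgd_update(x, grad_log_p=-x)
print(x.mean(), x.std())  # should approach 0 and 1
```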