Venue: IHP, Yvonne Choquet-Bruhat lecture hall (second floor of the Perrin building)
14:00: Thanh Mai Pham Ngoc (LAGA, Université Sorbonne Paris Nord)
Title: Adaptive estimation for nonparametric circular regression with errors in variables
Abstract: Circular or angular data are encountered in various scientific fields, such as biology (directions of animal migration), bioinformatics (protein conformational angles), geology (rock fracture orientations), medicine (circadian rhythms), forensics (crime timing), and the social sciences (time-of-day or calendar effects). This talk investigates the nonparametric estimation of a circular regression function in an errors-in-variables framework. Two settings are studied, depending on whether the covariates are circular or linear. Adaptive estimators are constructed and their theoretical performance is assessed through convergence rates over Sobolev and Hölder smoothness classes. The obtained rates reveal the specific nature of regression with circular data and corrupted measurements, which involves a deconvolution problem. Numerical experiments on simulated and real datasets illustrate the practical relevance of the methodology.
Joint work with Tien Dat Nguyen (Vietnam National University, Ho Chi Minh City, Vietnam)
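For readers unfamiliar with circular regression, the sketch below shows a baseline (non-adaptive) kernel estimator for a circular covariate, using a von Mises kernel whose concentration parameter plays the role of an inverse bandwidth. This is only an illustrative starting point, without the measurement-error/deconvolution step or the adaptive bandwidth selection that the talk addresses; all function names here are for illustration.

```python
import numpy as np

def von_mises_kernel(theta, kappa):
    """Unnormalised von Mises kernel on the circle; kappa controls
    concentration (large kappa = small effective bandwidth)."""
    return np.exp(kappa * np.cos(theta))

def circular_nw_regression(x_obs, y_obs, x_grid, kappa=8.0):
    """Nadaraya-Watson estimate of m(x) = E[Y | X = x] for a circular
    covariate X; the cosine inside the kernel handles wrap-around."""
    w = von_mises_kernel(x_grid[:, None] - x_obs[None, :], kappa)
    return (w * y_obs).sum(axis=1) / w.sum(axis=1)

# Toy data: Y = sin(X) + noise, with X uniform on [0, 2*pi)
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0 * np.pi, 500)
y = np.sin(x) + 0.2 * rng.standard_normal(500)
grid = np.linspace(0.0, 2.0 * np.pi, 100, endpoint=False)
m_hat = circular_nw_regression(x, y, grid)
```

In the errors-in-variables setting of the talk, one observes a noisy version of X, and the kernel must be replaced by a deconvolution kernel built from the noise characteristic function.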
15:00: Etienne Gauthier (Inria & ENS)
Title: Recent advances in conformal prediction with e-values
Abstract: Conformal prediction has become a versatile framework for distribution-free uncertainty quantification, offering coverage guarantees under minimal assumptions. Traditionally, these methods rely on p-values to guarantee marginal coverage when the data are exchangeable. More recently, e-values have emerged as a powerful and flexible tool in statistics, and their integration into conformal prediction has opened the door to constructing valid prediction sets in more complex and challenging settings. In this talk, I will give an overview of these advances, explain the main ideas behind using e-values in conformal prediction, and highlight examples that illustrate both their promise and the open questions they raise.
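As background for the "traditional" p-value-based approach the abstract contrasts with, here is a minimal split conformal sketch for regression with absolute-residual scores; under exchangeability it gives marginal coverage at least 1 - alpha. This is a generic textbook construction, not the e-value machinery of the talk, and the function names are illustrative.

```python
import numpy as np

def split_conformal_interval(model_predict, x_cal, y_cal, x_new, alpha=0.1):
    """Split conformal prediction interval with scores |y - model(x)|.
    Guarantees marginal coverage >= 1 - alpha if calibration and test
    points are exchangeable."""
    scores = np.abs(y_cal - model_predict(x_cal))
    n = len(scores)
    # Conformal quantile: the ceil((n + 1)(1 - alpha))-th smallest score
    k = int(np.ceil((n + 1) * (1.0 - alpha)))
    q = np.sort(scores)[min(k, n) - 1]
    pred = model_predict(x_new)
    return pred - q, pred + q

# Toy usage: a known linear model with unit Gaussian noise
rng = np.random.default_rng(0)
model = lambda x: 2.0 * x
x_cal = rng.standard_normal(1000)
y_cal = model(x_cal) + rng.standard_normal(1000)
x_test = rng.standard_normal(2000)
y_test = model(x_test) + rng.standard_normal(2000)
lo, hi = split_conformal_interval(model, x_cal, y_cal, x_test, alpha=0.1)
coverage = np.mean((y_test >= lo) & (y_test <= hi))
```

The e-value variants discussed in the talk replace this rank-based (p-value) calibration with e-value calibration via Markov's inequality, which is what makes the more complex settings tractable.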
16:00: Rafael Pinot (LPSM, Sorbonne Université)
Title: Federated Learning with Adversarial Nodes
Abstract: The vast amount of data collected every day, combined with the increasing complexity of machine learning models, has led to the emergence of distributed learning schemes. In the now-classical federated learning architecture, the learning procedure consists of multiple data owners (or clients) collaborating to build a global model with the help of a central entity (the server), typically using a distributed variant of SGD. Nevertheless, this algorithm is vulnerable to "misbehaving" clients that could (either intentionally or inadvertently) sabotage the learning by sending arbitrarily bad gradients to the server. These clients are commonly referred to as Byzantine and can model very versatile behaviors, ranging from crashing machines in a data center to colluding bots attempting to bias the outcome of an online poll. The purpose of this talk is to give a short introduction to the emerging topic of Byzantine robustness. Essentially, the goal is to enhance distributed optimization algorithms, such as distributed SGD, in a way that guarantees convergence despite the presence of some Byzantine clients. We will take the time to present the setting and review some recent results as well as open problems in the community.
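To make the robustness idea concrete, the sketch below contrasts plain gradient averaging with a simple robust aggregation rule, the coordinate-wise median, which tolerates a minority of Byzantine clients. This is one standard rule from the Byzantine-robustness literature, shown as an assumption-laden toy, not the specific algorithms of the talk.

```python
import numpy as np

def coordinate_wise_median(gradients):
    """Robust server-side aggregation: take the median of the clients'
    gradients in each coordinate. A minority of arbitrarily bad
    gradients cannot drag the result far, unlike plain averaging."""
    return np.median(np.stack(gradients), axis=0)

# Toy round: 7 honest clients report gradients near g = [1, -2],
# while 3 Byzantine clients send arbitrarily large garbage.
rng = np.random.default_rng(1)
g = np.array([1.0, -2.0])
honest = [g + 0.1 * rng.standard_normal(2) for _ in range(7)]
byzantine = [np.array([1e6, -1e6]) for _ in range(3)]

mean_agg = np.mean(np.stack(honest + byzantine), axis=0)    # wrecked
robust_agg = coordinate_wise_median(honest + byzantine)     # close to g
```

In a full distributed-SGD loop the server would apply such a rule at every round before taking a gradient step; the talk covers when and why such schemes provably converge.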