Statistical learning with Lipschitz and convex loss functions, Chinot G, Lecué G, Lerasle M, Probability Theory and Related Fields (PTRF)
Robust learning and complexity dependent bounds for regularized problems, Chinot G,
Robust high dimensional learning for Lipschitz and convex losses, Chinot G, Lecué G, Lerasle M, Journal of Machine Learning Research.
ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels, Chinot G, Electronic Journal of Statistics
Gradient Descent can Learn Less Over-parameterized Two-layer Neural Networks on Classification Problems, Chinot G, Nitanda A, Suzuki T,
On the robustness of the minimum ℓ2 interpolator, Chinot G, Lerasle M, Bernoulli.
On the robustness of minimum-norm interpolators and regularised empirical risk minimizers, Chinot G, Löffler M, van de Geer S, Annals of Statistics.
AdaBoost and robust one-bit compressed sensing, Chinot G, Kuchelmeister F, Löffler M, van de Geer S, Mathematical Statistics and Learning.
Computational Uncertainty Quantification: Mathematical Foundations, Methodology & Data, talk on "AdaBoost and one-bit compressed sensing". Vienna, Austria, June 2022
Statistics seminar, ENSAI Rennes, talk on "AdaBoost and one-bit compressed sensing". Rennes, France, May 2022
Meeting in Mathematical Statistics 2018, talk about "Minimum norm interpolators". Marseille, France, December, 2020
Meeting in Mathematical Statistics 2018, talk about "Regularized robust learning with Lipschitz and convex loss functions". Frejus, France, December 16th-21th, 2018
Talks, "Robust learning with Lipschitz and convex loss functions", October,8th 2018 at AgroParisTech
Workshop "Statistical Inference for Structured High-dimensional Models", Oberwolfach, March 11th-17th ,2018
Workshop "Meeting in Mathematical Statistics", CIRM, France, December 18th-22th, 2017
July-November 2019, intern at RIKEN AIP, Tokyo, under the supervision of Professor Taiji Suzuki