Session of 11 April 2022

Session organized by Alain Célisse and Céline Duval

Location: IHP, amphi Darboux


14:00: Ester Mariucci (Université Versailles Saint Quentin)

Title: Nonparametric estimation of the Lévy density

Abstract: We consider the problem of estimating the Lévy density of a pure jump Lévy process, possibly of infinite variation, from the high-frequency observation of one trajectory. To construct an estimator of the Lévy density directly, we use a compound Poisson approximation and build a linear wavelet estimator. Its performance is studied in terms of $L_p$ loss functions, $p\geq1$, over Besov balls. To show that the resulting rates are minimax-optimal for a large class of Lévy processes, we propose new non-asymptotic bounds on the cumulative distribution function of Lévy processes whose Lévy density is bounded from above, in a neighbourhood of the origin, by the density of an alpha-stable-type Lévy process. This is joint work with Céline Duval.
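The compound Poisson approximation behind the estimator can be illustrated with a toy kernel version (not the linear wavelet estimator of the talk): at high frequency, a nonzero increment of a compound Poisson process almost always contains exactly one jump, so a kernel density estimate of the increments, rescaled by 1/(nΔ), approximates the Lévy density ν(x) = λf(x) away from the origin. The Gaussian jump law, intensity, bandwidth, and kernel below are all illustrative assumptions.

```python
import numpy as np

# Toy sketch: compound Poisson process with intensity lam and N(2, 1)
# jump law, observed as n increments over time steps of length Delta.
# These modelling choices are hypothetical, for illustration only.
rng = np.random.default_rng(1)
lam, Delta, n = 2.0, 0.01, 200_000
counts = rng.poisson(lam * Delta, size=n)          # jumps per increment
increments = np.array([rng.normal(2.0, 1.0, c).sum() for c in counts])

def nu_hat(x, h=0.2):
    """Kernel estimator of the Lévy density at x: at high frequency a
    nonzero increment carries ~one jump, so (1/(n*Delta)) * sum_i
    K_h(increment_i - x) approximates nu(x) = lam * f(x) away from 0."""
    K = np.exp(-0.5 * ((increments - x) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return K.sum() / (n * Delta)
```

Here `nu_hat(2.0)` should be close to the true value λf(2) = 2/√(2π) ≈ 0.80, up to smoothing bias.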


15:00: El Mehdi Saad (École Polytechnique)

Title: Fast rates for prediction with limited expert advice

Abstract: We investigate the problem of minimizing the excess generalization error with respect to the best expert prediction in a finite family, in the stochastic setting and under limited access to information. We assume that the learner can only query a limited number of expert advices per training round, as well as for prediction. Assuming that the loss function is Lipschitz and strongly convex, we show that if we are allowed to see the advice of only one expert per round for T rounds in the training phase, or to use the advice of only one expert for prediction in the test phase, the worst-case excess risk is Ω(1/√T) with probability bounded below by a constant. However, if we are allowed to see at least two actively chosen expert advices per training round and to use at least two experts for prediction, the fast rate O(1/T) can be achieved. We design novel algorithms achieving this rate in this setting, as well as in the setting where the learner has a budget constraint on the total number of observed expert advices, and we give precise instance-dependent bounds on the number of training rounds and queries needed to achieve a given generalization error precision.
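A minimal successive-elimination sketch of the "two actively chosen queries per round" idea: querying two experts per round lets the learner compare empirical losses directly and drop experts whose loss gap to the current leader exceeds a confidence width. The loss model, elimination schedule, and thresholds below are illustrative, not the algorithms of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
K, T = 4, 5000
true_losses = np.array([0.5, 0.3, 0.45, 0.6])    # hypothetical mean losses

def observe(i):
    """Noisy bounded loss of expert i on one fresh sample."""
    return float(np.clip(true_losses[i] + rng.normal(0, 0.1), 0, 1))

active = list(range(K))                          # surviving candidates
sums = np.zeros(K)
counts = np.zeros(K)
for t in range(1, T + 1):
    # query two active experts per round, round-robin over the active set
    for i in (active[t % len(active)], active[(t + 1) % len(active)]):
        sums[i] += observe(i)
        counts[i] += 1
    if t % 100 == 0 and len(active) > 1:
        means = sums[active] / counts[active]
        width = np.sqrt(2 * np.log(T) / counts[active])
        best = means.min()
        # keep experts still statistically compatible with the leader
        active = [a for a, m, w in zip(active, means, width)
                  if m - w <= best + width.min()]

best_expert = min(active, key=lambda i: sums[i] / counts[i])
```

With the gaps above, suboptimal experts are eliminated once their confidence widths shrink below their loss gap, so queries concentrate on the leading candidates.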


16:00: Alessandro Rudi (INRIA and ENS Paris)

Title: Representing non-negative functions, with applications to non-convex optimization and beyond

Abstract: Many problems in applied mathematics are naturally expressed in terms of non-negative functions. While linear models are well suited to representing real-valued functions, being at once very expressive and flexible, the situation is different for non-negative functions, where existing models lack one of these good properties. In this talk we present a rather flexible and expressive model for non-negative functions. We will show direct applications to probability representation and non-convex optimization. In particular, the model allows us to derive an algorithm for non-convex optimization that is adaptive to the degree of differentiability of the objective function and achieves optimal rates of convergence. Finally, we show how to apply the same technique to other interesting problems in applied mathematics that can easily be expressed in terms of inequalities.
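The model of the referenced papers represents a non-negative function as f(x) = k(x)ᵀ A k(x) with A positive semidefinite, which is non-negative by construction while remaining linear in A. A minimal sketch, with illustrative anchor points and a Gaussian kernel (the papers work in general reproducing kernel Hilbert spaces):

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian kernel k(x, y); an illustrative choice of kernel."""
    return np.exp(-((x - y) ** 2) / (2 * sigma ** 2))

def psd_model(x, anchors, B, sigma=1.0):
    """Evaluate f(x) = k(x)^T A k(x) with A = B B^T, hence A is PSD
    and f(x) = ||B^T k(x)||^2 >= 0 for every x, by construction."""
    k = gaussian_kernel(x, anchors, sigma)   # feature vector k(x)
    v = B.T @ k
    return float(v @ v)

rng = np.random.default_rng(0)
anchors = np.linspace(-1.0, 1.0, 5)          # illustrative anchor points
B = rng.standard_normal((5, 3))              # any B yields a PSD A = B B^T

values = [psd_model(x, anchors, B) for x in np.linspace(-2, 2, 21)]
```

Every entry of `values` is non-negative regardless of `B`, which is the structural guarantee the PSD parametrization provides; fitting `B` (or `A` directly, under a PSD constraint) to data is where the actual estimation work lies.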

Ulysse Marteau-Ferey, Francis Bach, Alessandro Rudi. Non-parametric Models for Non-negative Functions. https://arxiv.org/abs/2007.03926

Alessandro Rudi, Ulysse Marteau-Ferey, Francis Bach. Finding Global Minima via Kernel Approximations. https://arxiv.org/abs/2012.11978

Alessandro Rudi, Carlo Ciliberto. PSD Representations for Effective Probability Models. https://arxiv.org/pdf/2106.16116.pdf