Session of 3 January 2022

Session organized by Emilie Lebarbier and Umut Şimşekli.

Location: IHP, Amphithéâtre Darboux


14:00: Adeline Fermanian (Mines ParisTech)

Title: Framing RNN as a kernel method: A neural ODE approach

Abstract: Building on the interpretation of a recurrent neural network (RNN) as a continuous-time neural differential equation, we show, under appropriate conditions, that the solution of an RNN can be viewed as a linear function of a specific feature set of the input sequence, known as the signature. This connection allows us to frame an RNN as a kernel method in a suitable reproducing kernel Hilbert space. As a consequence, we obtain theoretical guarantees on generalization and stability for a large class of recurrent networks. Our results are illustrated on simulated datasets.
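
As a rough illustration of the signature features the talk builds on (a sketch written for this page, not code from the talk; the function name and the left-point Riemann approximation of the iterated integrals are choices made here), the depth-2 signature of a discrete d-dimensional path can be computed as follows:

    import numpy as np

    def signature_depth2(path):
        """Depth-2 signature of a discrete path of shape (T, d).

        Level 1: total increments S^(i) = x_T^i - x_0^i.
        Level 2: iterated integrals S^(i,j) of (x_t^i - x_0^i) dx_t^j,
        approximated by a left-point Riemann sum.
        """
        dx = np.diff(path, axis=0)        # (T-1, d) increments
        level1 = dx.sum(axis=0)           # (d,) first-level terms
        centered = path[:-1] - path[0]    # x_t - x_0 at left endpoints
        level2 = centered.T @ dx          # (d, d) second-level terms
        return np.concatenate([level1, level2.ravel()])

    # The talk's viewpoint: a trained RNN's output is (approximately) a
    # linear readout <w, signature(x)> of such features of the input path.
    x = np.cumsum(np.random.randn(50, 3), axis=0)   # toy 3-channel sequence
    features = signature_depth2(x)                  # 3 + 9 = 12 features

Truncating the signature at a higher depth enlarges the feature set, which is what makes the associated kernel method expressive.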


15:00: Alain Célisse (Université Paris 1)

Title: Early stopping rule for kernelized spectral filter algorithms

Abstract: We investigate the construction of early stopping rules for the nonparametric regression problem. We introduce the (smoothed) discrepancy principle applied to kernelized spectral filter algorithms, including Tikhonov regularization and gradient descent. Our main theoretical bounds are oracle inequalities covering both the fixed and random design settings. The classical discrepancy principle is statistically adaptive to the slow rates arising in the hard learning scenario, while the smoothed discrepancy principle is adaptive over ranges of faster rates, corresponding to higher smoothness parameters.
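
To make the discrepancy principle concrete (a minimal sketch under assumptions made here, not the tuned procedure from the talk: the threshold tau and the plain, unsmoothed residual are illustrative), gradient descent on kernel least squares can be stopped at the first iteration where the empirical residual reaches the noise level:

    import numpy as np

    def gd_discrepancy_stop(K, y, sigma, tau=1.0, max_iter=1000):
        """Functional gradient descent f <- f + eta * K (y - f) on kernel
        least squares, stopped by the discrepancy principle: halt as soon
        as the empirical residual norm ||y - f_t||_n <= tau * sigma.
        K: (n, n) kernel Gram matrix; y: (n,) responses; sigma: noise level.
        """
        n = len(y)
        eta = 1.0 / np.linalg.eigvalsh(K).max()   # step size ensuring contraction
        alpha = np.zeros(n)                       # current iterate, f_t = K @ alpha
        for t in range(max_iter):
            f = K @ alpha
            residual = np.sqrt(np.mean((y - f) ** 2))
            if residual <= tau * sigma:           # discrepancy principle
                return alpha, t
            alpha = alpha + eta * (y - f)         # equivalent to f <- f + eta K (y - f)
        return alpha, max_iter

The appeal of the rule is computational as much as statistical: iterations stop as soon as the fit matches the noise level, instead of computing a full regularization path and selecting afterwards.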


16:00: Francis Bach (INRIA-SIERRA)

Title: Statistics, Machine Learning, and Optimization with Kernel Sums-of-Squares

Abstract: In this talk, I will present recent work on representing non-negative functions with infinite-dimensional sums of squares, with applications to non-convex optimization, optimal transport, optimal control, Bayesian inference, and shape-constrained optimization.
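
As a minimal sketch of the underlying representation (an illustration written for this page, not the talk's implementation; the Gaussian kernel, the anchor points, and the random PSD matrix below are arbitrary choices), a kernel sum-of-squares model f(x) = phi(x)^T A phi(x) with A positive semidefinite is non-negative by construction:

    import numpy as np

    def gaussian_k(X, Y, bw=1.0):
        """Gaussian kernel matrix between row sets X and Y."""
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * bw ** 2))

    rng = np.random.default_rng(0)
    anchors = rng.uniform(-1, 1, size=(20, 1))  # n = 20 anchor points in 1D
    B = rng.standard_normal((20, 5))
    A = B @ B.T                                 # PSD by construction

    def f(x):
        # phi(x) = (k(x, x_1), ..., k(x, x_n)): kernel features at the anchors
        phi = gaussian_k(np.atleast_2d(x), anchors)
        return np.einsum('mi,ij,mj->m', phi, A, phi)  # phi A phi^T >= 0

    xs = np.linspace(-1, 1, 5).reshape(-1, 1)
    assert (f(xs) >= 0).all()                   # non-negative everywhere

Optimizing over the PSD matrix A, rather than over the function values directly, is what turns pointwise non-negativity constraints into tractable convex ones.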