Location: IHP, Yvonne Choquet-Bruhat lecture hall (second floor of the Perrin building)
14:00: Thibaut Germain (CMAP, École polytechnique)
Title: A Spectral-Grassmann Wasserstein metric for operator representations of dynamical systems
Abstract: The geometry of dynamical systems estimated from trajectory data is a major challenge for machine learning applications. Koopman and transfer operators provide a linear representation of nonlinear dynamics through their spectral decomposition, offering a natural framework for comparison. We propose a novel approach representing each system as a distribution of its joint operator eigenvalues and spectral projectors, and defining a metric between systems leveraging optimal transport. The proposed metric is invariant to the sampling frequency of trajectories. It is also computationally efficient, supported by finite-sample convergence guarantees, and enables the computation of Fréchet means, providing interpolation between dynamical systems. Experiments on simulated and real-world datasets show that our approach consistently outperforms standard operator-based distances in machine learning applications, including dimensionality reduction and classification, and provides meaningful interpolation between dynamical systems.
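The abstract's idea of comparing systems through distributions of (eigenvalue, spectral projector) pairs via optimal transport can be illustrated with a minimal sketch. All names, the cost function, and the uniform-weight assignment below are assumptions for illustration, not the authors' actual definitions:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def spectral_atoms(K):
    """Eigenvalues and rank-one spectral projectors of a diagonalizable K."""
    vals, R = np.linalg.eig(K)           # right eigenvectors as columns
    L = np.linalg.inv(R)                 # rows are the left eigenvectors
    projs = [np.outer(R[:, i], L[i, :]) for i in range(len(vals))]
    return vals, projs

def spectral_ot_distance(K1, K2):
    """Optimal matching cost between the two spectral measures
    (with uniform weights, OT reduces to an assignment problem)."""
    v1, P1 = spectral_atoms(K1)
    v2, P2 = spectral_atoms(K2)
    C = np.zeros((len(v1), len(v2)))
    for i in range(len(v1)):
        for j in range(len(v2)):
            # toy cost: squared eigenvalue gap + squared projector gap
            C[i, j] = abs(v1[i] - v2[j]) ** 2 + \
                      np.linalg.norm(P1[i] - P2[j]) ** 2
    row, col = linear_sum_assignment(C)
    return np.sqrt(C[row, col].sum() / len(v1))

A = np.diag([0.9, 0.5])
B = np.diag([0.8, 0.5])
print(spectral_ot_distance(A, B))
```

The sketch pairs each spectral atom of one system with one of the other so that the total joint eigenvalue/projector discrepancy is minimal; the talk's actual metric (including its sampling-frequency invariance) is more refined than this toy cost.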
15:00: Vladimir Kostic (Italian Institute of Technology, Genoa)
Title: Forecasting distributions: from kernel methods to neural networks
Abstract: In this talk we will present recent advances on learning rates for inferring the flow of distributions from observations of an ergodic stochastic system at equilibrium. We will treat the problem with both kernel methods and neural representations, discussing the pros and cons of each approach. On the theoretical side, we focus on how linear-algebraic tools unlock the development of new uniform-in-time high-probability bounds; the talk will also cover the practical impact in applications related to molecular dynamics.
16:00: Katia Meziani (CEREMADE, Université Paris Dauphine - PSL)
Title: Marginal Contrastive Discrimination
Abstract: Conditional density estimation is a central problem in statistics and machine learning, particularly challenging when the dimensionality of the conditioning set is high. To address this issue, we propose an innovative method inspired by noise-contrastive methods. This approach reformulates the conditional density estimation problem into two simpler sub-problems: density estimation and binary classification. Our method, called Marginal Contrastive Discrimination (MCD), demonstrates performance that is comparable to, and sometimes surpasses, state-of-the-art conditional density estimation approaches, especially in scenarios where the dimensionality of the conditioning set is high.
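The two sub-problems in the abstract can be sketched on a toy example: estimate the marginal p(y), then train a binary classifier to distinguish joint pairs (x, y) from independent pairs, so that the classifier's odds recover the density ratio p(y|x)/p(y). The feature map, model choices, and names below are assumptions for illustration, not the MCD method itself:

```python
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)
y = x + 0.5 * rng.normal(size=n)         # toy model: Y | X=x ~ N(x, 0.25)

# Sub-problem 1: marginal density of Y (here a simple Gaussian fit).
mu, sigma = y.mean(), y.std()

# Sub-problem 2: contrastive classification of joint vs shuffled pairs.
y_shuffled = rng.permutation(y)          # breaks dependence -> product of marginals
feats = lambda a, b: np.column_stack([a, b, a * b, a**2, b**2])
clf = LogisticRegression(max_iter=1000).fit(
    np.vstack([feats(x, y), feats(x, y_shuffled)]),
    np.concatenate([np.ones(n), np.zeros(n)]),
)

def cond_density(y_query, x_query):
    """p(y|x) ~= p(y) * classifier odds (the density-ratio trick)."""
    p = clf.predict_proba(feats(np.atleast_1d(x_query),
                                np.atleast_1d(y_query)))[:, 1]
    return norm.pdf(y_query, mu, sigma) * p / (1 - p)

# Near the true conditional mode y = x, the estimate should be large.
print(cond_density(np.array([1.0]), np.array([1.0])))
```

With quadratic features the classifier is well specified for this jointly Gaussian toy pair, so the estimate at (1, 1) lands near the true value N(1; 1, 0.25) ≈ 0.80; the actual MCD method targets the much harder high-dimensional conditioning regime.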