Location: IHP, Yvonne Choquet-Bruhat lecture hall (second floor of the Perrin building)
14:00: Marylou Gabrié (LPENS - ENS)
Title: Transporting measures for sampling: parametric and non-parametric approaches inspired by generative modelling
Abstract: Generative models and statistical mechanics have a long history of cross-fertilization. Recently, it has been shown that generative models, such as normalizing flows, can assist the sampling of metastable systems. This remarkable ability comes from the high expressivity of generative models, which can approximate complex distributions while remaining tractable. However, the training accuracy of generative models deteriorates as the dimension and complexity of the target measure increase. Inspired by recent progress in generative modelling based on stochastic processes, non-parametric sampling algorithms can also be derived for metastable systems. Are non-parametric methods more scalable?
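A minimal sketch of the flow-assisted sampling idea mentioned in the abstract (not the speaker's algorithm): an independence Metropolis-Hastings sampler whose proposal plays the role of a generative model with tractable density. A fixed broad Gaussian stands in for a trained normalizing flow (in practice one would call something like flow.sample / flow.log_prob, both hypothetical names here), and the target is a toy bimodal density imitating metastability.

```python
# A toy bimodal ("metastable") target and an independence Metropolis-Hastings
# sampler whose proposal mimics a generative model with tractable density.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Mixture of two well-separated Gaussians: a caricature of metastability.
    return np.logaddexp(-0.5 * np.sum((x - 3.0) ** 2),
                        -0.5 * np.sum((x + 3.0) ** 2))

def propose(d):
    # Stand-in for flow.sample(): a broad Gaussian covering both modes.
    x = 4.0 * rng.standard_normal(d)
    # Stand-in for flow.log_prob(x), up to an additive constant.
    log_q = -0.5 * np.sum(x ** 2) / 16.0
    return x, log_q

def flow_assisted_mh(n_steps=5000, d=2):
    x = np.zeros(d)
    log_q_x = -0.5 * np.sum(x ** 2) / 16.0
    samples = []
    for _ in range(n_steps):
        y, log_q_y = propose(d)
        # Independence MH acceptance ratio: pi(y) q(x) / (pi(x) q(y)).
        log_alpha = log_target(y) - log_target(x) + log_q_x - log_q_y
        if np.log(rng.uniform()) < log_alpha:
            x, log_q_x = y, log_q_y
        samples.append(x.copy())
    return np.array(samples)

print(flow_assisted_mh()[::1000])  # samples hop between the +3 and -3 modes
```

Because the proposal is global rather than a local random walk, the chain can jump between the two modes in a single step, which is precisely what makes expressive flow proposals attractive for metastable targets.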
15:00: Francis Bach (INRIA)
Title: Physics-informed kernel learning
Abstract: Physics-informed machine learning typically integrates physical priors into the learning process by minimizing a loss function that includes both a data-driven term and a partial differential equation (PDE) regularization. Building on the formulation of the problem as a kernel regression task, we use Fourier methods to approximate the associated kernel, and propose a tractable estimator that minimizes the physics-informed risk function. We refer to this approach as physics-informed kernel learning (PIKL). This framework provides theoretical guarantees, enabling the quantification of the physical prior's impact on convergence speed. We demonstrate the numerical performance of the PIKL estimator through simulations, both in the context of hybrid modeling and in solving PDEs. In particular, we show that PIKL can outperform physics-informed neural networks in terms of both accuracy and computation time. Additionally, we identify cases where PIKL surpasses traditional PDE solvers, particularly in scenarios with noisy boundary conditions (joint work with Nathan Doumèche, Gérard Biau, and Claire Boyer, https://arxiv.org/abs/2409.13786).
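A minimal one-dimensional sketch of the kernel/Fourier idea described in the abstract (the toy setup and all choices below are mine, not the PIKL implementation of the paper): the estimator is expanded in a truncated Fourier basis, and the physics-informed risk, data fit plus a penalty on the residual of a toy equation f'' + f = 0 at collocation points plus ridge regularization, reduces to a small linear system in the Fourier coefficients.

```python
import numpy as np

rng = np.random.default_rng(1)
K = 10                                    # number of Fourier frequencies
freqs = np.arange(1, K + 1)

def features(x):
    # Truncated real Fourier basis on [-pi, pi]: [1, cos(kx), sin(kx)].
    return np.concatenate([np.ones((len(x), 1)),
                           np.cos(np.outer(x, freqs)),
                           np.sin(np.outer(x, freqs))], axis=1)

def features_dd(x):
    # Second derivative of each basis function.
    return np.concatenate([np.zeros((len(x), 1)),
                           -freqs ** 2 * np.cos(np.outer(x, freqs)),
                           -freqs ** 2 * np.sin(np.outer(x, freqs))], axis=1)

# Noisy observations of sin(x), plus collocation points for the physics term.
x_data = rng.uniform(-np.pi, np.pi, 30)
y_data = np.sin(x_data) + 0.1 * rng.standard_normal(30)
z_coll = np.linspace(-np.pi, np.pi, 100)

Phi = features(x_data)
R = features_dd(z_coll) + features(z_coll)   # residual of f'' + f = 0
lam_pde, lam = 1e-2, 1e-6

# Physics-informed risk: ||Phi c - y||^2 + lam_pde ||R c||^2 + lam ||c||^2.
A = Phi.T @ Phi + lam_pde * R.T @ R + lam * np.eye(Phi.shape[1])
c = np.linalg.solve(A, Phi.T @ y_data)

x_test = np.linspace(-np.pi, np.pi, 5)
print(features(x_test) @ c)                  # approximately sin(x_test)
```

The PDE penalty favours the frequency-1 modes that solve f'' + f = 0 exactly, so the fit to the noisy sin(x) data is stabilized even with few observations; this is the kind of effect of the physical prior on estimation that the talk quantifies theoretically.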
16:00: Yazid Janati (CMAP - Polytechnique)
Title: Variational Diffusion Posterior Sampling with Midpoint Guidance
Abstract: Diffusion models have recently shown considerable potential in solving Bayesian inverse problems when used as priors. However, sampling from the resulting denoising posterior distributions remains a challenge as it involves intractable terms. To tackle this issue, state-of-the-art approaches formulate the problem as that of sampling from a surrogate diffusion model targeting the posterior and decompose its scores into two terms: the prior score and an intractable guidance term. While the former is replaced by the pre-trained score of the considered diffusion model, the guidance term has to be estimated. In this paper, we propose a novel approach that utilises a decomposition of the transitions which, in contrast to previous methods, allows a trade-off between the complexity of the intractable guidance term and that of the prior transitions. We also show how the proposed algorithm can be extended to handle the sampling of arbitrary unnormalised densities. We validate the proposed approach through extensive experiments on linear and nonlinear inverse problems, including challenging cases with latent diffusion models as priors.
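A minimal sketch of the score decomposition mentioned in the abstract (not the paper's midpoint-guidance algorithm): a reverse diffusion in which the posterior score is approximated as the prior score plus a likelihood guidance term evaluated at the Tweedie denoised estimate. To keep everything closed form and stable, the prior is taken to be a standard Gaussian, the inverse problem is linear, and (an assumption made here) the remaining denoising uncertainty is folded into the likelihood variance.

```python
import numpy as np

rng = np.random.default_rng(2)
d, m = 4, 2
A = rng.standard_normal((m, d))            # linear observation operator
x_true = rng.standard_normal(d)
sigma_y = 0.1
y = A @ x_true + sigma_y * rng.standard_normal(m)

sigmas = np.geomspace(10.0, 1e-2, 200)     # decreasing VE noise schedule

def prior_score(x, sigma):
    # Exact score of x_t = x_0 + sigma * eps when the prior is x_0 ~ N(0, I);
    # in practice this is where a pre-trained diffusion model would enter.
    return -x / (1.0 + sigma ** 2)

x = sigmas[0] * rng.standard_normal(d)     # initialise from the noisy prior
for i in range(len(sigmas) - 1):
    s, s_next = sigmas[i], sigmas[i + 1]
    score_p = prior_score(x, s)
    # Tweedie denoised estimate of x_0 given x_t.
    x0_hat = x + s ** 2 * score_p
    coeff = 1.0 / (1.0 + s ** 2)           # Jacobian of x0_hat w.r.t. x_t (exact here)
    # Guidance term: gradient of log N(y; A x0_hat, guidance_var I) w.r.t. x_t,
    # with the remaining denoising uncertainty folded into the variance.
    guidance_var = sigma_y ** 2 + s ** 2 / (1.0 + s ** 2)
    guidance = coeff * A.T @ (y - A @ x0_hat) / guidance_var
    score_post = score_p + guidance        # prior score + guidance term
    step = s ** 2 - s_next ** 2
    x = x + step * score_post + np.sqrt(step) * rng.standard_normal(d)

print("x_true :", x_true)
print("sample :", x)
```

With a real pre-trained diffusion prior the guidance term is exactly the intractable quantity that such methods must approximate; the talk's contribution concerns how the transitions are decomposed so that this approximation burden can be traded off against the complexity of the prior transitions.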