Spring 2025

Spring 2025 YouTube playlist

The focus of the talk will be conceptual rather than technical, with an eye towards enabling intuition for i) which high-level aspects of the target distribution influence the convergence behaviour of the Random Walk Metropolis (RWM) algorithm, and ii) which concrete properties must be verified in order to obtain a rigorous proof. A key element will be the impact of tail behaviour and measure concentration on the convergence profile of the algorithm across different time-scales.

No prior knowledge of RWM is required of the audience.

(joint work with C. Andrieu, A. Lee and A. Wang)
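For readers new to the algorithm, a minimal NumPy sketch of Random Walk Metropolis with symmetric Gaussian proposals follows; the target, dimension and step size are illustrative choices, not part of the talk.

import numpy as np

def rwm(log_target, x0, n_steps, step_size, rng=None):
    """Random Walk Metropolis with symmetric Gaussian proposals (sketch)."""
    rng = rng or np.random.default_rng()
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    log_p = log_target(x)
    chain = np.empty((n_steps, x.size))
    accepted = 0
    for t in range(n_steps):
        # Propose a symmetric Gaussian perturbation of the current state.
        y = x + step_size * rng.standard_normal(x.size)
        log_q = log_target(y)
        # Metropolis correction: accept with probability min(1, pi(y)/pi(x)).
        if np.log(rng.uniform()) < log_q - log_p:
            x, log_p = y, log_q
            accepted += 1
        chain[t] = x
    return chain, accepted / n_steps

# Illustrative target: a standard Gaussian in 10 dimensions. Heavier tails or
# stronger concentration change how the chain mixes, which is the phenomenon
# the talk analyses across time-scales.
chain, acc_rate = rwm(lambda x: -0.5 * np.sum(x**2), np.zeros(10), 5000, 0.7)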

In the first part of this talk, I will present new KL convergence guarantees under minimal assumptions for both continuous and discrete score-based diffusion models. Specifically, I focus on discretizations derived from the Ornstein-Uhlenbeck semigroup and its kinetic variant, and show that sharp and explicit KL bounds can be obtained for any data distribution with finite Fisher information, thereby avoiding early stopping, smoothing, or strong regularity conditions.
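For orientation, here is a plain Euler-Maruyama sketch of sampling via the time-reversed Ornstein-Uhlenbeck process, assuming oracle access to the score; the talk's bounds concern sharper, semigroup-based discretizations, and all names below are illustrative.

import numpy as np

def reverse_ou_sampler(score, dim, T=5.0, n_steps=500, rng=None):
    """Euler-Maruyama discretization of the time-reversed OU process.

    Forward noising: dX_t = -X_t dt + sqrt(2) dB_t, whose marginals converge
    to N(0, I). The reverse dynamics use the score s(x, t) ~ grad log p_t(x):
        dY_s = (Y_s + 2 s(Y_s, T - s)) ds + sqrt(2) dB_s.
    """
    rng = rng or np.random.default_rng()
    h = T / n_steps
    y = rng.standard_normal(dim)          # start from the stationary N(0, I)
    for k in range(n_steps):
        t = T - k * h                     # remaining forward time
        drift = y + 2.0 * score(y, t)
        y = y + h * drift + np.sqrt(2.0 * h) * rng.standard_normal(dim)
    return y

# Toy check: for N(0, sigma^2 I) data the score of p_t is known in closed
# form, so the sampler can be validated exactly.
sigma2 = 4.0
def gaussian_score(y, t):
    var_t = sigma2 * np.exp(-2 * t) + 1 - np.exp(-2 * t)  # OU variance at time t
    return -y / var_t

sample = reverse_ou_sampler(gaussian_score, dim=2)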

In the second part, I will shift to the use of diffusion models for solving inverse problems, such as image reconstruction or source separation. Here, I introduce a novel mixture-based posterior sampling framework that combines diffusion priors with observational data using a principled Gibbs sampling scheme. This approach offers theoretical guarantees, task-agnostic applicability, and robust performance across a wide range of problems, including ten image restoration tasks and musical source separation, without relying on crude approximations or heavy heuristic tuning.
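The abstract does not spell out the construction, but alternating ("split Gibbs") samplers of the following shape are a common way to pair a prior accessed only through a denoiser with a linear observation model y = A x + eps. Everything below, including the Gaussian coupling, the placeholder denoising_sampler, and the parameter rho, is an illustrative assumption rather than the talk's actual mixture-based scheme.

import numpy as np

def split_gibbs_sampler(y, A, sigma, denoising_sampler, rho,
                        n_iters=200, rng=None):
    """Split-Gibbs-style sketch for y = A x + eps, eps ~ N(0, sigma^2 I).

    The posterior is augmented with z, coupled to x by N(x; z, rho^2 I),
    and the sampler alternates two conditionals:
      (i)  z | x : a denoising posterior under the prior, which a diffusion
           model can approximately sample; `denoising_sampler` is a
           placeholder standing in for that step;
      (ii) x | y, z : Gaussian, available in closed form for linear A.
    """
    rng = rng or np.random.default_rng()
    n = A.shape[1]
    x = rng.standard_normal(n)
    # Precision of x | y, z is fixed across iterations for linear A.
    prec = A.T @ A / sigma**2 + np.eye(n) / rho**2
    cov = np.linalg.inv(prec)
    chol = np.linalg.cholesky(cov)
    for _ in range(n_iters):
        z = denoising_sampler(x, rho)                 # (i) prior-side update
        mean = cov @ (A.T @ y / sigma**2 + z / rho**2)
        x = mean + chol @ rng.standard_normal(n)      # (ii) data-side update
    return x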