Upcoming Seminar Presentations
All seminars are on Tuesdays [ 8:30 am PT ] = [ 11:30 am ET ] = [ 3:30 pm London ] = [ 4:30 pm Paris ] = [ 12:30 am Beijing, +1 day ]
Subscribe to our mailing list and calendar for an up-to-date schedule!
Tuesday, March 3, 2026
Speaker: Pierre Del Moral (INRIA Bordeaux) [Zoom Link]
Title: On the Kantorovich contraction of Markov semigroups
Abstract: We present a novel operator-theoretic framework to study the contraction properties of Markov semigroups with respect to a general class of Kantorovich semi-distances, which notably includes Wasserstein distances. This rather simple contraction cost framework combines standard Lyapunov techniques with local contraction conditions. Our results apply to both discrete-time and continuous-time Markov semigroups, and we illustrate their wide applicability in the context of (i) Markov transitions on models with boundary states, including bounded domains with entrance boundaries, (ii) operator products of a Markov kernel and its adjoint, including two-block-type Gibbs samplers, (iii) iterated random functions, and (iv) diffusion models, including overdamped Langevin diffusions with convex-at-infinity potentials. Joint work with M. Gerber (Bristol Univ.)
Links: YouTube, Slides
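For context, the combination of a local contraction condition with a Lyapunov technique mentioned in the abstract can be sketched in the following standard form (a generic illustration only, not the paper's actual statement; here P is a Markov kernel, W a Kantorovich/Wasserstein semi-distance, and V a Lyapunov function):

```latex
% Local contraction: P shrinks the semi-distance W between two
% distributions mu and nu by a factor rho < 1
\mathcal{W}(\mu P, \nu P) \;\le\; \rho\, \mathcal{W}(\mu, \nu),
\qquad 0 \le \rho < 1 .

% Foster-Lyapunov drift condition: V decreases geometrically under P
% up to a constant, controlling excursions away from the center of
% the state space
P V \;\le\; \lambda V + b, \qquad \lambda \in (0,1),\; b < \infty .
```

Together, conditions of this type typically yield geometric convergence of the semigroup toward its invariant measure in a V-weighted distance.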
Tuesday, March 10, 2026
Speaker: Luhuan Wu (Flatiron Institute, Johns Hopkins University) [Zoom Link]
Title: Reverse Diffusion Sequential Monte Carlo Samplers
Abstract: Diffusion models have emerged as a powerful paradigm for generative modeling. In this talk, we explore their use as annealing paths for sampling from unnormalized target distributions. Building on prior work, we first present a unifying framework that leverages Monte Carlo methods to estimate score functions and simulate diffusion-based sampling trajectories. However, such approaches can suffer from accumulated bias due to time discretization and imperfect score estimation.
To address these challenges, we introduce a principled Sequential Monte Carlo (SMC) framework that formalizes diffusion-based samplers as proposal mechanisms while systematically correcting their biases. The key idea is to construct informative intermediate target distributions that progressively guide particles toward the final distribution of interest. Although the ideal targets are intractable, we derive exact approximations using quantities already available from the score-based proposal, requiring no extra inference overhead. The resulting method, Reverse Diffusion Sequential Monte Carlo, enables consistent sampling and unbiased estimation of the target normalization constant. We demonstrate our method on a range of synthetic targets and Bayesian regression tasks.
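The abstract's specific diffusion-path construction is not reproduced here, but the SMC mechanics it builds on (intermediate targets guiding particles toward the final distribution, bias correction by weighting and resampling, and an estimate of the normalization constant) can be illustrated with a minimal annealed SMC sampler on a tempering path. Everything below is an illustrative sketch with an assumed toy Gaussian target; the function names and tuning constants are not from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

def logsumexp(a):
    m = a.max()
    return m + np.log(np.exp(a - m).sum())

def log_target(x):
    # Unnormalized target: 2 * N(x; 3, 0.5^2), so the true log normalizer is log 2
    return np.log(2.0) - 0.5 * ((x - 3.0) / 0.5) ** 2 - np.log(0.5 * np.sqrt(2.0 * np.pi))

def log_init(x):
    # Tractable initial distribution: standard normal
    return -0.5 * x ** 2 - 0.5 * np.log(2.0 * np.pi)

def annealed_smc(n_particles=2000, n_steps=50):
    """Tempered SMC from log_init to log_target; returns particles and a log-Z estimate."""
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.standard_normal(n_particles)  # exact draws from the initial distribution
    log_z = 0.0
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Incremental importance weights for moving one step along the tempering path
        log_w = (b - b_prev) * (log_target(x) - log_init(x))
        log_z += logsumexp(log_w) - np.log(n_particles)
        w = np.exp(log_w - logsumexp(log_w))
        # Multinomial resampling corrects weight degeneracy
        x = x[rng.choice(n_particles, size=n_particles, p=w)]
        # One random-walk Metropolis move targeting the current tempered density
        log_pi = lambda y: (1.0 - b) * log_init(y) + b * log_target(y)
        prop = x + 0.5 * rng.standard_normal(n_particles)
        accept = np.log(rng.random(n_particles)) < log_pi(prop) - log_pi(x)
        x = np.where(accept, prop, x)
    return x, log_z
```

With a few thousand particles this recovers the target mean and log-normalizer to within Monte Carlo error. The consistency and unbiased normalizing-constant estimation claimed in the abstract are properties of the SMC weighting/resampling scheme itself; the talk's contribution lies in building the intermediate targets from the reverse diffusion rather than from a tempering path as above.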