Upcoming Seminar Presentations
All seminars are on Tuesdays [ 8:30 am PT ] = [ 11:30 am ET ] = [ 4:30 pm London ] = [ 5:30 pm Paris ] = [ 11:30 pm Beijing ]
Subscribe to our mailing list and calendar for an up-to-date schedule!
In the first part, I introduce Posterior Mean Matching (PMM), which derives generative modeling directly from online Bayesian inference. PMM translates conjugate Bayesian updates into iterative refinement rules for sampling, yielding a family of expressive generative models for real-valued, count, and discrete data. This perspective reveals how classical Bayesian machinery can underlie state-of-the-art generative processes across diverse data modalities.
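To make the conjugate-update idea concrete, here is a toy sketch of how a Normal-Normal posterior update can act as an iterative refinement rule for sampling. This is an illustration of the flavor of PMM, not the paper's algorithm: `predict_x` is a hypothetical stand-in for a learned network, and all numbers are illustrative.

```python
import numpy as np

# Model: x ~ N(mu0, tau0^2) prior and noisy observations y ~ N(x, sigma^2),
# so each online Bayesian update has a closed form.

def conjugate_update(mu, tau2, y, sigma2):
    """One online Normal-Normal update; returns the new posterior mean/variance."""
    tau2_new = 1.0 / (1.0 / tau2 + 1.0 / sigma2)
    mu_new = tau2_new * (mu / tau2 + y / sigma2)
    return mu_new, tau2_new

def predict_x(mu):
    # Hypothetical stand-in for a learned network that predicts the clean
    # sample from the current posterior mean; here it just nudges toward a
    # fixed target so the loop has something to converge to.
    target = 3.0
    return mu + 0.5 * (target - mu)

# Generation as iterative refinement: the model's guess of the clean sample
# drives the next simulated noisy observation, and the conjugate update
# folds that observation into the running posterior mean.
rng = np.random.default_rng(0)
mu, tau2, sigma2 = 0.0, 1.0, 0.25
for _ in range(50):
    x_hat = predict_x(mu)                          # network's current guess
    y = x_hat + rng.normal(0.0, np.sqrt(sigma2))   # simulated noisy observation
    mu, tau2 = conjugate_update(mu, tau2, y, sigma2)

print(f"final posterior mean: {mu:.3f}")  # drifts toward the target 3.0
```

Under the abstract's framing, the same template would swap in other conjugate pairs (e.g., Gamma-Poisson for count data, Dirichlet-Categorical for discrete data) to cover the other modalities.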
In the second part, I move in the opposite direction with Structured Flow Autoencoders (SFA), which use powerful generative flows to enhance Bayesian inference. SFA augments graphical models with conditional flow-based likelihoods and introduces a flow-matching objective that explicitly accounts for latent variables, enabling simultaneous learning of structured posteriors and high-quality generators. Applied to images, videos, and single-cell RNA-seq data, SFA delivers both stronger latent representations and higher-quality samples.
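As a rough illustration of the second ingredient, the sketch below shows a generic latent-conditional flow-matching loss: an amortized encoder produces a latent z, and a velocity network is regressed onto the velocity of a straight-line interpolant while conditioning on z. This is a minimal sketch of the general recipe, not SFA's actual objective; `Encoder`, `VelocityNet`, and the unit-weight KL term are illustrative assumptions.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Amortized posterior q(z | x) with a diagonal-Gaussian output."""
    def __init__(self, x_dim, z_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, 64), nn.ReLU(),
                                 nn.Linear(64, 2 * z_dim))
    def forward(self, x):
        mu, log_var = self.net(x).chunk(2, dim=-1)
        return mu, log_var

class VelocityNet(nn.Module):
    """Velocity field v(x_t, z, t) conditioned on the latent variable."""
    def __init__(self, x_dim, z_dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim + z_dim + 1, 128), nn.ReLU(),
                                 nn.Linear(128, x_dim))
    def forward(self, x_t, z, t):
        return self.net(torch.cat([x_t, z, t], dim=-1))

def latent_flow_matching_loss(x1, encoder, vnet):
    """Regress the velocity net onto the straight-line interpolant's velocity."""
    mu, log_var = encoder(x1)
    z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()  # reparameterized z
    x0 = torch.randn_like(x1)              # noise endpoint of the path
    t = torch.rand(x1.shape[0], 1)         # random interpolation times
    x_t = (1 - t) * x0 + t * x1            # straight-line path from x0 to x1
    target_v = x1 - x0                     # the path's constant velocity
    fm = ((vnet(x_t, z, t) - target_v) ** 2).mean()
    kl = -0.5 * (1 + log_var - mu ** 2 - log_var.exp()).mean()  # vs. N(0, I)
    return fm + kl

enc, vnet = Encoder(x_dim=8, z_dim=4), VelocityNet(x_dim=8, z_dim=4)
loss = latent_flow_matching_loss(torch.randn(32, 8), enc, vnet)
loss.backward()  # one training step's gradients for both networks
```

Optimizing both terms jointly is what lets the latent posterior and the flow-based generator improve together, which is the pairing the abstract highlights.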
Together, these works show that combining Bayesian inference’s structural understanding with generative modeling’s expressive sampling yields models that are more interpretable, more adaptable across data types, and capable of high-quality generation.
This is joint work with Sebastian Salazar, Michal Kucer, Emily Casleton, David Blei, Yidan Xu, and Long Nguyen.