Deep Generative Models

Learning a generative model, i.e., the underlying data-generating distribution, from large amounts of data is one of the fundamental tasks in machine learning and statistics. Recent progress in deep generative models has provided novel techniques for unsupervised and semi-supervised learning, with broad applications ranging from image synthesis, semantic image editing, and image-to-image translation to low-level image processing. However, the statistical understanding of deep generative models is still lacking, e.g., why the logD trick works well in training generative adversarial networks (GANs).

Portrait images generated by VGrow

The training data set comprises about 10,000 images collected from the Wiki-Art project.

We introduce a general framework, variational gradient flow (VGrow), for learning a deep generative model that samples from the target distribution by combining the strengths of variational gradient flows on probability spaces, particle optimization, and deep neural networks. This work was published at the International Conference on Machine Learning (ICML), 2019.
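
A minimal sketch of the particle-optimization idea is given below (the two-dimensional toy target, network size, step size, and variable names are illustrative assumptions rather than the released code, and only the KL member of the f-divergence family is shown): each round fits a logistic density-ratio estimator to distinguish the current particles from target samples, then moves the particles along the estimated gradient-flow direction.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy target distribution p: a 2-D Gaussian centred at (2, 2).
def sample_target(n):
    return torch.randn(n, 2) + 2.0

# Logistic density-ratio network: after training, its logit approximates
# log q(x)/p(x), where q is the current particle distribution.
ratio_net = nn.Sequential(nn.Linear(2, 64), nn.ReLU(), nn.Linear(64, 1))
optimizer = torch.optim.Adam(ratio_net.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

particles = torch.randn(512, 2)   # initial particles, distributed as q_0
step_size = 0.1

for outer in range(200):
    # 1) Fit the classifier: particles are labelled 1, target samples 0,
    #    so the optimal logit equals log q(x)/p(x).
    for _ in range(5):
        batch = torch.cat([particles, sample_target(512)])
        labels = torch.cat([torch.ones(512, 1), torch.zeros(512, 1)])
        loss = bce(ratio_net(batch), labels)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    # 2) Move particles along the estimated gradient-flow direction.
    #    For the KL divergence this is v(x) = -grad_x log(q(x)/p(x)),
    #    approximated here by the gradient of the classifier's logit.
    x = particles.clone().requires_grad_(True)
    grad, = torch.autograd.grad(ratio_net(x).sum(), x)
    particles = (particles - step_size * grad).detach()

print("particle mean (should approach (2, 2)):", particles.mean(0))
```

In the full method, the ratio estimator is a deep network trained on image data, general f-divergences are supported, and a generator network is further fitted to the evolved particles so that new samples can be drawn after training.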

  • The proposed framework minimizes the f-divergence between the evolving particle distribution and the target distribution.

  • We prove that the particles driven by VGrow converge asymptotically to the target distribution.

  • Connections of the proposed VGrow method with other popular methods, such as VAE, GAN and flow-based methods, are established within this framework, yielding new insights into deep generative learning.

  • We have also discovered a new f-divergence, named the “logD” divergence, which serves as the objective function of the logD-trick GAN (see the sketch below).
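
As a quick sketch of where such a divergence comes from (with one possible normalization; the exact definition used in the paper may differ), plugging the optimal discriminator D*(x) = p(x)/(p(x)+q(x)) into the non-saturating, logD-trick generator loss yields an f-divergence between the generator distribution q and the data distribution p, up to an additive constant:

```latex
% Sketch: the logD-trick generator loss as an f-divergence (up to a constant).
% p = data density, q = generator density, D^*(x) = p(x) / (p(x) + q(x)).
\begin{align*}
\mathbb{E}_{x \sim q}\!\left[-\log D^*(x)\right]
  &= \mathbb{E}_{x \sim q}\!\left[\log \frac{p(x) + q(x)}{p(x)}\right] \\
  &= \int p(x)\, f\!\left(\frac{q(x)}{p(x)}\right) \mathrm{d}x \;+\; \log 2,
  \qquad f(u) = u \log \frac{1 + u}{2},
\end{align*}
% where f is convex with f(1) = 0, so the integral on the right-hand side
% is a valid f-divergence between q and p.
```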

Progressive training curves on CIFAR10

Progressive training curves on CelebA

Progressive training of VGrow using different divergences on CIFAR10

Progressive training of VGrow using different divergences on CelebA

High-resolution images

Anime generation

References and Software

  • Yuan Gao, Yuling Jiao, Yang Wang, Yao Wang, Can Yang, Shunkang Zhang. Deep Generative Learning via Variational Gradient Flow. International Conference on Machine Learning (ICML), 2019. [ICML link][Software][Demo_code]

  • Gefei Wang, Yuling Jiao, Qian Xu, Yang Wang, Can Yang. Deep Generative Learning via Schrödinger Bridge. International Conference on Machine Learning (ICML), PMLR 139:10794-10804, 2021. [ICML][Arxiv][DGLSB Software]