Group Meeting 2018

This website records our group meeting events since July 2018.

Dec. 19, 2018. Jiaoshun Xiao. Connecting the human genome with 2D face data. Jiaoshun introduced his recent work connecting the human genome with 2D face data.

Dec. 14, 2018. Zhiyuan Yu. Deep learning on point clouds. Zhiyuan introduced PointNet [link] and PointNet++ [link] for 3D classification and segmentation.

Dec. 7, 2018. Shunkang Zhang. Flow-based deep generative models. Shunkang introduced Non-linear Independent Components Estimation (NICE) [link] and its generalizations, such as density estimation using real-valued non-volume preserving (RealNVP) transformations [link]. Interestingly, this line of work is also closely related to Neural Ordinary Differential Equations [link].
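
As a rough illustration of the coupling-layer idea behind NICE and RealNVP, here is a minimal NumPy sketch of a single affine coupling transform and its log-determinant. The `scale_net` and `shift_net` functions stand in for learned networks; they are placeholders of our own, not code from the papers.

```python
import numpy as np

def affine_coupling_forward(x, scale_net, shift_net):
    """One RealNVP-style affine coupling layer (illustrative sketch).

    The first half of the coordinates passes through unchanged and
    parameterizes an element-wise affine map of the second half, so the
    Jacobian is triangular and its log-determinant is cheap to compute.
    """
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    s = scale_net(x1)              # log-scales, same shape as x2
    t = shift_net(x1)              # shifts, same shape as x2
    y2 = x2 * np.exp(s) + t
    y = np.concatenate([x1, y2], axis=-1)
    log_det = s.sum(axis=-1)       # log|det dy/dx| = sum of log-scales
    return y, log_det

# Toy usage with fixed (untrained) placeholder "networks".
rng = np.random.default_rng(0)
W_s, W_t = rng.normal(size=(2, 2)), rng.normal(size=(2, 2))
scale_net = lambda h: np.tanh(h @ W_s)   # keep log-scales bounded
shift_net = lambda h: h @ W_t
x = rng.normal(size=(5, 4))
y, log_det = affine_coupling_forward(x, scale_net, shift_net)
print(y.shape, log_det.shape)            # (5, 4) (5,)
```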

  • Nov. 23, 2018. Yuan Gao. Deep generative models. Yuan discussed deep generative models, including GANs and VAEs. In particular, he connected these two lines of representative work via optimal transport and then discussed recent progress in this field, e.g., DCGAN, WGAN, AAE, and WAE. [Slides].
  • Nov. 22, 2018. Yuling Jiao. GAN and composite functional gradient. Yuling led an informal discussion on training GANs with the composite functional gradient approach [link].
  • Nov. 1, 2018. Jingsi Ming. Deep learning with applications in single-cell data analysis. Jingsi discussed several deep models and their applications in single-cell data analysis, including scVAE [link] and scVI [link]. These methods were compared with classical methods such as ZIFA [link] and ZINB-WaVE [link].
  • Oct. 25, 2018. Weizhi Zhu. Some recent progress on deep learning. Weizhi presented recent progress on deep learning from Prof. Yuan Yao's group, including Robust Estimation and Generative Adversarial Networks [link] and On Breiman's Dilemma in Neural Networks: Phase Transitions of Margin Dynamics [link].
  • September 28, 2018. Jingsi Ming. Linear Response Methods for Accurate Covariance Estimates from Mean Field Variational Bayes. Mean field variational Bayes (MFVB) is a popular posterior approximation method due to its fast runtime on large-scale data sets. However, it is well known that a major failing of MFVB is that it underestimates the uncertainty of model variables. We discussed a method called linear response variational Bayes (LRVB) [link]. When the MFVB posterior approximation is in the exponential family, LRVB gives an analytic form for correcting the posterior covariance from the MFVB approximation (a sketch of the correction formula appears after this list). An in-depth discussion of this topic can be found in the paper "Covariances, Robustness, and Variational Bayes" [link].
  • September 19, 2018. Xiaomeng Wan. Motivated by the papers "Consistent Shape Maps via Semidefinite Programming" [link] and "Near-Optimal Joint Object Matching via Convex Relaxation" [link], we discussed how to formulate the joint object matching problem as an SDP relaxation (a generic form of this relaxation is sketched after this list). This type of technique suggests that it is possible to "build Rome in one day" [link]. A closely related formulation is described in "Low-Rank Doubly Stochastic Matrix Decomposition for Cluster Analysis" [link], and a related technique is discussed in "Regularized Optimal Transport and the Rot Mover's Distance" [link].
  • September 7, 2018. Mingxuan Cai. IGREX: Quantifying the impact of genetically regulated expression on complex traits. Motivated by a Science paper, "Impact of regulatory variation from RNA to protein" [link], we first discussed the role of regulatory variation in complex traits and diseases [link]. Then we presented a statistical method, IGREX, to quantify the impact of genetically regulated expression (GREX) on complex traits. We have applied IGREX to many complex traits, yielding a deeper understanding of GREX.
  • August 16, 2018. Xianghong Hu. Empirical Bayes Matrix Factorization. We discussed a statistical framework for matrix factorization based on empirical Bayes [link], where a key step is solving the empirical Bayes normal means problem [link], with connections to some popular low-rank matrix approximation methods, such as SoftImpute [link], Alternating Least Squares (ALS)-SoftImpute [link], and Penalized Matrix Decomposition [link]; a minimal SoftImpute sketch appears after this list. Interested readers are further referred to Generalized Low-Rank Models [link].
  • August 10, 2018. Youquan Pei. Heterogeneity pursuit and robustness of a general dependence measure. Youquan presented two topics: a community-detection-based approach to heterogeneity pursuit, and the robustness of ball covariance for characterizing general dependence between two random variables, in particular its connection to distance correlation [link]; a short distance correlation sketch appears after this list. See an interesting comment on measuring nonlinear dependence [link] using distance correlation and MIC [link].
  • August 4, 2018. Yuling Jiao. Iterative Regularization for Variational Inference (Part III). We discussed variational inference in deep learning, including the variational autoencoder (VAE) [link] and the generative adversarial network (GAN) [link]. In particular, we discussed Stein Variational Gradient Descent (SVGD) for training GANs, i.e., Stein GAN [link].
  • July 26, 2018. Yuling Jiao and Yuan Gao. Iterative Regularization for Variational Inference (Part II). We discussed Stein Variational Gradient Descent (SVGD) [link], which is also closely related to the composite functional gradient [link]. We illustrated its usage on a mixture model, Bayesian linear regression, and logistic regression [link]; a minimal SVGD sketch appears after this list. A comprehensive review of variational inference can be found here [link].
  • July 23, 2018. Hao Peng. Gaussian processes. Hao discussed Gaussian processes and their connection with linear mixed models.
  • July 16, 2018. Yuling Jiao. Iterative Regularization for Variational Inference (Part I). We first discussed both explicit and implicit regularization in statistical learning, and then briefly went through reproducing kernel Hilbert spaces. These concepts were connected to Stein Variational Gradient Descent [link], a very recent topic in machine learning.
  • July 11, 2018. Chang Su. Statistical methods for pleiotropy mapping and annotation selection in GWAS. We considered integrative analysis of genomic data by leveraging pleiotropy and functional annotations. In particular, three statistical methods were discussed: GPA [link], the latent sparse mixed model (LSMM) [link], and iMap [link].
  • July 4, 2018. Chao Zhou. Relevance vector machine (RVM) and SMART. We discussed the relationship between the relevance vector machine (RVM) and related methods, such as the Bayesian Lasso. In particular, we saw the connection between RVM and SMART, a recently developed method in statistical genetics [link].
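
For the September 28 discussion above, the LRVB correction has roughly the following form (our notation, reconstructed from the paper and worth double-checking against it): with V the MFVB covariance of the mean parameters m, m* the MFVB optimum, and H the Hessian of the expected log joint at m*,

```latex
\widehat{\Sigma}_{\mathrm{LRVB}} = (I - V H)^{-1} V,
\qquad
H = \left.\frac{\partial^{2}\,\mathbb{E}_{q}\!\left[\log p(x,\theta)\right]}{\partial m\,\partial m^{\top}}\right|_{m = m^{*}}.
```

In the exponential-family case mentioned above, the pieces of this expression are available in closed form, which is what makes the correction analytic.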
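
For the September 19 discussion above, a generic form of the semidefinite relaxation for joint object matching is the following (a simplified rendering for illustration, not the exact program of either cited paper). Stacking the unknown pairwise maps into a block matrix X, cycle consistency means X = YY^T for some stacked assignment matrix Y, which forces X to be positive semidefinite with identity diagonal blocks; dropping the combinatorial constraints and keeping the convex ones gives

```latex
\max_{X}\ \langle W, X \rangle
\quad \text{s.t.} \quad
X \succeq 0, \qquad X_{ii} = I \ \ \text{for all } i, \qquad 0 \le X_{ij} \le 1,
```

where W collects the observed (noisy) pairwise matches between objects.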
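
For the August 16 discussion above, the SoftImpute idea can be sketched in a few lines: alternately fill the missing entries with the current low-rank fit and soft-threshold the singular values. This is a minimal NumPy sketch under our own conventions, not the reference implementation.

```python
import numpy as np

def soft_impute(X, lam, n_iters=100):
    """Minimal SoftImpute sketch: low-rank matrix completion by
    iterative soft-thresholded SVD. `X` uses np.nan for missing entries."""
    mask = ~np.isnan(X)                     # observed entries
    Z = np.where(mask, X, 0.0)              # current completed matrix
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s_thr = np.maximum(s - lam, 0.0)    # soft-threshold singular values
        Z_new = (U * s_thr) @ Vt            # low-rank reconstruction
        Z = np.where(mask, X, Z_new)        # keep observed entries fixed
    return Z_new

# Toy usage: a rank-1 matrix with roughly 30% of its entries removed.
rng = np.random.default_rng(1)
A = np.outer(rng.normal(size=8), rng.normal(size=6))
A_obs = A.copy()
A_obs[rng.random(A.shape) < 0.3] = np.nan
A_hat = soft_impute(A_obs, lam=0.1)
print(np.mean((A_hat - A) ** 2))            # reconstruction error
```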
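
For the August 10 discussion above, here is a minimal sketch of the sample distance correlation for univariate data, using double-centered distance matrices; it is our own illustration rather than code from the cited papers. The toy example shows a quadratic dependence that Pearson correlation misses but distance correlation picks up.

```python
import numpy as np

def distance_correlation(x, y):
    """Sample distance correlation between two 1-D samples of equal length."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])                  # pairwise distances for x
    b = np.abs(y[:, None] - y[None, :])                  # pairwise distances for y
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()    # double centering
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2_xy = (A * B).mean()                            # squared distance covariance
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return 0.0 if denom == 0 else np.sqrt(dcov2_xy / denom)

# Nonlinear (quadratic) dependence: Pearson correlation is near zero,
# while distance correlation is clearly positive.
rng = np.random.default_rng(2)
x = rng.normal(size=500)
y = x ** 2 + 0.1 * rng.normal(size=500)
print(np.corrcoef(x, y)[0, 1], distance_correlation(x, y))
```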
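
For the July 26 discussion above, the SVGD update can be written compactly: each particle moves along the average of a kernel-weighted score term and a repulsive kernel-gradient term. Below is a minimal NumPy sketch with an RBF kernel and a 1-D standard normal target; it is a toy illustration of ours, not the authors' code.

```python
import numpy as np

def svgd_step(particles, grad_log_p, bandwidth=1.0, step=0.1):
    """One SVGD update:
    phi(x) = (1/n) sum_j [ k(x_j, x) grad log p(x_j) + grad_{x_j} k(x_j, x) ]."""
    x = particles[:, None]                       # (n, 1)
    diff = x - x.T                               # diff[i, j] = x_i - x_j
    K = np.exp(-diff**2 / (2 * bandwidth**2))    # RBF kernel k(x_j, x_i)
    grad = grad_log_p(particles)                 # score at each particle
    drive = K @ grad                             # kernel-weighted score (attraction)
    repulse = (K * diff / bandwidth**2).sum(1)   # sum_j grad_{x_j} k(x_j, x_i)
    phi = (drive + repulse) / len(particles)
    return particles + step * phi

# Toy usage: transport particles toward a standard normal target.
rng = np.random.default_rng(3)
particles = rng.normal(loc=3.0, scale=0.5, size=50)   # start far from the target mean
grad_log_p = lambda z: -z                             # grad log N(0, 1)
for _ in range(500):
    particles = svgd_step(particles, grad_log_p)
print(particles.mean(), particles.std())              # should move toward 0 and 1
```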