Our methods are compared with MCMC, SVGD [1], WNAG [2] and WNes [3]. WNAG and WNes are two accelerated methods based on W-GF.
We select the kernel bandwidth using either the MED method or the proposed BM method.
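The MED heuristic referenced above is, by convention in the SVGD literature [1], the median of pairwise squared distances among particles, scaled by a log factor. A minimal sketch of this heuristic (assuming the standard SVGD variant; the exact form used in the experiments may differ, and the proposed BM method is not reproduced here):

```python
import numpy as np

def median_heuristic_bandwidth(particles):
    """MED heuristic: pick the RBF kernel bandwidth from the median of
    pairwise squared distances between particles.

    particles: array of shape (n, d).
    Returns h such that k(x, y) = exp(-||x - y||^2 / h).
    """
    n = particles.shape[0]
    # Pairwise squared Euclidean distances, shape (n, n).
    diffs = particles[:, None, :] - particles[None, :, :]
    sq_dists = np.sum(diffs ** 2, axis=-1)
    # Median over distinct pairs (upper triangle, excluding the diagonal).
    med = np.median(sq_dists[np.triu_indices(n, k=1)])
    # Common SVGD scaling: h = med / log(n + 1).
    return med / np.log(n + 1)
```

This choice makes the kernel width shrink slowly as the particle count grows, so that neighboring particles keep a non-negligible repulsive interaction.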
[Figure: Test accuracy and test log-likelihood; kernel bandwidth selected by the BM method.]
[Figure: Test accuracy and test log-likelihood; kernel bandwidth selected by the MED method.]
The BM method accelerates and stabilizes the performance of GFs and AIGs. MCMC and W-GF perform similarly and achieve the best test log-likelihood. For a given metric, AIG flows attain better test accuracy and test log-likelihood within the first 2000 iterations. W-AIG and KW-AIG reach $75\%$ test accuracy in fewer than 500 iterations.
[1] Qiang Liu and Dilin Wang. Stein variational gradient descent: A general purpose Bayesian inference algorithm. In Advances in Neural Information Processing Systems, pages 2378–2386, 2016.
[2] Chang Liu, Jingwei Zhuo, Pengyu Cheng, Ruiyi Zhang, Jun Zhu, and Lawrence Carin. Accelerated first-order methods on the Wasserstein space for Bayesian inference. arXiv preprint arXiv:1807.01750, 2018.
[3] Chang Liu, Jingwei Zhuo, Pengyu Cheng, Ruiyi Zhang, and Jun Zhu. Understanding and accelerating particle-based variational inference. In International Conference on Machine Learning, pages 4082–4092, 2019.