Papers and Preprints
2025
Ruijia Niu, Dongxia Wu, Rose Yu, Yi-An Ma. Functional-level Uncertainty Quantification for Calibrated Fine-tuning, preprint (2025). [arXiv]
Kun Wang, Sumanth Varambally, Duncan Watson-Parris, Yi-An Ma, Rose Yu. Discovering Latent Structural Causal Models from Spatio-Temporal Data, preprint (2025). [arXiv]
Dongxia Wu, Nikki Lijing Kuang, Ruijia Niu, Yi-An Ma, Rose Yu. Diffusion-BBO: Diffusion-Based Inverse Modeling for Online Black-Box Optimization, preprint (2025). [arXiv]
Yingyu Lin, Yuxing Huang, Wenqin Liu, Haoran Deng, Ignavier Ng, Kun Zhang, Mingming Gong, Yian Ma, Biwei Huang. A Skewness-Based Criterion for Addressing Heteroscedastic Noise in Causal Discovery, ICLR (2025). [arXiv]
Veeramakali Vignesh Manivannan, Yasaman Jafari, Srikar Eranky, Spencer Ho, Rose Yu, Duncan Watson-Parris, Yian Ma, Leon Bergen, Taylor Berg-Kirkpatrick. ClimaQA: An Automated Evaluation Framework for Climate Foundation Models, ICLR (2025). [arXiv]
Lingkai Kong, Yuanqi Du, Wenhao Mu, Kirill Neklyudov, Valentin De Bortoli, Dongxia Wu, Haorui Wang, Aaron M Ferber, Yian Ma, Carla P Gomes, Chao Zhang. Diffusion Models as Constrained Samplers for Optimization with Unknown Constraints, AISTATS (2025). [arXiv]
Amartya Sanyal, Yaxi Hu, Yaodong Yu, Yian Ma, Yixin Wang, Bernhard Schölkopf. Accuracy on the Wrong Line: On the Pitfalls of Noisy Data for Out-of-distribution Generalisation, AISTATS (2025). [arXiv]
2024
Yingyu Lin*, Yi-An Ma*, Yu-Xiang Wang*, Rachel Redberg, Zhiqi Bu. Tractable MCMC for Private Learning with Pure and Gaussian Differential Privacy, ICLR (2024). [arXiv]
Xunpeng Huang, Hanze Dong, Yifan Hao, Yi-An Ma, Tong Zhang. Reverse Diffusion Monte Carlo, ICLR (2024). [arXiv]
Xunpeng Huang, Difan Zou, Hanze Dong, Yi-An Ma, Tong Zhang. Faster Sampling without Isoperimetry via Diffusion-based Monte Carlo, COLT (2024). [arXiv]
Xunpeng Huang, Difan Zou, Hanze Dong, Yi Zhang, Yi-An Ma, Tong Zhang. Reverse Transition Kernel: A Flexible Framework to Accelerate Diffusion Inference, NeurIPS (2024, spotlight). [arXiv] (Best paper award at the ICML 2024 workshop on "Structured Probabilistic Inference and Generative Modeling").
Yuzhou Gu, Nikki Lijing Kuang, Yian Ma, Zhao Song, Lichen Zhang. Log-concave Sampling from a Convex Body with a Barrier: a Robust and Unified Dikin Walk, NeurIPS (2024).
Sumanth Varambally, Yi-An Ma, Rose Yu. Discovering Mixtures of Structural Causal Models from Time Series Data, ICML (2024). [arXiv]
Xunpeng Huang, Difan Zou, Hanze Dong, Yi-An Ma, Tong Zhang. Faster Sampling via Stochastic Gradient Proximal Sampler, ICML (2024). [arXiv]
Ruijia Niu, Dongxia Wu, Kai Kim, Yi-An Ma, Duncan Watson-Parris, Rose Yu. Multi-Fidelity Residual Neural Processes for Scalable Surrogate Modeling, ICML (2024). [arXiv]
Kyurae Kim, Joohwan Ko, Yi-An Ma, Jacob R. Gardner. Demystifying Doubly Stochastic Gradient Descent, ICML (2024). [arXiv]
Kyurae Kim, Yian Ma, Jacob R Gardner. Linear Convergence of Black-Box Variational Inference: Should We Stick the Landing? AISTATS (2024). [arXiv]
Dongxia Wu, Tsuyoshi Ide, Georgios Kollias, Jiri Navratil, Aurelie Lozano, Naoki Abe, Yi-An Ma, Rose Yu. Learning Granger Causality from Instance-wise Self-attentive Hawkes Processes, AISTATS (2024). [arXiv]
Wei Deng, Qian Zhang, Yi-An Ma, Zhao Song, Guang Lin. On Convergence of Federated Averaging Langevin Dynamics, UAI (2024). [arXiv]
Felix X.-F. Ye, Yi-An Ma, Hong Qian. Estimate exponential memory decay in hidden Markov model and its applications to inference, Physica D: Nonlinear Phenomena, 460, 134053 (2024).
2023
Abhishek Roy, Geelon So, Yi-An Ma. Optimization on Pareto Sets: On a Theory of Multi-Objective Optimization, preprint (2023). [arXiv]
Abhishek Roy, Yi-An Ma. A Central Limit Theorem for Stochastic Saddle Point Optimization, preprint (2023). [arXiv]
Nikki Lijing Kuang, Ming Yin, Mengdi Wang, Yu-Xiang Wang, Yi-An Ma. Posterior Sampling with Delayed Feedback for Reinforcement Learning with Linear Function Approximation. NeurIPS (2023). [arXiv]
Chaoyue Liu, Dmitriy Drusvyatskiy, Mikhail Belkin, Damek Davis, Yi-An Ma. Aiming towards the Minimizers: Fast Convergence of SGD for Overparametrized Problems, NeurIPS (2023). [arXiv]
Kyurae Kim, Jisu Oh, Kaiwen Wu, Yi-An Ma, Jacob R Gardner. On the Convergence of Black-Box Variational Inference, NeurIPS (2023). [arXiv]
Amin Karbasi*, Nikki Lijing Kuang*, Yi-An Ma*, Siddharth Mitra*. Langevin Thompson Sampling with Logarithmic Communication: Bandits and Reinforcement Learning, ICML (2023). [arXiv]
Dongxia Wu, Ruijia Niu, Matteo Chinazzi, Yi-An Ma, Rose Yu. Disentangled Multi-Fidelity Deep Bayesian Active Learning, ICML (2023). [arXiv]
Dongxia Wu, Ruijia Niu, Matteo Chinazzi, Alessandro Vespignani, Yi-An Ma, Rose Yu. Deep Bayesian Active Learning for Accelerating Stochastic Simulation, KDD (2023). [arXiv]
Kush Bhatia, Yi-An Ma, Anca D Dragan, Peter L Bartlett, Michael I Jordan. Bayesian Robustness: A Nonasymptotic Viewpoint, Journal of the American Statistical Association (JASA), 0, 1-12 (2023). [arXiv]
Bian Li, Yi-An Ma, J Nathan Kutz, Xiu Yang. The Adaptive Spectral Koopman Method for Dynamical Systems, SIAM Journal on Applied Dynamical Systems (SIADS), 22 (3), 1523-1551 (2023). [arXiv]
2022
Kush Bhatia*, Nikki Kuang*, Yi-An Ma*, Yixin Wang*. Statistical and Computational Trade-offs in Variational Inference: A Case Study in Inferential Model Selection, preprint (2022). [arXiv]
Yoav Freund*, Yi-An Ma*, Tong Zhang*. When is the Convergence Time of Langevin Algorithms Dimension Independent? A Composite Optimization Viewpoint, Journal of Machine Learning Research (JMLR), 23 (214): 1−32, 2022. [arXiv]
Yi-An Ma*, Teodor V. Marinov*, Tong Zhang*, Dimension Independent Generalization of DP-SGD for Overparameterized Smooth Convex Optimization, preprint (2022). [arXiv]
Dongxia Wu, Matteo Chinazzi, Alessandro Vespignani, Yi-An Ma, Rose Yu. Multi-fidelity Hierarchical Neural Processes, KDD (2022). [arXiv]
Ruoqi Shen, Liyao Gao, Yi-An Ma. On Optimal Early Stopping: Over-informative versus Under-informative Parametrization, preprint (2022). [arXiv]
Alexander D'Amour, Katherine Heller, Dan Moldovan, Ben Adlam, Babak Alipanahi, Alex Beutel, Christina Chen, Jonathan Deaton, Jacob Eisenstein, Matthew D Hoffman, Farhad Hormozdiari, Neil Houlsby, Shaobo Hou, Ghassen Jerfel, Alan Karthikesalingam, Mario Lucic, Yian Ma, Cory McLean, Diana Mincu, Akinori Mitani, Andrea Montanari, Zachary Nado, Vivek Natarajan, Christopher Nielson, Thomas F Osborne, Rajiv Raman, Kim Ramasamy, Rory Sayres, Jessica Schrouff, Martin Seneviratne, Shannon Sequeira, Harini Suresh, Victor Veitch, Max Vladymyrov, Xuezhi Wang, Kellie Webster, Steve Yadlowsky, Taedong Yun, Xiaohua Zhai, D Sculley. Underspecification Presents Challenges for Credibility in Modern Machine Learning, Journal of Machine Learning Research (JMLR), 23 (226): 1-61 (2022). [arXiv]
Xiaojie Qiu, Yan Zhang, Jorge D Martin-Rufino, Chen Weng, Shayan Hosseinzadeh, Dian Yang, Angela N Pogson, Marco Y Hein, Kyung Hoi Joseph Min, Li Wang, Emanuelle I Grody, Matthew J Shurtleff, Ruoshi Yuan, Song Xu, Yian Ma, Joseph M Replogle, Eric S Lander, Spyros Darmanis, Ivet Bahar, Vijay G Sankaran, Jianhua Xing, Jonathan S Weissman. Mapping transcriptomic vector fields of single cells, Cell, 185 (4), 690-711 (2022).
Evaluation of individual and ensemble probabilistic forecasts of COVID-19 mortality in the US, Proceedings of the National Academy of Sciences (PNAS), 119, e2113561119 (2022).
2021
Yi-An Ma, Niladri S. Chatterji, Xiang Cheng, Nicolas Flammarion, Peter L. Bartlett, Michael I. Jordan. Is there an analog of Nesterov acceleration for MCMC? Bernoulli, 27, 1942-1992 (2021). [PDF] [arXiv]
Wenlong Mou*, Yi-An Ma*, Martin J. Wainwright, Peter L. Bartlett, Michael I. Jordan. High-order Langevin diffusion yields an accelerated MCMC algorithm, Journal of Machine Learning Research (JMLR), 22 (42): 1–41 (2021). [arXiv]
Dongxia Wu, Liyao Gao, Xinyue Xiong, Matteo Chinazzi, Alessandro Vespignani, Yian Ma, Rose Yu. Quantifying Uncertainty in Deep Spatiotemporal Forecasting. KDD (2021). [arXiv]
Dongxia Wu, Liyao Gao, Xinyue Xiong, Matteo Chinazzi, Alessandro Vespignani, Yian Ma, Rose Yu. DeepGLEAM: A hybrid mechanistic and deep learning model for COVID-19 forecasting, preprint (2021). [arXiv]
Ghassen Jerfel, Serena Wang, Clara Fannjiang, Katherine Heller, Yian Ma, Michael I. Jordan. Variational refinement for importance sampling using the forward Kullback-Leibler divergence, UAI (2021). [arXiv]
2020
Eric Mazumdar*, Aldo Pacchiano*, Yi-An Ma*, Peter L. Bartlett, Michael I. Jordan. On Approximate Thompson Sampling with Langevin Algorithms, ICML (2020). [arXiv]
Matthew Hoffman, Yi-An Ma. Black-Box Variational Inference as Distilled Langevin Dynamics. ICML (2020). [PDF] (Best industry paper award at the NeurIPS 2019 AABI symposium).
Michael Dusenberry, Ghassen Jerfel, Yeming Wen, Yi-An Ma, Jasper Snoek, Katherine Heller, Balaji Lakshminarayanan, Dustin Tran. Efficient and Scalable Bayesian Neural Nets with Rank-1 Factors. ICML (2020). [arXiv]
2019
Yi-An Ma, Yuansi Chen, Chi Jin, Nicolas Flammarion, Michael I. Jordan. Sampling can be faster than optimization, Proceedings of the National Academy of Sciences (PNAS), 116, 20881–20885 (2019). [arXiv]
Chris Aicher, Yi-An Ma, Nick Foti, Emily B. Fox. Stochastic gradient MCMC for state space models, SIAM Journal on Mathematics of Data Science (SIMODS), 1, 555–587 (2019). [arXiv] [code]
Xin Wang, Fisher Yu, Lisa Dunlap, Yi-An Ma, Ruth Wang, Azalia Mirhoseini, Trevor Darrell, Joseph E. Gonzalez. Deep mixture of experts via shallow embedding, Conference on Uncertainty in Artificial Intelligence (UAI 2019). [arXiv]
2018
Yi-An Ma, Emily B. Fox, Tianqi Chen, Lei Wu. Irreversible samplers from jump and continuous Markov processes, Statistics and Computing, Feb, 1–26 (2018). [arXiv]
Niladri S. Chatterji, Nicolas Flammarion, Yi-An Ma, Peter L. Bartlett, Michael I. Jordan. On the theory of variance reduction for stochastic gradient Monte Carlo, in Proceedings of International Conference on Machine Learning 35 (ICML 2018), 764–773. [arXiv]
2017
Yi-An Ma, Nicholas J. Foti, Emily B. Fox. Stochastic gradient MCMC methods for hidden Markov models, in Proceedings of International Conference on Machine Learning 34 (ICML 2017), 2265–2274 (2017). [arXiv]
Xiaojie Qiu, Andrew Hill, Jonathan Packer, Dejun Lin, Yi-An Ma, Cole Trapnell, Single-cell mRNA quantification and differential analysis with Census, Nature Methods, 14, 309–315 (2017).
Yi-An Ma, Hong Qian, Felix Ye. Stochastic dynamics: Models for intrinsic and extrinsic noises and their applications, Scientia Sinica Mathematica, 47, 1693–1702 (2017).
Before 2016
Yi-An Ma, Tianqi Chen, Emily B. Fox. A complete recipe for stochastic gradient MCMC, in Advances in Neural Information Processing Systems 28 (NIPS 2015), 2899-2907. [arXiv] [BibTeX]
Yi-An Ma, Hong Qian. A thermodynamic theory of ecology: Helmholtz theorem for Lotka-Volterra equation, extended conservation law, and stochastic predator-prey dynamics, Proceedings of the Royal Society A, 471, 20150456 (2015). [arXiv] [BibTeX]
Yi-An Ma, Hong Qian. Universal ideal behavior and macroscopic work relation of linear irreversible stochastic thermodynamics, New Journal of Physics, 17, 065013 (2015). [arXiv] [BibTeX]
Yian Ma, Qijun Tan, Ruoshi Yuan, Bo Yuan, Ping Ao. Potential function in a continuous dissipative chaotic system: Decomposition scheme and role of strange attractor, International Journal of Bifurcation and Chaos, 24, 1450015 (2014). [PDF] [arXiv] [BibTeX]
Ruoshi Yuan, Yi-An Ma, Bo Yuan, Ping Ao. Lyapunov function as potential function: A dynamical equivalency, Chinese Physics B, 23, 010505 (2014). [arXiv] [BibTeX]
Ruoshi Yuan, Xinan Wang, Yian Ma, Bo Yuan, Ping Ao. Exploring a noisy van der Pol type oscillator with a stochastic approach, Physical Review E, 87, 062109 (2013). [arXiv] [BibTeX]
Yian Ma, Qijun Tan, Ruoshi Yuan, Bo Yuan, Ping Ao. Decomposition scheme in continuous dissipative chaotic systems and role of strange attractors, in Proceedings of 22nd International Conference on Noise and Fluctuations (ICNF), 2013 (IEEE 2013) pp. 1-3.
Ying Tang, Ruoshi Yuan, Yian Ma. Dynamical behaviors determined by the Lyapunov function in competitive Lotka-Volterra systems, Physical Review E, 87, 012708 (2013). [arXiv] [BibTeX]
Yian Ma, Ruoshi Yuan, Yang Li, Ping Ao, Bo Yuan. Lyapunov functions in piecewise linear systems: From fixed point to limit cycle, preprint (2013). [arXiv]
Ruoshi Yuan, Yian Ma, Bo Yuan, Ping Ao. Potential function in dynamical systems and the relation with Lyapunov function, in Proceedings of 30th Chinese Control Conference (CCC), 2011 (IEEE, 2011) pp. 6573-6580.
Currently publishing as: Yi-An Ma