"Fast decentralized gradient tracking for federated learning with local updates: From mini to minimax optimization" In submission, 2024
Tong Zhang, Chris Junchi Li, "Enhancing stochastic optimization for statistical efficiency using ROOT-SGD with diminishing stepsize" In submission, 2024
Chris Junchi Li, "Accelerated fully first-order methods for bilevel and minimax optimization" In submission, 2024
Angela Yuan, Chris Junchi Li, Gauthier Gidel, Michael I Jordan, Quanquan Gu, Simon S Du, "Optimal extragradient-based algorithms for stochastic variational inequalities with separable structure" In Advances in Neural Information Processing Systems (NeurIPS) 33338–33351, 2023 [arXiv] [poster]
Chris Junchi Li*, Angela Yuan*, Gauthier Gidel, Quanquan Gu, Michael I Jordan, "Nesterov meets optimism: Rate-optimal separable minimax optimization" In International Conference on Machine Learning (ICML), PMLR 202:20351–20383, 2023 [mlr.press] [arXiv] [slides] [poster]
Chris Junchi Li, Michael I Jordan, "Nonconvex stochastic scaled gradient descent and generalized eigenvector problems" In Conference on Uncertainty in Artificial Intelligence (UAI), PMLR 216:1230–1240, 2023 [mlr.press] [arXiv] [poster]
Haikuo Yang, Luo Luo, Chris Junchi Li, Michael I Jordan, Maryam Fazel, "Accelerating inexact hypergradient descent for bilevel optimization" Manuscript in submission, 2023 [arXiv] [slides]
Angela Yuan, Chris Junchi Li, Gauthier Gidel, Michael I Jordan, "A higher-order resolution continuous-time theory for a class of accelerated smooth minimax optimization algorithms" Manuscript in submission, 2023 [arXiv] [slides]
Zixiang Chen*, Chris Junchi Li*, Angela Yuan*, Quanquan Gu, Michael I Jordan, "A general framework for sample-efficient function approximation in reinforcement learning" In International Conference on Learning Representations (ICLR), 2023. Deep RL Workshop, NeurIPS 2022 [openreview.net] [arXiv] [slides]
ICLR Spotlight Presentation (notable-top-25% among all accepted)
Chris Junchi Li*, Dongruo Zhou*, Quanquan Gu, Michael I Jordan, "Learning two-player mixture Markov games: Kernel function approximation and correlated equilibrium" In Advances in Neural Information Processing Systems (NeurIPS) 33262–33274, 2022. Deep RL Workshop, NeurIPS 2021 [openreview.net] [arXiv]
(α-β) Simon S Du, Gauthier Gidel, Michael I Jordan, Chris Junchi Li, "Optimal extragradient-based stochastic bilinearly-coupled saddle-point optimization" Manuscript in submission, 2022 [arXiv] [slides]
Chris Junchi Li*, Wenlong Mou*, Martin J Wainwright, Michael I Jordan, "ROOT-SGD: Sharp nonasymptotics and asymptotic efficiency in a single algorithm" In Conference on Learning Theory (COLT), PMLR 178:909–981, 2022 [mlr.press] [arXiv] [slides]
Chris Junchi Li*, Yaodong Yu*, Nicolas Loizou, Gauthier Gidel, Yi Ma, Nicolas Le Roux, Michael I Jordan, "On the convergence of stochastic extragradient for bilinear games using restarted iteration averaging" In International Conference on Artificial Intelligence and Statistics (AISTATS), PMLR 151:9793–9826, 2022 [mlr.press] [arXiv] [slides]
NeurIPS OPT2021 Workshop Oral Presentation (short version, 4/16 = 25% among all talks)
Chris Junchi Li, Michael I Jordan, "Stochastic approximation for online tensorial independent component analysis" In Conference on Learning Theory (COLT), PMLR 134:3051–3106, 2021 [mlr.press] [arXiv] [slides]
Wenlong Mou, Chris Junchi Li, Martin J Wainwright, Peter L Bartlett, Michael I Jordan, "On linear stochastic approximation: Fine-grained Polyak-Ruppert and non-asymptotic concentration" In Conference on Learning Theory (COLT), PMLR 125:2947–2997, 2020 [mlr.press] [arXiv] [slides]
Haishan Ye, Zhichao Huang, Cong Fang, Chris Junchi Li, Tong Zhang, "Hessian-aware zeroth-order optimization" Manuscript in submission. Workshop on Beyond First Order Methods in ML, NeurIPS 2019 [arXiv]
(α-β) Wenqing Hu, Chris Junchi Li, Xiangru Lian, Ji Liu, Huizhuo Yuan, "Efficient smooth non-convex stochastic compositional optimization via stochastic recursive gradient descent" In Advances in Neural Information Processing Systems (NeurIPS) 6929–6937, 2019 [neurips.cc] [arXiv]
Huizhuo Yuan, Yuren Zhou, Chris Junchi Li, Qingyun Sun, "A continuous limit theory for nonsmooth ADMM variants" In International Conference on Machine Learning (ICML), 2019 (short version) [mlr.press] [arXiv]
(α-β) Cong Fang, Chris Junchi Li, Zhouchen Lin, Tong Zhang, "SPIDER: Near-optimal non-convex optimization via stochastic path-integrated differential estimator" In Advances in Neural Information Processing Systems (NeurIPS) 689–699, 2018 [neurips.cc] [arXiv] [slides]
NeurIPS Spotlight Presentation (4.08%)
Chris Junchi Li, Mengdi Wang, Han Liu, Tong Zhang, "Diffusion approximations for online principal component estimation and global convergence" In Advances in Neural Information Processing Systems (NeurIPS) 645–655, 2017 [neurips.cc] [arXiv] [video]
NeurIPS Oral Presentation (1.23%)
Chris Junchi Li, Mengdi Wang, Han Liu, Tong Zhang, "Near-optimal stochastic approximation for online principal component estimation" Mathematical Programming, 167(1), 75–97, 2018 [DOI] [arXiv] [slides]
Chris Junchi Li, "A general continuous-time formulation of stochastic ADMM and its variants" Manuscript, 2020 [arXiv]
(α-β) Wenqing Hu, Chris Junchi Li, Lei Li, Jian-Guo Liu, "On the diffusion approximation of nonconvex stochastic gradient descent" Annals of Mathematical Sciences and Applications, 4(1), 3–32, 2019 [DOI] [arXiv]
(α-β) Wenqing Hu, Chris Junchi Li, "A convergence analysis of the perturbed compositional gradient flow: Averaging principle and normal deviations" Discrete & Continuous Dynamical Systems – Series A, 38(10), 4951–4977, 2018 [DOI] [arXiv]
(α-β) Jianqing Fan, Wenyan Gong, Chris Junchi Li, Qiang Sun, "Statistical sparse online regression: A diffusion approximation perspective" In International Conference on Artificial Intelligence and Statistics (AISTATS) 1017–1026, 2018 [mlr.press]
Chris Junchi Li, Zhaoran Wang, Han Liu, "Online ICA: Understanding global dynamics of nonconvex optimization via diffusion processes" In Advances in Neural Information Processing Systems (NeurIPS) 4967–4975, 2016 [neurips.cc] [arXiv]
Chris Junchi Li, "Axelrod's Model in Two Dimensions" PhD Dissertation, Duke University [dukespace]