Stochastic Optimization and Its Applications
Stochastic approximation provides a rigorous framework for designing stochastic optimization algorithms and for analyzing their convergence and rates of convergence, including mini-batch methods, sampling-based optimization, and related schemes. The classical results typically require some continuity of the dynamics (equivalently, continuous differentiability of the loss functions): convergence is then characterized through dynamical systems generated by ODEs, and rates of convergence through diffusions generated by SDEs. However, these assumptions often fail in modern applications, so it is necessary to develop a general yet rigorous framework, novel methods, and new insights for stochastic approximation with discontinuous dynamics and/or set-valued mappings, non-smooth loss functions, and social (mean-field) interactions, among others. A small illustrative sketch follows.
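As a minimal sketch (not drawn from the publications below), the classical Robbins-Monro iteration on two hypothetical scalar problems: a smooth quadratic loss, whose iterates track the limiting ODE, and a non-smooth absolute-value loss, where the gradient must be replaced by a (set-valued) subgradient selection, which is the situation where differential-inclusion limits arise. The example problems and function names are illustrative assumptions, not the methods of the cited papers.

import numpy as np

rng = np.random.default_rng(0)

# Smooth toy problem: minimize f(x) = E[(x - Z)^2 / 2] with Z ~ N(1, 1);
# a noisy gradient sample is g(x, Z) = x - Z (classical Robbins-Monro / SGD).
def noisy_gradient_smooth(x):
    z = rng.normal(loc=1.0, scale=1.0)
    return x - z

# Non-smooth toy problem: minimize f(x) = E[|x - Z|]; the loss is not
# differentiable at x = Z, so the update uses a measurable selection
# of the subdifferential, sign(x - Z).
def noisy_subgradient_nonsmooth(x):
    z = rng.normal(loc=1.0, scale=1.0)
    return np.sign(x - z)

def stochastic_approximation(update, x0=5.0, n_iter=20_000):
    """x_{k+1} = x_k - a_k * Y_k with step sizes a_k = 1/k,
    so that sum a_k = infinity and sum a_k^2 < infinity."""
    x = x0
    for k in range(1, n_iter + 1):
        a_k = 1.0 / k
        x = x - a_k * update(x)
    return x

if __name__ == "__main__":
    # Smooth case: iterates track the mean ODE x' = -(x - 1), limit near 1.
    print("smooth case:", stochastic_approximation(noisy_gradient_smooth))
    # Non-smooth case: the natural limit is a differential inclusion; here
    # the iterates settle near the median of Z, which is also 1.
    print("non-smooth case:", stochastic_approximation(noisy_subgradient_nonsmooth))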
Related Publications:
Stochastic Approximation with Discontinuous Dynamics, Differential Inclusions, and Applications (with G. Yin), Annals of Applied Probability, Vol. 33 (2023), 780–823. [Journal] [arXiv]
Maximum likelihood estimation of diffusion using continuous time Markov chain (with J. Kirkby, D. Nguyen, Duy Nguyen), Computational Statistics and Data Analysis, Vol. 168 (2022), 107408. [Journal] [arXiv]
Inversion-free Subsampling Newton's Method for Large Sample Logistic Regression (with J. Kirkby, D. Nguyen, Duy Nguyen), Statistical Papers, Vol. 63 (2022), 943–963. [Journal] [pdf]
An Efficient Method to Simulate Diffusion Bridges (with H. Chau, J. Kirkby, D. Nguyen, Duy Nguyen, T. Nguyen), Statistics and Computing, Vol. 34 (2024), 131. [Journal] [pdf]
On the Inversion-Free Newton’s Method and Its Applications (with H. Chau, J. Kirkby, D. Nguyen, Duy Nguyen, T. Nguyen), International Statistical Review, to appear.