RESEARCH INTERESTS
Mathematical Optimization
Variational Analysis
Data Science and Machine Learning
PUBLICATIONS
Accepted/Published Publications
1. (with P.D. Khanh, H.-C. Luong, and B. S. Mordukhovich) Fundamental convergence analysis of sharpness-aware minimization, to appear in Advances in Neural Information Processing Systems (NeurIPS) 2024. Link
2. (with P.D. Khanh, H.-C. Luong, B. S. Mordukhovich, and T. Vo) Convergence of sharpness-aware minimization with momentum, to appear in Communications in Computer and Information Science, ITTA 2024
3. (with P.D. Khanh, B. S. Mordukhovich, and V. T. Phat) Inexact proximal methods for weakly convex functions, to appear in J. Global Optim. Link
4. (with P.D. Khanh and B. S. Mordukhovich) Inexact reduced gradient methods in nonconvex optimization, J. Optim. Theory Appl. (2023). Link
5. (with P.D. Khanh and B. S. Mordukhovich) A new inexact gradient descent method with applications to nonsmooth convex optimization, Optim. Methods Softw. (2024), DOI: 10.1080/10556788.2024.2322700
6. (with P.D. Khanh, B. S. Mordukhovich, and V. T. Phat) Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization, Math. Program. 205A (2024), 373-429. Link
7. (with P.D. Khanh, B. S. Mordukhovich, and V. T. Phat) Generalized damped Newton algorithms in nonsmooth optimization via second-order subdifferentials, J. Global Optim. 86 (2023), 93-122. Link
Submitted Manuscripts
1. (with D. H. Cuong, P.D. Khanh, and B. S. Mordukhovich) Local convergence analysis for nonisolated solutions to derivative-free methods of optimization, submitted (2024)
2. (with P.D. Khanh and B. S. Mordukhovich) Globally convergent derivative-free methods in nonconvex optimization with and without noise, submitted (2024)
REFEREE SERVICE
Mathematical Programming Computation, Optimization, Journal of Optimization Theory and Applications, Set-Valued and Variational Analysis, Optimization Letters, Vietnam Journal of Mathematics, Bulletin of the Iranian Mathematical Society
TALKS
Semi-Plenary Talks
Workshop on Nonsmooth Optimization and Applications (NOPTA), University of Antwerp, Belgium (2024). Website
Title: General derivative-free optimization methods under global and local Lipschitz continuity of gradients. Slides
Contributed Talks
The 25th Midwest Optimization Meeting - Workshop on Large Scale Optimization and Applications (2023)
SIAM Conference on Optimization, University of Washington, Seattle (2023)
International Conference on Optimization and Variational Analysis with Applications, Ha Noi, Vietnam (2023)
The 24th Midwest Optimization Meeting - Workshop on Large Scale Optimization and Applications (2022) Website
Title: Inexact reduced gradient methods in smooth nonconvex optimization. Slides
The XIII International Symposium on Generalized Convexity and Monotonicity (2022) Website
Title: Inexact reduced gradient methods in smooth nonconvex optimization. Slides
Great Lakes SIAM Annual Meeting, Wayne State University (2022)
Applied Mathematics - Analysis Seminar, Mathematics Department, Wayne State University (2022)
Title: Inexact reduced gradient methods in smooth nonconvex optimization. Slides
The 23rd Midwest Optimization Meeting (in memory of Professor Asen Dontchev, 1948-2021), Grand Valley State University, Allendale, Michigan, USA, October 29-30 (2021) Website
Title: Generalized Damped Newton Algorithms in Nonsmooth Optimization. Slides