Research

RESEARCH INTERESTS

ORCID | Google Scholar

PUBLICATIONS

2024

7. (with P.D. Khanh and B. S. Mordukhovich) A new inexact gradient descent method with applications to nonsmooth convex optimization, Optim. Methods Softw. (2024), DOI: 10.1080/10556788.2024.2322700

6. (with P.D. Khanh, B. S. Mordukhovich, and V. T. Phat) Globally convergent coderivative-based generalized Newton methods in nonsmooth optimization, Math. Program. 205A (2024), 373-429. Link

5. (with P.D. Khanh, H.-C. Luong, and B. S. Mordukhovich) Fundamental convergence analysis of sharpness-aware minimization, submitted

2023

4. (with P.D. Khanh and B. S. Mordukhovich) Inexact reduced gradient methods in nonconvex optimization, J. Optim. Theory Appl. (2023), Link

3. (with P.D. Khanh and B. S. Mordukhovich) General derivative-free optimization methods under global and local Lipschitz continuity of gradients, submitted

2. (with P.D. Khanh, B. S. Mordukhovich, and V. T. Phat) Inexact proximal methods for weakly convex functions, submitted

REFEREE SERVICE

Conference Organization

Semi-Plenary Talks

Title: General derivative-free optimization methods under global and local Lipschitz continuity of gradients. Slides

Contributed Talks 

Title: Inexact reduced gradient methods in smooth nonconvex optimization. Slides

Title: Inexact reduced gradient methods in smooth nonconvex optimization. Slides

Title: Inexact reduced gradient methods in smooth nonconvex optimization. Slides

Title: Generalized damped Newton algorithms in nonsmooth optimization. Slides