I am Bao X. D. Nguyen (Nguyễn Xuân Duy Bảo in Vietnamese; N.X.D. Bao in publications), a researcher and lecturer in Mathematics.
🎓 Doctor of Philosophy (Ph.D.)
Department of Applied Mathematics, Faculty of Applied Science
Ho Chi Minh City University of Technology (HCMUT), VNU-HCM
🏢 Office: Room B4-104
📍 Address: 268 Ly Thuong Kiet Street, Dien Hong Ward, Ho Chi Minh City, Vietnam
📞 Phone: (+84) 326.029.421
✉️ Email (work): nxdbao@hcmut.edu.vn
✉️ Email (personal): nxdbao@gmail.com
🎓 Ph.D. in Applied Mathematics (2019 – 2023)
Ho Chi Minh City University of Science, Vietnam National University – Ho Chi Minh City
Thesis: Optimality conditions in nonsmooth optimization and related problems
Supervisors: Prof. Phan Quoc Khanh, Assoc. Prof. Nguyen Minh Tung
🎓 M.Sc. in Applied Mathematics (2016 – 2019)
Ho Chi Minh City University of Science, Vietnam National University – Ho Chi Minh City
Thesis: Higher-order Karush–Kuhn–Tucker multiplier rules for set-valued optimization problems
Supervisor: Dr. Nguyen Minh Tung
🎓 B.Sc. in Mathematics Education (2012 – 2016)
Ho Chi Minh City University of Education
My research focuses on:
Higher-Order Optimality Conditions: Development of necessary/sufficient conditions and advanced KKT rules for nonsmooth and set-valued optimization problems.
Generalized Derivatives & Variational Tools: New concepts of set-valued derivatives with calculus rules, metric regularity, and variational analysis techniques.
Sensitivity Analysis: First- and higher-order sensitivity of solution maps, marginal functions, and parametric equilibrium problems.
Constraint Qualifications & Multipliers: Analysis of generalized CQs and properties of multiplier sets ensuring robust optimality conditions.
Convex Analysis: Characterizations of convexity and generalized convexity in nonsmooth and non-Lipschitz settings.
Applications: Optimization with generalized inequalities, equilibrium problems, generalized equations, and game-theoretic models (Nash equilibrium, minimax theorems).
Recently, my interests have extended to optimization methods in data science, focusing on: robust and stable recovery in convex regularized inverse problems, accelerated gradient dynamics (ODE perspectives on Nesterov's scheme), and stochastic subgradient algorithms for large-scale convex optimization under uncertainty.