Time: 1:30-2:30 pm MST
Arizona State University
DRQKL: Distributionally Robust Quantum Kernel Learning via Low-Rank Nyström Approximation
Abstract: Quantum support vector machines (QSVMs) based on fidelity kernels can perform well on small datasets, but training typically requires O(n^2) kernel evaluations, and performance may degrade under class imbalance, label noise, or shifts in the input distribution. In this study, we present DRQKL, a distributionally robust quantum kernel learning method for QSVMs that reduces the cost of kernel learning and improves reliability when test data are not drawn from the same distribution as the training data. Using a low-rank Nyström approximation with $m \ll n$ landmark points, we construct a positive semidefinite (PSD) approximate kernel with O(nm) kernel evaluations, enabling training at larger scale. We then train the QSVM with a distributionally robust optimization (DRO) objective that minimizes the worst-case expected hinge loss over a neighborhood of the empirical distribution, defined by either a $\chi^2$-divergence or CVaR constraints. We provide an analysis that relates the Nyström approximation error and the DRO radius to the classifier's risk, showing how approximation and robustness jointly affect generalization under bounded distribution shift.
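The two ingredients in the abstract can be sketched classically. The snippet below is a minimal illustration, not the authors' implementation: it substitutes an RBF kernel for the quantum fidelity kernel, builds a PSD Nyström feature map from m landmarks (so only O(nm) kernel evaluations are needed), and evaluates a CVaR-style worst-case hinge loss, one of the DRO ambiguity sets mentioned. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Classical RBF kernel used here as a stand-in for the quantum fidelity kernel.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def nystrom_features(X, m, gamma=1.0, seed=0):
    # Low-rank Nystrom feature map Phi such that K_hat = Phi @ Phi.T is PSD
    # by construction, using only O(n*m) kernel evaluations instead of O(n^2).
    rng = np.random.default_rng(seed)
    idx = rng.choice(X.shape[0], size=m, replace=False)  # landmark points
    L = X[idx]
    C = rbf_kernel(X, L, gamma)          # n x m cross-kernel block
    W = rbf_kernel(L, L, gamma)          # m x m landmark kernel
    vals, vecs = np.linalg.eigh(W)       # symmetric inverse square root of W
    vals = np.clip(vals, 1e-10, None)    # guard against tiny negative eigenvalues
    return C @ (vecs * vals ** -0.5) @ vecs.T  # Phi: n x m

def cvar_hinge(margins, level=0.9):
    # CVaR of the per-sample hinge losses: the mean over the worst
    # (1 - level) fraction of samples, a simple DRO-style objective.
    losses = np.maximum(0.0, 1.0 - margins)
    worst = np.sort(losses)[::-1]
    k = max(1, int(np.ceil((1 - level) * losses.size)))
    return worst[:k].mean()
```

A robust kernel classifier in this spirit would then fit a linear model on Phi while minimizing `cvar_hinge` of its margins rather than their plain average.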
Bio: Duong Do is a fifth-year PhD student in the Department of Electrical, Computer, and Energy Engineering at Arizona State University, advised by Dr. Duong Nguyen. His research lies at the intersection of operations research and quantum computing, with a focus on developing robust and efficient quantum optimization and learning models for combinatorial optimization problems.