"Efficient and Optimized Sampling"
My research focuses on quasi-Monte Carlo (QMC) methods, which improve traditional Monte Carlo approaches by replacing IID random sampling with low-discrepancy (LD) sequences for more uniform sampling. I am primarily interested in the construction of LD point sets and sequences, both through number-theoretic methods that improve classical constructions and through computational techniques that leverage machine learning to optimize their distribution. The central challenge lies in minimizing discrepancy measures, a complex and highly non-convex optimization problem.
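To make the contrast concrete, here is a minimal sketch (not taken from my work, just a standard illustration using SciPy's QMC module) comparing the centered discrepancy of a scrambled Sobol' point set against IID uniform samples in the unit square:

```python
import numpy as np
from scipy.stats import qmc

n, d = 128, 2

# Low-discrepancy points from a scrambled Sobol' sequence.
sobol = qmc.Sobol(d=d, scramble=True, seed=0)
ld_points = sobol.random(n)

# IID uniform points for comparison.
rng = np.random.default_rng(0)
iid_points = rng.random((n, d))

# Centered discrepancy: lower means more uniform coverage of [0, 1)^d.
d_ld = qmc.discrepancy(ld_points)
d_iid = qmc.discrepancy(iid_points)
```

For moderate n, the Sobol' set typically achieves a substantially lower discrepancy than the IID sample, which is exactly the gap that QMC exploits for faster integration-error convergence.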
A major success of my work has been the development of Message-Passing Monte Carlo (MPMC), a graph neural network model that generates state-of-the-art LD point sets in the d-dimensional hypercube. Alongside ML-based approaches, I am also interested in non-gradient-based methods and combinatorial search strategies for finding optimal configurations of sample points with respect to a given distribution and measure of irregularity.
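As a toy illustration of the non-gradient search direction (this is a generic local-search sketch, not the MPMC model or any specific method from my work), one can iteratively re-sample single points and keep only moves that reduce a discrepancy measure:

```python
import numpy as np
from scipy.stats import qmc

def local_search(n=32, d=2, iters=200, seed=0):
    """Greedy local search: re-sample one point at a time and
    accept the move only if the centered discrepancy decreases."""
    rng = np.random.default_rng(seed)
    pts = rng.random((n, d))
    best = qmc.discrepancy(pts)
    for _ in range(iters):
        cand = pts.copy()
        i = rng.integers(n)
        cand[i] = rng.random(d)  # propose a new location for one point
        score = qmc.discrepancy(cand)
        if score < best:
            pts, best = cand, score
    return pts, best

pts, best = local_search()
```

Because moves are only accepted when they lower the objective, the final discrepancy never exceeds that of the initial random configuration; more sophisticated combinatorial strategies replace this naive proposal step with structured moves over lattice or digital-net parameters.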
More broadly, I consider myself a sampling specialist, having worked with MCMC, QMC, and adaptive sampling methods. I am always looking for new, interdisciplinary challenges where advanced sampling strategies can make a meaningful impact.