Researcher in Computational Mathematics
Illinois Institute of Technology, Chicago, USA
My name is Nathan Kirk and I am a Senior Researcher in the Department of Applied Mathematics at the Illinois Institute of Technology, Chicago, USA, working with Prof. Fred Hickernell. Before that, I was a researcher in the Department of Statistics and Actuarial Science at the University of Waterloo, Canada, mentored by Prof. Christiane Lemieux. I completed my PhD in Mathematics at Queen's University Belfast, Northern Ireland, in 2023, supervised by Dr. Florian Pausinger.
For the past several years, my main research interest has been improving sampling methodologies using ideas from state-of-the-art machine learning algorithms and from Bayesian inference. Specifically, I am interested in constructing sampling schemes that are, in some sense, representative of the underlying known or assumed distribution. Ideally, these point sets and sequences replace classical purely random Monte Carlo sampling in simulations and applications, serving as an alternative to variance reduction techniques. Additionally, my collaborators and I are increasingly investigating what optimized sampling procedures can contribute to problems from wider fields, including algebraic statistics and problems where uncertainty must be quantified, e.g., cardiology, fusion, and finance.
This article introduces "Message-Passing Monte Carlo (MPMC)", the first machine learning approach for generating low-discrepancy point sets. Such point sets fill space in an efficient, uniform manner and thus play a central role in many problems in science and engineering. To accomplish this, MPMC utilizes tools from Geometric Deep Learning, specifically Graph Neural Networks.
(LEFT) Input: random training data.
(RIGHT) Output: generated (learned) low-discrepancy point set.
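As a rough illustration of what "low discrepancy" quantifies (MPMC itself is not shown here), one can compare the discrepancy of i.i.d. random points against a classical scrambled Sobol' set using SciPy; the sample size, dimension, and seeds below are arbitrary choices for the sketch:

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(42)
random_pts = rng.random((128, 2))          # i.i.d. uniform points in [0,1]^2
sobol = qmc.Sobol(d=2, scramble=True, seed=42)
sobol_pts = sobol.random_base2(m=7)        # 2^7 = 128 scrambled Sobol' points

# Centered L2-discrepancy: lower values mean more uniform space-filling
d_random = qmc.discrepancy(random_pts)
d_sobol = qmc.discrepancy(sobol_pts)
print(f"random: {d_random:.5f}, Sobol': {d_sobol:.5f}")
```

The Sobol' set attains a markedly lower discrepancy than random sampling at the same budget; MPMC learns point sets that push this measure lower still.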
For the "Frontiers in Probabilistic Inference" workshop at ICLR 2025, we introduce Stein-Message-Passing Monte Carlo (Stein-MPMC), which uses the same GNN framework as the original MPMC model with one change:
the objective function in Stein-MPMC is a kernel Stein discrepancy (KSD), which assesses how well a sampling node set represents a nonuniform reference distribution F with known density function.
That is, Stein-MPMC generates low-discrepancy samples from nonuniform distributions whenever the density function is available.
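To make the objective concrete, here is a minimal sketch of a V-statistic estimate of the squared KSD against a standard Gaussian target, using an inverse multiquadric base kernel (a common choice in the KSD literature). The function name, kernel parameters, and Gaussian target are illustrative assumptions, not the exact choices made in Stein-MPMC:

```python
import numpy as np

def ksd_squared(x, score, c=1.0, beta=-0.5):
    """V-statistic estimate of the squared kernel Stein discrepancy with an
    inverse multiquadric base kernel k(x, y) = (c^2 + ||x - y||^2)^beta."""
    n, dim = x.shape
    s = score(x)                            # score values grad log p, shape (n, dim)
    d = x[:, None, :] - x[None, :, :]       # pairwise differences, shape (n, n, dim)
    r2 = c**2 + np.sum(d**2, axis=-1)       # shape (n, n)
    k = r2**beta
    # Gradients of the base kernel with respect to each argument
    gx = 2 * beta * d * r2[..., None]**(beta - 1)   # grad wrt x
    gy = -gx                                        # grad wrt y
    # Trace of the mixed second derivative d^2 k / (dx dy)
    tr = (-2 * beta * dim * r2**(beta - 1)
          - 4 * beta * (beta - 1) * r2**(beta - 2) * np.sum(d**2, axis=-1))
    # Langevin Stein kernel combining k, its gradients, and the score
    kp = (tr
          + np.einsum('ijd,jd->ij', gx, s)   # grad_x k . s(y)
          + np.einsum('ijd,id->ij', gy, s)   # grad_y k . s(x)
          + k * (s @ s.T))                   # k(x, y) s(x) . s(y)
    return kp.mean()

# Score of a standard Gaussian target: grad log p(x) = -x
score = lambda x: -x
rng = np.random.default_rng(0)
good = rng.standard_normal((200, 2))   # samples drawn from the target
bad = good + 2.0                       # same samples shifted off-target
print(ksd_squared(good, score), ksd_squared(bad, score))
```

Because the KSD only needs the score function (the gradient of the log-density), it can score a node set against any target whose density is known up to a normalizing constant; Stein-MPMC minimizes such an objective over the GNN-generated points.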
Other Recent News
March 2025: Invited speaker at IMSI Workshop, Kernel Methods in Uncertainty Quantification and Experimental Design.
March 2025: "Low Stein discrepancy via Message-Passing Monte Carlo" is accepted for presentation at the Frontiers in Probabilistic Inference: Learning Meets Sampling workshop at the International Conference on Learning Representations (ICLR) 2025.
December 2024: Invited speaker for PyData Chicago, a public lecture series.
October 2024: Awarded several hundred hours on Argonne National Laboratory's Sophia computer through the ALCF Director's Discretionary Allocation.
September 2024: Our article "Message-Passing Monte Carlo: Generating low-discrepancy point sets via graph neural networks" is accepted in Proceedings of the National Academy of Sciences of the United States of America (PNAS).
August 2024: Together with colleagues Christiane Lemieux and Ben Feng, we organize MCQMC 2024, the primary quasi-Monte Carlo methods conference, at the University of Waterloo. Over 150 participants and talks are expected, with 9 plenary lectures and 2 tutorials.
August 2024: Begin new role as Senior Research Associate in the Department of Applied Mathematics at Illinois Institute of Technology in Chicago, USA, mentored by Prof. Fred Hickernell.
June 2024: "Message-Passing Monte Carlo: Generating low-discrepancy point sets via graph neural networks" is accepted for presentation at the AI4Science Workshop at the International Conference on Machine Learning (ICML) 2024.
June 2024: "An improved Halton sequence for implementation in quasi-Monte Carlo methods" accepted for presentation and publication in Proceedings of the 2024 Winter Simulation Conference.
December 2023: "Partitions for stratified sampling" accepted to Monte Carlo Methods and Applications.
September 2023: Begin new role as Postdoctoral Research Fellow at the University of Waterloo, mentored by Prof. Christiane Lemieux.
June 2023: Successfully defend my PhD thesis "Several Problems in Discrepancy Theory" with examiners Prof. Martin Mathieu and Markus Kiderlin at Queen's University Belfast, Northern Ireland.
February 2023: "On the expected L2-discrepancy of jittered sampling" accepted to Uniform Distribution Theory.
September-October 2022: Visiting PhD student at the University of Waterloo with Prof. Christiane Lemieux.