(with M. Etienne, M. Chahine, T. K. Rusch, D. Rus)
"NeuroLDS" is the first machine learning-based framework for generating low-discrepancy sequences.
'Sequences' here differ from 'point sets': a sequence is extensible in the number of sampling nodes, maintaining a small discrepancy for every prefix length N, whereas a point set is optimized for a single fixed N.
NeuroLDS outperforms several classical LDS constructions by a significant margin, and we also demonstrate its effectiveness across applications in scientific machine learning, robot motion planning, and numerical integration.
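The sequence-versus-point-set distinction can be made concrete in one dimension, where the star discrepancy has a closed form. The sketch below (an illustrative toy, not part of NeuroLDS) uses the classical van der Corput sequence to show that every prefix of a sequence keeps a small discrepancy:

```python
def van_der_corput(n, base=2):
    # radical inverse of the integer n in the given base
    x, denom = 0.0, 1.0
    while n > 0:
        denom *= base
        n, digit = divmod(n, base)
        x += digit / denom
    return x

def star_discrepancy_1d(points):
    # exact 1D star discrepancy: D*_N = 1/(2N) + max_i |x_(i) - (2i-1)/(2N)|
    xs = sorted(points)
    N = len(xs)
    return 1.0 / (2 * N) + max(
        abs(x - (2 * i - 1) / (2 * N)) for i, x in enumerate(xs, start=1)
    )

# every prefix of the sequence stays well distributed as N grows
seq = [van_der_corput(n) for n in range(1, 129)]
for N in (8, 32, 128):
    print(N, star_discrepancy_1d(seq[:N]))
```

A fixed-N point set (e.g. the grid {(2i-1)/(2N)}) achieves the optimal discrepancy 1/(2N) for that one N, but truncating or extending it destroys this property; the sequence trades a logarithmic factor for extensibility.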
Link to Model Code (Coming Soon)
Schematic of the NeuroLDS model.
Each index is mapped to a sinusoidal feature vector, and the encoded features are then passed through an L-layer multilayer perceptron (MLP), which outputs the sampling nodes.
The collection of generated points forms a (learned) sequence with very small discrepancy.
(LEFT) Input, random training data.
(RIGHT) Output, the generated (learned) low-discrepancy point set.
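The index-to-point pipeline in the schematic can be sketched as a forward pass in NumPy. Everything below (the frequency schedule, layer widths, and activations) is an illustrative assumption, not the trained NeuroLDS model:

```python
import numpy as np

rng = np.random.default_rng(0)

def sinusoidal_features(t, num_freqs=8):
    # encode a normalised index t in [0, 1] with sin/cos features at
    # dyadic frequencies; the schedule here is an illustrative choice
    freqs = np.pi * 2.0 ** np.arange(num_freqs)
    return np.concatenate([np.sin(freqs * t), np.cos(freqs * t)])

def mlp_forward(h, layers):
    # L-layer MLP: ReLU hidden layers, sigmoid output so nodes land in (0,1)^d
    *hidden, (W_out, b_out) = layers
    for W, b in hidden:
        h = np.maximum(W @ h + b, 0.0)
    return 1.0 / (1.0 + np.exp(-(W_out @ h + b_out)))

dim, width, num_points = 2, 32, 100
layers = [
    (rng.normal(size=(width, 16), scale=0.5), np.zeros(width)),
    (rng.normal(size=(width, width), scale=0.5), np.zeros(width)),
    (rng.normal(size=(dim, width), scale=0.5), np.zeros(dim)),
]
points = np.stack([
    mlp_forward(sinusoidal_features(n / num_points), layers)
    for n in range(num_points)
])
print(points.shape)  # with untrained weights these points are not yet low-discrepancy
```

Training would then optimise the MLP weights against a discrepancy-based loss so that every prefix of the output sequence is well distributed.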
(with T. K. Rusch, M. Bronstein, C. Lemieux, D. Rus)
This article introduces "Message-Passing Monte Carlo (MPMC)", the first machine learning approach for generating low-discrepancy point sets, which are essential for filling space efficiently and uniformly and thus play a central role in many problems in science and engineering. To accomplish this, MPMC utilizes tools from Geometric Deep Learning, specifically Graph Neural Networks.
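To make the Graph Neural Network idea concrete, here is a minimal message-passing layer over a k-nearest-neighbour graph of a point cloud. This is a generic GNN sketch under assumed weights and architecture, not the actual MPMC network or its training loss:

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_graph(x, k):
    # directed edges from each point to its k nearest neighbours
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.argsort(d, axis=1)[:, :k]          # (N, k) neighbour indices

def message_passing_layer(h, nbrs, W_msg, W_upd):
    # compute a message from every neighbour, mean-aggregate, then update
    # each node from its own features plus the aggregated messages
    msgs = np.maximum(h[nbrs] @ W_msg, 0.0)      # (N, k, hidden)
    agg = msgs.mean(axis=1)                      # (N, hidden)
    return np.concatenate([h, agg], axis=1) @ W_upd

N, k, hidden = 100, 8, 16
x = rng.random((N, 2))                           # random initial points
nbrs = knn_graph(x, k)
W_msg = rng.normal(size=(2, hidden), scale=0.3)
W_upd = rng.normal(size=(2 + hidden, 2), scale=0.3)
x_new = 1.0 / (1.0 + np.exp(-message_passing_layer(x, nbrs, W_msg, W_upd)))
print(x_new.shape)  # updated point locations, squashed back into (0,1)^2
```

Training such a network against a discrepancy objective is what moves an initially random cloud (as in the video above) toward a uniform, space-filling configuration.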
A video depiction of the training procedure of an instance of Message-Passing Monte Carlo for 100 points in two dimensions.
(Animation credit: T. K. Rusch)
A PyData Chicago talk on 'Merging Monte Carlo with Machine Learning' where I discuss the Message-Passing Monte Carlo method.
(with J. Chen and H. Jiang)
As the dimension of a problem grows, the efficacy of QMC degrades and comes to depend heavily on the existence of low-dimensional structure in the problem.
In this work, we study the construction of high-dimensional QMC point sets via combinatorial discrepancy. We establish error bounds for integral approximation of these constructions in weighted function spaces and present some numerical results to show the effectiveness of this approach.
An illustration of the SubgTrans algorithm as presented in this paper.
A population set of N^2 points is partitioned into two equal sets of N^2/2 points each. This process is repeated until we have N sets of N points, each of which is, in some sense, QMC or space-filling.
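The recursive halving above can be sketched as follows. In the actual algorithm each split is chosen by a combinatorial-discrepancy minimisation; in this toy version a random balanced colouring stands in for that subroutine, so only the control flow is faithful:

```python
import numpy as np

rng = np.random.default_rng(0)

def balanced_halving(points):
    # split a point set into two equal halves; a true combinatorial-discrepancy
    # halving would pick the +/- colouring minimising discrepancy -- here a
    # random balanced colouring is used purely as a placeholder
    idx = rng.permutation(len(points))
    half = len(points) // 2
    return points[idx[:half]], points[idx[half:]]

def subdivide(points, target_size):
    # halve every part repeatedly until each part has target_size points
    parts = [points]
    while len(parts[0]) > target_size:
        parts = [h for p in parts for h in balanced_halving(p)]
    return parts

N = 16
population = rng.random((N * N, 2))   # N^2 points in the unit square
parts = subdivide(population, N)
print(len(parts), parts[0].shape)     # N sets of N points each
```

With a genuine low-discrepancy colouring at each split, the surviving subsets inherit near-optimal uniformity from the population, which is what yields the QMC error bounds of the paper.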
Two-dimensional digital nets in base golden ratio, with 21 and 34 points respectively.
(with C. Lemieux and J. Wiart)
We introduce a new QMC construction generalising classical digital nets and sequences to irrational-base, number-theoretic constructions.
We provide a complete equidistribution framework, focusing on the golden ratio base but developed more generally for specific classes of irrational numbers. We also include numerical studies of the discrepancy, which illustrate an improvement in distribution properties over classical integer-based constructions.
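To give a flavour of irrational-base constructions, here is a toy one-dimensional radical inverse in base golden ratio built from Zeckendorf (Fibonacci) digits. This is an illustrative sketch of the general idea, not necessarily the construction of the paper; note the Fibonacci point counts (21, 34) in the figure above:

```python
def zeckendorf_digits(n):
    # greedy Zeckendorf representation: n as a sum of non-adjacent
    # Fibonacci numbers 1, 2, 3, 5, 8, ... (most significant digit first)
    fibs = [1, 2]
    while fibs[-1] <= n:
        fibs.append(fibs[-1] + fibs[-2])
    digits = []
    for f in reversed(fibs[:-1]):
        if f <= n:
            digits.append(1)
            n -= f
        else:
            digits.append(0)
    return digits

def golden_radical_inverse(n):
    # reflect the Zeckendorf digits about the radix point in base phi;
    # non-adjacency of the digits keeps every value inside [0, 1)
    phi = (1 + 5 ** 0.5) / 2
    digits = zeckendorf_digits(n)
    return sum(d * phi ** -(i + 1) for i, d in enumerate(reversed(digits)))

# 34 points, a Fibonacci number, as in the figure above
points = [golden_radical_inverse(n) for n in range(1, 35)]
```

As with the integer-base van der Corput sequence, the digit reflection spreads consecutive indices far apart in [0, 1), and the Fibonacci structure is what makes Fibonacci-sized point sets particularly well distributed.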