On this page, I illustrate my recent research on Monte Carlo methods.
Joint work with Neil K. Chada, Benedict Leimkuhler, and Peter A. Whalley from Heriot-Watt University and the University of Edinburgh.
We present the unbiased UBU (UBUBU) method for estimating Bayesian posterior means based on kinetic Langevin dynamics, which combines advanced splitting methods with enhanced gradient approximations. Our approach avoids Metropolis correction by coupling Markov chains at different discretization levels in a multilevel Monte Carlo approach. Theoretical analysis demonstrates that our proposed estimator is unbiased, attains finite variance, and satisfies a central limit theorem. It can achieve accuracy ε > 0 for estimating expectations of Lipschitz functions in d dimensions with O(d^{1/4} ε^{−2}) expected gradient evaluations, without assuming a warm start. We establish similar bounds using both approximate and stochastic gradients, and our method’s computational cost is shown to scale independently of the size of the dataset. The proposed method is tested on a multinomial regression problem for the MNIST dataset and a Poisson regression model for soccer scores. Experiments indicate that the number of gradient evaluations per effective sample is independent of dimension, even when using inexact gradients. For product distributions, we give dimension-independent variance bounds. Our results demonstrate that the unbiased algorithm we present can be much more efficient than the “gold-standard” randomized Hamiltonian Monte Carlo.
Elimination of bias by increasing burn-in lengths at higher discretization levels.
Dimensional dependence of gradient evaluations / effective sample size over all components for Gaussian targets.
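The debiasing idea behind such multilevel estimators can be illustrated with a minimal sketch in the single-term (Rhee–Glynn) style: draw a random discretization level L with probabilities p_l, and return the level difference divided by p_l, so the expectation telescopes to the exact limit. The sketch below is a toy, using a deterministic Euler approximation of e in place of coupled Langevin chains; the function names and the geometric level distribution are illustrative assumptions, not the UBUBU algorithm itself.

```python
import math
import random

def Y(l):
    # Level-l approximation of e via Euler steps of y' = y:
    # (1 + 1/n)^n with n = 2^l, so the bias decays as O(2^-l).
    n = 2 ** l
    return math.exp(n * math.log1p(1.0 / n))

def single_term_estimate(r=0.5):
    # Single-term debiased estimator of lim_l Y(l).
    # Draw level L with geometric probabilities p_l = (1 - r) * r^l,
    # then return (Y(L) - Y(L-1)) / p_L, with Y(-1) := 0.
    u, l, p = random.random(), 0, 1.0 - r
    cum = p
    while u > cum:
        l += 1
        p *= r
        cum += p
    diff = Y(l) - (Y(l - 1) if l > 0 else 0.0)
    return diff / p
```

Averaging many independent draws of `single_term_estimate()` converges to e with no discretization bias, because the expectation is the telescoping sum of the level differences; the geometric decay of both the differences and the level probabilities keeps the variance finite.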
Joint work with George Deligiannidis, Alexandre Bouchard-Côté, and Arnaud Doucet from Oxford and UBC.
In this paper, we analyse the high-dimensional behaviour of a recently discovered non-reversible Markov chain Monte Carlo method called the Bouncy Particle Sampler (BPS). This process moves along straight lines, with bounces occurring at a position-dependent rate that guarantees convergence to the target distribution. We found that although BPS moves along straight lines, in high dimensions its behaviour tends to that of another process called Randomized Hamiltonian Monte Carlo (RHMC), which follows Hamiltonian equations (hence its paths are curved). We analyse the mixing properties of this limiting process and show that it has a dimension-free convergence rate for strongly log-concave target distributions. This understanding allows us to propose appropriate tuning parameters for the Bouncy Particle Sampler that improve its efficiency.
The figures below show the convergence of BPS towards RHMC by comparing the paths of BPS with the contours of the Hamiltonian energy (which is preserved by the deterministic dynamics of RHMC and changes only when jumps in velocity occur).
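The BPS dynamics described above can be sketched in a few lines for a standard Gaussian target, where the bounce rate max(0, ⟨x, v⟩) (since ∇U(x) = x) can be integrated and inverted in closed form, avoiding thinning. This is a minimal illustrative sketch, not a general-purpose implementation; the function name `bps_gaussian` and the default refreshment rate are assumptions for the example.

```python
import math
import random

def bps_gaussian(x, v, T, lam_ref=1.0):
    # Bouncy Particle Sampler for a standard Gaussian target N(0, I).
    # Bounce rate along x(s) = x + s*v is max(0, a + s*b) with
    # a = <x, v>, b = |v|^2, so event times can be sampled exactly.
    # Returns the piecewise-linear path as a list of (time, position).
    t, path = 0.0, [(0.0, list(x))]
    while t < T:
        a = sum(xi * vi for xi, vi in zip(x, v))   # <x, v>
        b = sum(vi * vi for vi in v)               # |v|^2
        E = -math.log(random.random())             # Exp(1) variate
        # Invert the integrated bounce rate to get the bounce time.
        if a >= 0:
            tb = (-a + math.sqrt(a * a + 2.0 * b * E)) / b
        else:
            tb = -a / b + math.sqrt(2.0 * E / b)
        tr = -math.log(random.random()) / lam_ref  # refreshment time
        tau = min(tb, tr, T - t)
        x = [xi + tau * vi for xi, vi in zip(x, v)]  # straight-line move
        t += tau
        if tau == tr:
            # Refresh: draw a fresh Gaussian velocity.
            v = [random.gauss(0.0, 1.0) for _ in v]
        elif tau == tb:
            # Bounce: reflect v off the hyperplane orthogonal to
            # grad U(x) = x, preserving the speed |v|.
            g = x
            coef = 2.0 * sum(vi * gi for vi, gi in zip(v, g)) \
                   / sum(gi * gi for gi in g)
            v = [vi - coef * gi for vi, gi in zip(v, g)]
        path.append((t, list(x)))
    return path
```

Time averages along the returned piecewise-linear segments (not just the event-time points) estimate expectations under the target; between events the trajectory is exactly the straight-line motion whose departure from the curved RHMC contours the figures visualize.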