Multilevel and multifidelity methods are related families of techniques that exploit sequences of approximate simulations to optimally trade off computation time, bias, and variance in a Monte Carlo estimator of an otherwise challenging expectation. By applying linearity of expectation together with coupling schemes between simulations, an expensive expectation can be expressed as a telescoping sum that is estimated very efficiently. These methods have been shown to improve computation time by orders of magnitude and to improve the convergence rate of the estimator in mean-square error. While these methods show promise for accelerating Bayesian inference, practical challenges remain that my research seeks to resolve, such as the design of generic coupling schemes and the tuning of algorithm parameters.
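The telescoping-sum idea above can be sketched in a minimal two-part example. This is an illustrative toy, not the method of any paper below: the Ornstein-Uhlenbeck simulator, the function names `coupled_pair` and `mlmc_estimate`, and the sample-size schedule are all hypothetical choices. The coupling here is the standard one for SDEs, where the coarse path reuses the fine path's Brownian increments so that the level differences have small variance.

```python
import numpy as np

rng = np.random.default_rng(1)

def coupled_pair(level, theta=1.0, x0=1.0, T=1.0):
    """One coupled (fine, coarse) sample of X_T for the toy
    Ornstein-Uhlenbeck SDE dX = -theta*X dt + dW, discretised with
    2**level fine Euler steps and 2**(level-1) coarse steps.
    Coupling: the coarse path reuses the fine Brownian increments."""
    n_f = 2 ** level
    dt_f = T / n_f
    dW = rng.normal(0.0, np.sqrt(dt_f), n_f)
    xf = x0
    for dw in dW:
        xf += -theta * xf * dt_f + dw
    if level == 0:
        return xf, 0.0  # coarsest level has no coarser partner
    dt_c = 2.0 * dt_f
    xc = x0
    for dw in dW.reshape(-1, 2).sum(axis=1):  # pair up fine increments
        xc += -theta * xc * dt_c + dw
    return xf, xc

def mlmc_estimate(L, n_per_level):
    """Telescoping sum: E[f_L] = E[f_0] + sum_{l=1}^{L} E[f_l - f_{l-1}],
    with each correction term estimated from coupled sample pairs."""
    est = 0.0
    for level, n in enumerate(n_per_level[: L + 1]):
        pairs = [coupled_pair(level) for _ in range(n)]
        est += np.mean([fine - coarse for fine, coarse in pairs])
    return est

# Many samples at cheap coarse levels, few at expensive fine levels;
# the estimate approximates E[X_T] = x0 * exp(-theta * T).
est = mlmc_estimate(L=4, n_per_level=[4000, 2000, 1000, 500, 250])
```

Because the coupled differences shrink rapidly with level, most of the computational budget is spent on the cheap coarse levels, which is the source of the order-of-magnitude savings.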
TP Prescott, DJ Warne, RE Baker. (2024) Efficient multifidelity likelihood-free Bayesian inference with adaptive computational resource allocation. Journal of Computational Physics, 112577 DOI arXiv.org
DJ Warne, TP Prescott, RE Baker, MJ Simpson. (2022) Multifidelity multilevel Monte Carlo to accelerate approximate Bayesian parameter inference for partially observed stochastic processes. Journal of Computational Physics, 469:111543 DOI arXiv.org
DJ Warne, RE Baker, MJ Simpson. (2019) Simulation and inference algorithms for stochastic biochemical reaction networks: from basic concepts to state-of-the-art. Journal of the Royal Society Interface, 16:20180943 DOI arXiv.org
DJ Warne, RE Baker, MJ Simpson. (2018) Multilevel rejection sampling for approximate Bayesian computation. Computational Statistics & Data Analysis, 124:71–86 DOI arXiv.org
DJ Warne, RE Baker, MJ Simpson. (2016) Accelerating computational Bayesian inference for stochastic biochemical reaction network models using multilevel Monte Carlo sampling. bioRxiv.org
When a complex computer simulator is too computationally expensive for simulation-based statistical inference, it is common practice to use a surrogate model. While the surrogate may provide a good approximation of the expensive model's output, any parameter estimates obtained through inference with the surrogate are potentially biased. We have considered various schemes to quantify this bias and to learn transformations that map approximate posterior samples so that they are representative of the exact posterior distribution.
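As a minimal sketch of the transformation idea, the snippet below corrects a surrogate posterior by moment matching against a small reference sample. This is deliberately simpler than the calibration methods in the papers below: the affine map, the function name `moment_matching_map`, and the Gaussian toy samples are all illustrative assumptions, not the published algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

def moment_matching_map(approx, reference):
    """Illustrative bias correction: learn an affine map
    y = a * (x - mean(approx)) + mean(reference) whose output matches
    the mean and standard deviation of a small reference sample
    assumed to come from the exact posterior."""
    a = reference.std() / approx.std()
    return lambda x: a * (x - approx.mean()) + reference.mean()

# Toy 1-D example: the surrogate posterior is shifted and too narrow.
approx = rng.normal(1.5, 0.5, 10_000)   # cheap surrogate posterior samples
reference = rng.normal(2.0, 0.8, 200)   # few expensive exact-model samples
corrected = moment_matching_map(approx, reference)(approx)
```

The appeal of this pattern is that only a handful of expensive exact-model runs are needed to adjust a large, cheaply generated approximate sample; the published methods replace the crude moment match with richer learned transformations.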
X Wang, RP Kelly, AL Jenner, DJ Warne, C Drovandi. (2024) A comprehensive guide to simulation-based inference in computational biology. arXiv.org
X Wang, RP Kelly, DJ Warne, C Drovandi. (2024) Preconditioned neural posterior estimation for likelihood-free inference. Transactions on Machine Learning Research (TMLR) 09:2758 URL arXiv.org
DJ Warne, OJ Maclaren, EJ Carr, MJ Simpson, C Drovandi. (2024) Generalised likelihood profiles for models with intractable likelihoods. Statistics and Computing 34:50 DOI arXiv.org
RP Kelly, DJ Nott, DT Frazier, DJ Warne, C Drovandi. (2024) Misspecification-robust sequential neural likelihood. Transactions on Machine Learning Research (TMLR) 06:2347 URL arXiv.org
JJ Bon, DJ Warne, DJ Nott, C Drovandi. (2022) Bayesian score calibration for approximate models. arXiv.org
DJ Warne, RE Baker, MJ Simpson. (2021) Rapid Bayesian inference for expensive stochastic models. Journal of Computational and Graphical Statistics, 31:512-528 DOI arXiv.org
In addition to advanced algorithms, computational efficiency can be improved substantially through fine-grained parallelism. This parallelism can be accessed through CPU single-instruction, multiple-data (SIMD) operations and through co-processors such as general-purpose graphics processing units (GPGPUs).
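A small sketch of the vectorisation pattern, with illustrative names and a toy Gaussian log-likelihood (not an example from the tutorial paper below): expressing a parameter sweep as whole-array operations lets the runtime dispatch SIMD instructions, and the same expression ports to GPU array libraries.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: evaluate a Gaussian log-likelihood at many parameter values.
data = rng.normal(2.0, 1.0, 1_000)
thetas = np.linspace(0.0, 4.0, 2_001)

def loglik_loop(thetas, data):
    """Scalar loop: one likelihood evaluation per iteration."""
    out = np.empty(len(thetas))
    for i, t in enumerate(thetas):
        out[i] = -0.5 * np.sum((data - t) ** 2)
    return out

def loglik_vec(thetas, data):
    """Vectorised: broadcasting evaluates all (theta, datum) pairs in
    bulk array operations that map onto SIMD units."""
    return -0.5 * np.sum((data[None, :] - thetas[:, None]) ** 2, axis=1)

# Both forms agree; the vectorised form is typically much faster.
assert np.allclose(loglik_loop(thetas, data), loglik_vec(thetas, data))
```

The same broadcasting expression runs unchanged on GPUs via drop-in array libraries such as CuPy or JAX, which is one practical route to the GPGPU acceleration mentioned above.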
DJ Warne, SA Sisson, C Drovandi. (2021) Vector operations for accelerating Bayesian computation -- A tutorial guide. Bayesian Analysis, 17(2):593-622 DOI arXiv.org
AS Hurn, KA Lindsay, DJ Warne. (2016) A heterogeneous computing approach to maximum likelihood parameter estimation for the Heston model of stochastic volatility. The ANZIAM Journal, 57:C364-381 DOI