Software

R packages


anMC: Compute High Dimensional Orthant Probabilities.

A computationally efficient method to estimate orthant probabilities of high-dimensional Gaussian vectors. Also provides a function to compute conservative estimates of excursion sets under Gaussian random field priors. Available on CRAN. Implements the methods described in Azzimonti and Ginsbourger (2018).
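The quantity anMC estimates can be illustrated with a plain Monte Carlo sketch (this is not the package's algorithm, which uses a much more efficient asymmetric nested Monte Carlo scheme; all names below are illustrative):

```python
import numpy as np

def orthant_prob_mc(cov, n_samples=200_000, seed=0):
    """Plain Monte Carlo estimate of P(X_i > 0 for all i), X ~ N(0, cov).

    Illustrates the quantity anMC estimates; the package itself uses a
    far more efficient asymmetric nested Monte Carlo (ANMC) scheme."""
    rng = np.random.default_rng(seed)
    L = np.linalg.cholesky(cov)
    z = rng.standard_normal((n_samples, cov.shape[0]))
    x = z @ L.T                      # samples from N(0, cov)
    return float(np.mean(np.all(x > 0, axis=1)))

# 2D sanity check: P(X1 > 0, X2 > 0) = 1/4 + arcsin(rho) / (2*pi)
rho = 0.5
cov = np.array([[1.0, rho], [rho, 1.0]])
est = orthant_prob_mc(cov)
exact = 0.25 + np.arcsin(rho) / (2 * np.pi)
```

Plain Monte Carlo degrades quickly as the dimension grows and the probability becomes small, which is exactly the regime the package's estimator is designed for.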


pGPx: Pseudo-Realizations for Gaussian Process Excursions.

Computes pseudo-realizations from the posterior distribution of a Gaussian Process (GP). The realizations are obtained from simulations of the field at a few well-chosen points that minimize the expected distance in measure between the true excursion set of the field and the approximate one. Also provides an R interface for (the main function of) Distance Transform of Sampled Functions (<http://cs.brown.edu/people/pfelzens/dt/index.html>). Available on CRAN. Implements the methods described in Azzimonti et al. (2016) <doi:10.1137/141000749>.
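The distance transform of a sampled function f computes D(p) = min over q of [f(q) + (p - q)^2]. Felzenszwalb and Huttenlocher's algorithm (the code pGPx interfaces) does this in linear time; the brute-force O(n^2) sketch below computes the same quantity and is for illustration only:

```python
import numpy as np

def dt_bruteforce(f):
    """Brute-force O(n^2) distance transform of a sampled function f:
    D(p) = min_q [ f(q) + (p - q)^2 ].

    Felzenszwalb & Huttenlocher's algorithm computes the same quantity
    in O(n); this loop is only meant to show what is being computed."""
    n = len(f)
    q = np.arange(n)
    return np.array([np.min(f + (p - q) ** 2) for p in range(n)])

# Binary special case: f = 0 on a set, "infinity" elsewhere, yields the
# squared Euclidean distance to the nearest point of the set.
INF = 1e20
f = np.full(7, INF)
f[3] = 0.0
d = dt_bruteforce(f)   # squared distances to index 3
```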


profExtrema: Compute and Visualize Profile Extrema Functions.

Computes profile extrema functions for arbitrary functions. If the function is expensive to evaluate, profile extrema are computed by emulating the function with a Gaussian process (using the package 'DiceKriging'); in this case, uncertainty quantification on the profile extrema is also available. The plotting functions for profile extrema give the user a tool to better locate excursion sets. Available on CRAN. Implements the methods described in Azzimonti et al. (2017+).
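As a rough illustration of what a profile extremum is (not of the package's GP-based method), the sup/inf profiles of a cheap two-dimensional function can be computed by exhaustive grid evaluation; the function names below are illustrative:

```python
import numpy as np

def profile_extrema_grid(f, grid1, grid2):
    """Sup/inf profiles along the first coordinate, on a grid:
    P_sup(x1) = max over x2 of f(x1, x2),  P_inf(x1) = min over x2.

    profExtrema computes these for expensive functions via a GP
    emulator; here f is cheap, so we just evaluate it exhaustively."""
    vals = np.array([[f(x1, x2) for x2 in grid2] for x1 in grid1])
    return vals.max(axis=1), vals.min(axis=1)

# Toy function f(x1, x2) = x1 + sin(pi * x2) on [0, 1]^2.
g = np.linspace(0.0, 1.0, 101)
p_sup, p_inf = profile_extrema_grid(lambda a, b: a + np.sin(np.pi * b), g, g)
# If p_sup(x1) < t, the excursion set {f >= t} contains no point with
# that x1 coordinate: profiles localize excursion sets coordinate-wise.
```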


Coauthored packages


KrigInv: Kriging-Based Inversion for Deterministic and Noisy Computer Experiments.

Criteria and algorithms for sequentially estimating level sets of a multivariate numerical function, possibly observed with noise. Available on CRAN. Implements the methods described in Azzimonti et al. (2018+), Chevalier et al. (2014a), Picheny et al. (2010), Bichon et al. (2008) and Ranjan et al. (2008). See Chevalier et al. (2014b) for an introductory paper.
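The flavor of such sequential designs can be sketched with a small NumPy GP and a straddle-like heuristic (this is a toy acquisition rule, not KrigInv's SUR criteria; all names and settings below are illustrative): sample next where the posterior is both uncertain and close to the threshold.

```python
import numpy as np

def rbf(a, b, ls=0.2):
    """Squared-exponential kernel for 1-D inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def gp_posterior(x_tr, y_tr, x_te, jitter=1e-6):
    """GP posterior mean and standard deviation at test points."""
    K = rbf(x_tr, x_tr) + jitter * np.eye(len(x_tr))
    Ks = rbf(x_te, x_tr)
    mean = Ks @ np.linalg.solve(K, y_tr)
    v = np.linalg.solve(K, Ks.T)
    var = 1.0 - np.sum(Ks * v.T, axis=1)
    return mean, np.sqrt(np.maximum(var, 1e-12))

f = lambda x: np.sin(3 * x)            # toy simulator
t = 0.5                                # target level set {x : f(x) = t}
grid = np.linspace(0.0, 2.0, 201)
x_tr = np.array([0.1, 1.0, 1.9])
y_tr = f(x_tr)
for _ in range(5):
    m, s = gp_posterior(x_tr, y_tr, grid)
    # Straddle-like score: large where the GP is uncertain AND its mean
    # is close to the threshold; evaluate f at the maximizer next.
    x_new = grid[np.argmax(1.96 * s - np.abs(m - t))]
    x_tr = np.append(x_tr, x_new)
    y_tr = np.append(y_tr, f(x_new))
```

Each iteration spends one (expensive) evaluation where it most reduces ambiguity about the level set, rather than space-filling.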

Python code


SIFGP: Sparse Information Filter for Fast Gaussian Process Regression. (contributor)

A TensorFlow 2.0 implementation of a fast training method for sparse Gaussian Process regression. We focus on training with mini-batches, which allows scaling the method to millions of data points. By exploiting an information filter formulation and an independence assumption over the mini-batches, we propose a stochastic gradient descent method that approximates sparse variational GPs. The method is around 4 times faster than SVGP on the same hardware for comparable accuracy. More details are described in Kania et al. (2021).


SkewGP: SkewGPs for classification, preference learning, mixed likelihood problems. (contributor)

Gaussian Processes (GPs) are powerful nonparametric distributions over functions. For real-valued outputs, we can combine the GP prior with a Gaussian likelihood and perform exact posterior inference in closed form. However, in other cases, such as classification, preference learning, ordinal regression and mixed problems, the likelihood is no longer conjugate to the GP prior, and exact inference is known to be intractable.

In the papers Benavoli et al. (2020), Benavoli et al. (2021) and Benavoli et al. (2021b), we derive closed-form expressions for the posterior process in all the above cases (not only for regression). The posterior process is a Skew Gaussian Process (SkewGP). SkewGPs are more general and more flexible nonparametric distributions than GPs: they can also represent asymmetric distributions, and they include GPs as a particular case.


SRGP: Stochastic Recursive Gaussian Process Regression. (contributor)

This code implements a fast training method for sparse Gaussian Process regression. We focus on training with mini-batches, which allows scaling the training method to millions of data points. By exploiting a Kalman filter formulation, we propose a novel approach that estimates the model parameters by recursively propagating the analytical gradients of the posterior over mini-batches of the data. Implements the method described in Schürch et al. (2020).
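The recursive idea can be illustrated on the conjugate linear-Gaussian case, where Kalman-style mini-batch updates recover the batch posterior exactly (this is an analogy, not the SRGP algorithm itself, which propagates gradients for sparse GP parameters; all names below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, noise = 200, 3, 0.1
X = rng.standard_normal((n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + noise * rng.standard_normal(n)

# Recursive (Kalman/information-filter style) update of the Gaussian
# posterior over the weights, one mini-batch at a time.
m = np.zeros(d)        # posterior mean, prior N(0, I)
P = np.eye(d)          # posterior covariance
for i in range(0, n, 50):
    Xb, yb = X[i:i + 50], y[i:i + 50]
    S = Xb @ P @ Xb.T + noise**2 * np.eye(len(yb))   # innovation covariance
    Kg = P @ Xb.T @ np.linalg.inv(S)                 # Kalman gain
    m = m + Kg @ (yb - Xb @ m)
    P = P - Kg @ Xb @ P

# Batch posterior mean: identical in this conjugate setting.
A = X.T @ X / noise**2 + np.eye(d)
m_batch = np.linalg.solve(A, X.T @ y / noise**2)
```

In the sparse GP setting the model is no longer conjugate in its hyperparameters, which is why the paper propagates analytical gradients of the posterior through such recursions instead of a closed-form update.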