My research concerns the statistical analysis of random objects in the complex geometries arising in neuroimaging. The main focus has been on machine learning methods that provide uncertainty estimates for geometric data. The geometry of these uncertainty estimates is itself also of interest.
I often find inspiration in optimal transport, Bayesian statistics, and Riemannian geometry. These three subjects intersect in my work through the optimal transport framework, which provides a geometry for probability distributions. Within this geometric framework, manifold-valued statistics can be applied to study populations of uncertain objects, and the same geometric tools can be used to advance the learning of probability distributions.
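As a concrete instance of this geometric viewpoint, the 2-Wasserstein distance between one-dimensional Gaussians has a closed form, so the optimal transport geometry on this family can be computed directly. The sketch below is illustrative only (the function name is not from any published code of mine):

```python
import numpy as np

def w2_gaussian_1d(m1, s1, m2, s2):
    # Closed-form 2-Wasserstein distance between N(m1, s1^2) and N(m2, s2^2):
    # W2^2 = (m1 - m2)^2 + (s1 - s2)^2.
    return np.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

# Under this metric, the space of 1-D Gaussians is flat: distance 5 here,
# since (0-3)^2 + (1-5)^2 = 25.
d = w2_gaussian_1d(0.0, 1.0, 3.0, 5.0)
```

This closed form is what makes Gaussian families a convenient testbed for manifold-valued statistics on distributions.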
A large body of my work has considered so-called wrapped Gaussian processes (WGPs) on Riemannian manifolds. The aim of this work was to generalize Gaussian process (GP) regression to manifold-valued response variables, so that a priori known geometric constraints can be incorporated into the statistical analysis. Taking these constraints into account has several advantages: resources are not wasted on learning the constraints (as in manifold learning), predictions of new data points respect the underlying geometry, and the resulting family of probabilistic models assigns no probability mass to impossible data points.
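The wrapping construction at the heart of WGPs can be sketched in a minimal form: draw Euclidean Gaussian samples in the tangent plane at a base point and push them onto the manifold with the exponential map. The example below assumes the unit sphere S^2 as the manifold; function names are illustrative, not from my published code:

```python
import numpy as np

def sphere_exp(p, v):
    # Exponential map on the unit sphere: follow the geodesic from p
    # in the tangent direction v for arc length |v|.
    norm_v = np.linalg.norm(v)
    if norm_v < 1e-12:
        return p.copy()
    return np.cos(norm_v) * p + np.sin(norm_v) * (v / norm_v)

def sample_wrapped_gaussian(p, cov, n, rng):
    # Sample a Euclidean Gaussian in the tangent plane T_p S^2,
    # then wrap the samples onto the sphere via the exponential map.
    a = np.array([1.0, 0.0, 0.0])
    if abs(p @ a) > 0.9:
        a = np.array([0.0, 1.0, 0.0])
    e1 = a - (a @ p) * p          # Gram-Schmidt: basis vector orthogonal to p
    e1 /= np.linalg.norm(e1)
    e2 = np.cross(p, e1)          # second tangent basis vector
    coords = rng.multivariate_normal(np.zeros(2), cov, size=n)
    return np.array([sphere_exp(p, c[0] * e1 + c[1] * e2) for c in coords])

rng = np.random.default_rng(0)
p = np.array([0.0, 0.0, 1.0])      # base point: the north pole
samples = sample_wrapped_gaussian(p, 0.1 * np.eye(2), 100, rng)
# Every wrapped sample lies exactly on the sphere, so no probability
# mass is assigned to impossible (off-manifold) points.
```

A WGP applies this same wrapping pointwise to a Euclidean GP, which is how the geometric constraint enters the probabilistic model.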
Research interests. Gaussian processes, optimal transport, manifold-valued statistics, uncertainty quantification, probabilistic numerics.