I organise the informal PhD seminar in Probability and Statistics, which takes place twice a month on Mondays. Here is an up-to-date list of the talks and slides.

Upcoming sessions

TBA.

Past sessions

Yanyan Hu - Asymptotic properties for the parameter estimation in SDE with Hölder drift - 25th of March 2024

Using Zvonkin's transformation, we investigate parameter estimation for a class of multidimensional stochastic differential equations with small perturbation parameters in the diffusion coefficients, where the drift coefficients not only contain an unknown parameter but are also merely Hölder continuous. Such processes broaden the applicability of our results to many practical models. Due to the irregular drift, the primary challenge is controlling the mean square error between the exact and the numerical solution. In this setting, we establish the consistency and asymptotic normality in probability of the least squares estimator as the step size and the small parameter tend to 0 simultaneously. Moreover, we extend the results to the case of stochastic functional differential equations.
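As a toy illustration of the least squares approach (not the Hölder-drift setting of the talk: here the drift is linear, and all parameter values are hypothetical), one can simulate dX_t = θ X_t dt + ε dW_t with the Euler-Maruyama scheme and recover θ from the discretized least squares criterion:

```python
import numpy as np

rng = np.random.default_rng(0)
theta_true, eps, T, n = 0.5, 0.01, 1.0, 1000
dt = T / n

# Euler-Maruyama simulation of dX = theta * X dt + eps dW
X = np.empty(n + 1)
X[0] = 1.0
for i in range(n):
    X[i + 1] = X[i] + theta_true * X[i] * dt + eps * np.sqrt(dt) * rng.normal()

# Least squares estimator of the drift parameter theta
incr = np.diff(X)
theta_hat = np.sum(X[:-1] * incr) / (dt * np.sum(X[:-1] ** 2))
print(theta_hat)
```

With a small step size and a small noise parameter, the estimate lands close to the true value, in line with the consistency result described above.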

Marc Constanje - Bridge processes on manifolds - 11th of March 2024

Simulation of bridge processes is a widely used tool in statistics for stochastic processes. Such processes arise when the original process is conditioned to be in a given state at a given time. A common tool for studying bridge processes is Doob's h-transform. However, a problem with the transformation is that it relies on the, typically intractable, transition density of the process. We instead consider the technique of conditioning by guiding, which circumvents this problem by using the same transformation but with a different h-function, whilst maintaining absolute continuity with respect to the true bridge process.

The talk will focus on conditioning manifold-valued semimartingales. We describe semimartingales on manifolds through the so-called "rolling without slipping" (Eells-Elworthy-Malliavin) construction: mapping an R^d-valued semimartingale to the frame bundle of the manifold and then projecting it back to the manifold.

In the talk, I'll briefly discuss the construction of manifold-valued semimartingales and then move on to the simulation of bridge processes through guiding. I'll finish by demonstrating methods for parameter inference for the drift of stochastic processes on a manifold.
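A minimal sketch of the guiding idea in the tractable Euclidean case: for a Brownian bridge, the h-transform drift (v - x)/(T - t) is known in closed form, so the guided proposal coincides with the true bridge (the manifold-valued construction in the talk is considerably more involved; all values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
T, n, x0, v = 1.0, 1000, 0.0, 2.0
dt = T / n

# Euler scheme for the guided process: Brownian motion plus the
# pulling drift (v - x)/(T - t), which forces the path to hit v at time T.
x = x0
for i in range(n):
    t = i * dt
    x += (v - x) / (T - t) * dt + np.sqrt(dt) * rng.normal()
print(x)
```

The pulling drift blows up as t approaches T, which is exactly what pins the endpoint at v; in the discretization above, the last step lands within one noise increment of the target.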


Jasper Rou - Deep Gradient Flow Methods for Option Pricing in Diffusion Models - 26th of February 2024

We develop a novel deep learning approach for pricing European options written on assets that follow (rough) diffusion dynamics. The option pricing problem is formulated as a partial differential equation, which is approximated via a new implicit-explicit gradient flow time-stepping approach, involving approximation by deep, residual-type Artificial Neural Networks (ANNs) at each time step. In particular, we split the PDE operator into a symmetric gradient flow with a known energy functional and an asymmetric part in which we substitute the neural network of the previous time step, so that the latter can be treated explicitly.

We compare our method with the related Deep Galerkin Method (DGM) and with the COS method, which obtains the option price from the conditional characteristic function of the stock price. In the lifted Heston model with twenty volatility processes, the curse of dimensionality makes deriving the characteristic function too slow, while our method remains fast and accurate.
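The implicit-explicit splitting can be illustrated on a toy 1D convection-diffusion equation u_t = u_xx + c u_x, with finite differences standing in for the neural network approximation (a sketch of the time-stepping idea only, not the ANN-based method of the talk; grid sizes and coefficients are arbitrary):

```python
import numpy as np

n, dt, c = 100, 1e-3, 1.0
x = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
u = np.exp(-100 * (x - 0.5) ** 2)  # initial condition, zero near the boundary

# Implicit matrix for the symmetric (diffusion) part of the operator;
# boundary rows stay as identity, freezing the Dirichlet boundary values.
A = np.eye(n)
for i in range(1, n - 1):
    A[i, i - 1] = A[i, i + 1] = -dt / dx**2
    A[i, i] = 1 + 2 * dt / dx**2

for _ in range(100):
    conv = np.zeros(n)
    conv[1:-1] = c * (u[2:] - u[:-2]) / (2 * dx)  # explicit convection part
    u = np.linalg.solve(A, u + dt * conv)         # implicit diffusion solve
print(u.max())
```

Treating the stiff symmetric part implicitly keeps the scheme stable at step sizes where a fully explicit scheme would fail, which is the same rationale behind splitting off the gradient flow in the talk.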

Ardjen Pengel - Gaussian Approximation for High-dimensional MCMC - 12th of February 2024

Markov Chain Monte Carlo (MCMC) methods are generally acknowledged to be the most versatile algorithms for simulating a probability distribution of interest. The fundamental idea behind MCMC is to construct a Markov chain whose stationary distribution is the distribution of interest. The widespread use of MCMC methods in high-dimensional applications has motivated research into the scalability of these algorithms with respect to the dimension of the problem. Nevertheless, numerous problems concerning output analysis in high-dimensional settings have remained unaddressed.

We present novel dimension-dependent Gaussian approximation results for a broad range of MCMC algorithms. Finally, we demonstrate how these results can be used for uncertainty quantification and show that the termination time of numerous MCMC algorithms scales polynomially in dimension while ensuring a desired level of precision.
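A standard example of the kind of CLT-based output analysis such Gaussian approximations justify (a one-dimensional batch-means construction on a toy random-walk Metropolis chain, not the dimension-dependent results of the talk; all tuning values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)

# Random-walk Metropolis chain targeting the standard normal N(0, 1)
n, x = 50_000, 0.0
chain = np.empty(n)
for i in range(n):
    prop = x + rng.normal(scale=2.0)
    if np.log(rng.uniform()) < 0.5 * (x**2 - prop**2):  # log accept ratio
        x = prop
    chain[i] = x

# Batch-means estimate of the asymptotic variance in the Markov chain CLT,
# giving a confidence interval for the Monte Carlo estimate of the mean
b = int(np.sqrt(n))                       # number of batches = batch size
batches = chain[: b * b].reshape(b, b).mean(axis=1)
sigma2_hat = b * batches.var(ddof=1)
half_width = 1.96 * np.sqrt(sigma2_hat / n)
print(chain.mean(), half_width)
```

The half-width shrinks at the usual root-n rate; results like those in the talk make this type of interval, and the rule "stop when the half-width falls below a tolerance", rigorous in high-dimensional settings.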