Paolo Pigato (University of Rome Tor Vergata)
Title: Multivariate Rough Volatility (Slides)
Abstract: We review some empirical facts of financial markets that have motivated the rough volatility paradigm for modelling financial volatility, both from the point of view of financial time series and from that of option pricing.
Motivated by empirical evidence from the joint behavior of realized volatility time series, we propose to model the joint dynamics of log-volatilities using a multivariate fractional Ornstein-Uhlenbeck process. This model is a multivariate version of the Rough Fractional Stochastic Volatility model proposed in Gatheral, Jaisson, and Rosenbaum, Quant. Finance, 2018. It allows for different Hurst exponents in the marginal components and for non-trivial interdependencies. We discuss the main features of the model, propose parameter estimators, derive their asymptotic theory and perform a simulation study that confirms the asymptotic theory in finite samples. We carry out an extensive empirical investigation of realized volatility time series, showing that these time series are strongly correlated and can exhibit asymmetries in their empirical cross-covariance function, which our model captures accurately. These asymmetries lead to spillover effects, which we derive analytically within our model and compute based on empirical estimates of the model parameters. Moreover, in accordance with the existing literature, we observe behavior close to non-stationarity and rough trajectories.
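As a rough illustration of the building block behind the model (a univariate sketch, not the authors' multivariate process or their estimators), a fractional Ornstein-Uhlenbeck path can be simulated by generating exact fractional Gaussian noise via a Cholesky factorisation of its Toeplitz covariance and feeding it into an Euler scheme. All parameter values below (`H`, `kappa`, `nu`, `m`) are illustrative placeholders, not empirical estimates.

```python
import numpy as np

def fgn_increments(n, H, dt, rng):
    """Exact fractional Gaussian noise via Cholesky of its Toeplitz covariance."""
    k = np.arange(n)
    gamma = 0.5 * (np.abs(k + 1)**(2 * H) - 2 * np.abs(k)**(2 * H)
                   + np.abs(k - 1)**(2 * H)) * dt**(2 * H)
    cov = gamma[np.abs(k[:, None] - k[None, :])]
    return np.linalg.cholesky(cov) @ rng.standard_normal(n)

def fou_path(n=500, H=0.1, kappa=1.0, m=0.0, nu=0.3, dt=1/252, x0=0.0, seed=0):
    """Euler scheme: X_{k+1} = X_k - kappa (X_k - m) dt + nu dB_H."""
    rng = np.random.default_rng(seed)
    dB = fgn_increments(n, H, dt, rng)
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        x[k + 1] = x[k] - kappa * (x[k] - m) * dt + nu * dB[k]
    return x
```

With `H` well below 1/2, the simulated log-volatility path exhibits the rough trajectories the abstract refers to.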
Jose Pedraza Ramirez (University of Manchester)
Title: Optimal Prediction of the Last r-Excursion Time of Brownian Motion Models (Slides)
Abstract: We investigate the optimal prediction of the last $r$-excursion time for a Brownian motion model. The last $r$-excursion time, denoted by $l_{r}$, is the right endpoint of the last negative excursion lasting longer than a constant $r>0$; it reduces to the standard last passage time as $r\downarrow 0$. For a Brownian motion with drift $\mu >0$ and volatility $\sigma >0$, our goal is to identify an optimal stopping time that minimises the $L_{1}$ distance from the last $r$-excursion time $l_{r}$. We find that the optimal stopping barrier exhibits two distinct structures, a constant barrier (characterised as the solution of a non-linear equation) or a moving barrier (characterised as the unique solution to an integral equation), depending on the ratio $R=\frac{\mu \sqrt{r}}{\sigma }$, which combines a firm's financial profitability, volatility, and risk tolerance to financial distress. To obtain the optimal stopping time, we examine the smooth-fit condition, the Lipschitz continuity of the barrier, and the probabilistic regularity of the boundary points. As an application in risk management, we develop a decision rule that informs the timing of business expansion and contraction.
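The random time $l_r$ itself can be approximated on a discrete grid by a direct scan of a simulated path; the sketch below (a grid-based Monte Carlo illustration of the definition, not the optimal stopping rule of the talk) uses illustrative values of $\mu$, $\sigma$, $r$ and the horizon.

```python
import numpy as np

def last_r_excursion_time(path, dt, r):
    """Right endpoint (on the grid) of the last negative excursion longer
    than r; returns 0.0 if no such excursion occurs."""
    neg = path < 0
    l_r, start = 0.0, None
    for i, is_neg in enumerate(neg):
        if is_neg and start is None:
            start = i                      # excursion begins
        elif not is_neg and start is not None:
            if (i - start) * dt > r:       # excursion long enough?
                l_r = i * dt
            start = None
    if start is not None and (len(neg) - start) * dt > r:
        l_r = (len(neg) - 1) * dt          # excursion still running at horizon
    return l_r

def simulate_l_r(mu=0.5, sigma=1.0, r=0.1, T=10.0, n=10_000, seed=0):
    """One sample of l_r for drifted Brownian motion started at 0."""
    rng = np.random.default_rng(seed)
    dt = T / n
    increments = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal(n)
    path = np.concatenate([[0.0], np.cumsum(increments)])
    return last_r_excursion_time(path, dt, r)
```

For a fixed path, enlarging $r$ can only disqualify excursions, so the sampled $l_r$ is non-increasing in $r$.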
Nadhir Ben Rached (University of Leeds)
Title: Stochastic Optimal Control with Chance Constraints (Slides)
Abstract: In this work, we seek an optimal short-term, continuous-time power procurement schedule that minimises the operating expenditure and carbon footprint of cellular wireless networks equipped with energy storage capacity and hybrid energy systems consisting of uncertain renewable energy sources. The network operator needs to satisfy a quality-of-service (QoS) constraint with high probability. This probabilistic constraint prevents us from using dynamic programming to solve the continuous-time stochastic optimal control problem. We introduce a time-continuous Lagrangian relaxation approach tailored for real-time power procurement in cellular networks, overcoming the tractability issues associated with probabilistic QoS constraints. The numerical solution procedure combines an efficient upwind finite-difference solver for the Hamilton--Jacobi--Bellman equation of the relaxed problem with an effective stochastic sub-gradient method to efficiently navigate the stochastic problem structure. The proposed numerical approach is applied to a model cellular network base station based on the German power system and daily cellular traffic data. Our approach demonstrates computational efficiency, providing near-optimal solutions in practical timeframes.
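The Lagrangian-relaxation idea can be illustrated on a toy static chance-constrained problem (a deliberately simplified stand-in, not the paper's HJB-based scheme): relax the probabilistic constraint with a multiplier, solve the relaxed inner problem, and update the multiplier by a projected sub-gradient step. The cost, constraint, and step size below are all invented for the sketch.

```python
import math

def viol_prob(u, a=0.0, b=3.0):
    """Toy chance constraint: P(Z > a + b*u) for standard normal Z."""
    return 0.5 * math.erfc((a + b * u) / math.sqrt(2.0))

def dual_subgradient(delta=0.05, alpha=5.0, iters=200):
    """Lagrangian relaxation of  min_u u  s.t.  viol_prob(u) <= delta.
    Inner problem solved by grid search; multiplier updated by the
    projected sub-gradient step lam <- max(0, lam + alpha*(p(u*) - delta))."""
    grid = [k / 1000 for k in range(1001)]
    lam = 0.0
    for _ in range(iters):
        u = min(grid, key=lambda v: v + lam * (viol_prob(v) - delta))
        lam = max(0.0, lam + alpha * (viol_prob(u) - delta))
    return u, lam
```

At convergence the violation probability sits at the tolerance `delta`, which is exactly the behaviour the relaxation is designed to enforce.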
Debbie Falden (University of Liverpool)
Title: Calibration of risk aversion to real pension asset allocation (Slides)
Abstract: An investor's risk aversion is a fundamental element in financial decision-making and preferences, but it lacks a standardised calibration method. We introduce a method to infer an investor's risk aversion from the observed asset allocation of their pension savings. By assuming the actual allocation is optimal under a constant relative risk aversion (CRRA) utility, we invert Merton's optimal investment formulas to estimate the risk aversion parameter. The approach incorporates the present value of future premiums, resulting in strategies that align with life-cycle pension products. To ensure stability, we develop a customised risky fund matched with the investor's allocation, enabling reliable calibration across various asset classes. A numerical study on a Danish pension portfolio demonstrates its practical use. The findings show realistic, stable risk aversion levels consistent with the CRRA assumption and provide a tool for understanding and benchmarking the implicit preferences embedded in pension product design.
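In its simplest form the inversion is immediate: Merton's optimal risky weight under CRRA utility is $\pi^* = (\mu - r)/(\gamma\sigma^2)$, so an observed weight pins down $\gamma$. The sketch below shows only this bare inversion and ignores the present value of future premiums that the full method incorporates; the market parameters are illustrative.

```python
def implied_risk_aversion(risky_weight, mu, r, sigma):
    """Invert Merton's rule pi* = (mu - r) / (gamma * sigma^2)
    for the CRRA parameter gamma, given the observed risky weight."""
    return (mu - r) / (risky_weight * sigma**2)

# Illustrative numbers: 60% in a risky fund with mu = 6%, r = 2%, sigma = 15%
gamma = implied_risk_aversion(0.6, mu=0.06, r=0.02, sigma=0.15)
```

Here the implied $\gamma \approx 2.96$, a plausible order of magnitude for pension savers.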
Şule Şahin (University of York)
Title: Third-Generation Reduction Factors for UK Ogden Tables: Employment, Disability and Work-Life Expectancy
Abstract: Determining compensation for personal injuries and fatalities arising from wrongful acts is primarily a legal matter, but the underlying methods are largely drawn from actuarial science. The UK has one of the most developed personal injury litigation frameworks, featuring standardised methods and actuarial tables—the Ogden Tables [1].
Traditionally, compensation in the UK has been awarded as a lump sum to cover future losses, such as lost earnings and ongoing care costs. This requires assumptions about life expectancy, work-life expectancy, inflation, investment returns and taxation.
In this research, we use the Labour Force Survey (LFS) longitudinal dataset and focus on contingencies other than mortality, estimating employment probabilities to develop what we refer to as third-generation reduction factors using Markov models. We examine how different disability definitions affect employment probabilities, and move beyond a single summary value to a full distributional characterisation of work-life expectancy. We also consider the dynamic nature of disability and include disability transitions in an extended model framework.
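The Markov-model mechanics can be sketched with a two-state employed/not-employed chain (the transition probabilities below are invented for illustration, not LFS estimates): expected work-life is the sum over years of the probability of being employed.

```python
import numpy as np

# Hypothetical annual transition matrix, states: 0 = employed, 1 = not employed
# (illustrative numbers only, not estimates from the Labour Force Survey)
P = np.array([[0.92, 0.08],
              [0.35, 0.65]])

def work_life_expectancy(P, start_state=0, horizon=40):
    """Expected years employed over the horizon:
    sum_t P(employed at year t | start_state)."""
    dist = np.eye(P.shape[0])[start_state]
    expected = 0.0
    for _ in range(horizon):
        expected += dist[0]   # probability of being employed this year
        dist = dist @ P       # propagate one year forward
    return expected
```

The distributional view the abstract mentions would be obtained by simulating individual trajectories from `P` rather than summing occupancy probabilities; extra states (e.g. disability) enter by enlarging the matrix.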
Moses Kargbo (University of Leeds)
Title: Modelling Supply Chain Dynamics under Disruption Using the Concept of Stochastic Reaction Networks (SRNs) (Slides)
Abstract: Supply chains are networks of three or more organizations or individuals that coordinate the flow of goods, information, and services from source to customer. Disruptions in supply chains are unexpected events that temporarily halt production. Building on the principles of stochastic reaction networks, we model supply chains using the Logistic-Leap method, an extension of the Delayed-Leap method designed to capture delays in systems such as manufacturing, transportation, and logistics. This approach integrates both instantaneous consumption and delayed production, enabling the simulation of supply chains within stochastic push systems.
We further extend the model by incorporating disruption events governed by a two-state continuous-time Markov chain (CTMC), characterised by its transition rate matrix (Q-matrix). We apply both methods to a manufacturing supply chain with five processes. The results provide insights into inventory dynamics and time-step effects, offering a robust framework for simulating complex supply chain behaviour under uncertainty. The findings for the disrupted system clearly demonstrate the impact of disruptions on the final output of the chain.
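The core simulation idea can be sketched with a plain Gillespie-type stochastic simulation algorithm rather than the Logistic-Leap or Delayed-Leap methods themselves: production and demand are reaction channels whose rates depend on a two-state disruption CTMC, folded into the event selection. All rates below are invented for the sketch.

```python
import random

def ssa_supply_chain(T=100.0, seed=1):
    """Gillespie-type simulation of a one-stage supply chain whose
    production rate depends on a two-state disruption CTMC.
    All rates are illustrative, not calibrated to any real chain."""
    random.seed(seed)
    t, inventory, disrupted = 0.0, 0, False
    prod = {False: 5.0, True: 0.5}        # production rate per CTMC state
    demand_rate, q_fail, q_repair = 4.0, 0.05, 0.2
    while t < T:
        rates = [prod[disrupted],                          # produce one unit
                 demand_rate if inventory > 0 else 0.0,    # ship one unit
                 q_repair if disrupted else q_fail]        # CTMC state switch
        total = sum(rates)
        t += random.expovariate(total)                     # time to next event
        u = random.uniform(0.0, total)                     # pick the event
        if u < rates[0]:
            inventory += 1
        elif u < rates[0] + rates[1]:
            inventory -= 1
        else:
            disrupted = not disrupted
    return inventory
```

Extending this to five processes means adding one reaction channel per process; the delayed-production feature of the Logistic-Leap method would additionally schedule completions at a later time rather than instantaneously.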
Samira Amiriyan (University of Liverpool)
Title: Computing the implied volatility through neural networks with asymptotic regimes (Slides)
Abstract: The accurate and efficient computation of implied volatility remains a fundamental task in financial modeling, particularly for option pricing. Recent works have used neural networks to estimate implied volatility, often as part of calibrating financial models. However, these methods usually struggle to approximate implied volatilities in extreme regimes, such as very large or small strikes and maturities. In this paper, we propose a novel approach to address these limitations, one that combines the universal approximation properties and flexibility of neural networks with the well-studied characteristics of implied volatility surfaces. Inspired by Jäckel's seminal approach, the price/log-moneyness domain is partitioned into three parts, while the neural network learns a suitable partition of unity and an approximation of the implied volatility on each part of the partition. Numerical experiments demonstrate superior accuracy and generalisation, particularly when benchmarked against popular asymptotic formulas and classical neural network methods.
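For context, the classical baseline such networks are measured against is direct numerical inversion of the Black-Scholes formula; a minimal bisection solver (a generic sketch, not the paper's method or Jäckel's rational approximation) looks as follows.

```python
import math

def bs_call(S, K, T, sigma, r=0.0):
    """Black-Scholes call price, zero dividends."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * math.erfc(-x / math.sqrt(2.0))   # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r=0.0, lo=1e-6, hi=5.0, tol=1e-10):
    """Implied volatility by bisection; valid since the call price is
    strictly increasing in sigma."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, mid, r) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

It is precisely in the extreme strike/maturity regimes, where the price becomes numerically flat in sigma, that such direct inversion loses accuracy, which motivates the asymptotics-aware network design of the talk.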
Kessean Mitto (University of Leeds)
Title: Bayesian CART with Hierarchical Priors (Slides)
Abstract: We investigate Bayesian Classification and Regression Trees (BCART) with a focus on applications to insurance pricing, a domain where accurate risk estimation is essential. Traditional BCART models typically assume independent priors for terminal node parameters, which can lead to over-penalization of tree complexity and a lack of local smoothness in predictions. While prior work has addressed these limitations in Gaussian settings using hierarchical priors, this paper extends that innovation to Poisson-distributed data, which is common in insurance claim modeling. The primary contribution is the development of a novel hierarchical prior for Poisson regression trees. This framework introduces structured multiplicative relationships between parent and child node parameters via gamma-distributed adjustment factors. This approach aims to facilitate local regularization and smooth variation of claim frequency estimates across the predictor space, mitigating overfitting and enhancing interpretability.
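The multiplicative construction can be sketched as follows: a child node's Poisson rate is the parent's rate times a Gamma(a, 1/a) factor, which has mean one, so larger shape `a` shrinks children more strongly toward their parent. The shape value below is an illustrative choice, not one from the paper.

```python
import random

def child_rate(parent_rate, shape=20.0, rng=random):
    """Hierarchical prior sketch: child Poisson rate = parent rate times a
    Gamma(shape, 1/shape) adjustment factor with mean 1 and variance 1/shape.
    Larger shape means stronger shrinkage toward the parent."""
    factor = rng.gammavariate(shape, 1.0 / shape)
    return parent_rate * factor
```

Because the factor is centred at one, child claim-frequency estimates vary smoothly around the parent's, which is the local-regularization effect the abstract describes.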