Term 2

Wednesday 25th January (Harrison 250)

Toby Jones

A framework for understanding the correlation between aggregated losses of compound events.  (slides)

Individual extreme events, such as flooding, cause widespread destruction and have a large economic impact. Compound events, such as extratropical cyclones, are events where two (or more) natural hazards occur at once.
While the financial loss from a single compound event can be colossal, it is the damage aggregated over a longer time period that can cause substantial insured losses. For instance, the three named storms (Dudley, Eunice and Franklin) that hit Europe in February 2022 cost insurers over €3.7 billion. A better understanding of the factors contributing to the yearly sum of losses (the aggregate risk) is of particular use to the insurance industry.
Hunter et al. (2016) investigated how aggregate risk is influenced by frequency and intensity. The framework introduced here aims to understand how the correlation between aggregate risks is influenced by the frequencies and intensities of compound events.
The derivation of the framework using random sums is outlined, and the positive impact of event clustering on correlation is explored. Corollaries of the framework, including the correlation between aggregate risk and event frequency, are also covered. As an illustration, the framework has been applied to European storms, using maximum wind speed and total rainfall as proxy measures for loss. The framework fits well, capturing most features of the correlation, including the difference between land and sea.
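The random-sum structure at the heart of the framework can be illustrated with a small simulation (a hypothetical sketch, not the speaker's actual model): two aggregate losses S1 and S2 driven by a shared yearly event count N, with clustering modelled as over-dispersion in N.

```python
import numpy as np

rng = np.random.default_rng(42)

def aggregate_losses(mean_events, n_years, clustered):
    """Yearly aggregate losses for two hazards as random sums
    S = X_1 + ... + X_N driven by a shared event count N."""
    if clustered:
        # Over-dispersed counts (clustering): negative binomial,
        # parametrised so the mean equals mean_events
        p = 2.0 / (2.0 + mean_events)
        N = rng.negative_binomial(n=2, p=p, size=n_years)
    else:
        N = rng.poisson(lam=mean_events, size=n_years)
    # Independent exponential severities for each hazard, same counts
    S1 = np.array([rng.exponential(1.0, k).sum() for k in N])
    S2 = np.array([rng.exponential(1.0, k).sum() for k in N])
    return S1, S2

for clustered in (False, True):
    S1, S2 = aggregate_losses(5.0, 20000, clustered)
    print(f"clustered={clustered}: corr = {np.corrcoef(S1, S2)[0, 1]:.2f}")
```

Because both sums share the same random count N, over-dispersion in N inflates Cov(S1, S2) and hence the correlation, consistent with the clustering effect described in the abstract.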


Wednesday 8th February (Harrison 250)

Yu Mao

Aspects in Anabelian Geometry  (slides)

Anabelian geometry can be viewed as a far-reaching generalisation of class field theory; its goal is to understand how much information about a scheme can be recovered from its étale fundamental group. In this talk, we will briefly introduce the main conjectures in anabelian geometry (i.e. Grothendieck’s birational anabelian conjecture and Grothendieck’s section conjecture) and the current progress on them. Furthermore, we will discuss some interesting problems beyond the main conjectures in anabelian geometry.

Wednesday 15th February (Harrison 250)

Weiteng Qiu

Winter Subtropical Highs, the Hadley Circulation and Baroclinic Instability (slides)

Subtropical highs have a profound influence on the weather and climate of adjacent continents. In this study, we use reanalysis data to investigate the interannual variability and trends in winter subtropical highs from 1979 to 2021. We find dynamical relationships between subtropical high intensity, the intensity of the Hadley and Ferrel Circulations, and the Eady Growth Rate. A poleward shift of the maximum in Eady Growth Rate is associated with a strengthening of the descending branches of the Ferrel and Hadley Cells, with adiabatic warming of the subtropical troposphere, and with an increased intensity and poleward movement of the subtropical highs. Poleward Eady Growth Rate shifts are dominated by changes in vertical wind shear which, in turn, are in thermal wind balance with variations and trends in temperature. The northern hemisphere subtropical highs show stronger relationships and are associated with La Niña conditions. Mechanisms for interannual variations are similar to those for trends.
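The Eady Growth Rate has a standard closed form; a minimal sketch with illustrative mid-latitude values (the buoyancy frequency N and the shear used here are assumptions, not values from the study):

```python
import numpy as np

def eady_growth_rate(lat_deg, shear, N=1.2e-2):
    """Maximum Eady growth rate sigma = 0.31 * f * |dU/dz| / N (per second),
    with f the Coriolis parameter and N the buoyancy frequency."""
    omega = 7.292e-5  # Earth's rotation rate (1/s)
    f = 2.0 * omega * np.sin(np.radians(lat_deg))
    return 0.31 * abs(f) * abs(shear) / N

# Illustrative: 3 m/s per km of vertical wind shear at 40 degrees latitude
sigma = eady_growth_rate(40.0, 3.0e-3)
print(f"Eady growth rate: {sigma * 86400:.2f} per day")
```

The formula makes explicit why the growth rate tracks vertical wind shear, and hence, via thermal wind balance, temperature variations.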

Wednesday 22nd February (Harrison 209)

Nell Hartney

Moisture with Gusto: towards test cases in moist shallow water models using the Gusto dynamical core toolkit (slides)

The shallow water equations are widely used in weather and climate model development. A simpler equation set than the full three-dimensional fluid equations, they are computationally cheap but still retain many pertinent features of atmospheric dynamics.
As a tool for investigating time-stepping schemes, however, the shallow water equations are in a sense too simple, as they neglect any sub-grid scale physics processes - that is, processes such as moist physics that happen on scales too small for the grid to capture. Moist physics processes can present challenges to time-stepping schemes from a numerical point of view because moist physics introduces new timescales into the problem, as well as non-linear switch-like behaviour. For this reason, we are interested in introducing moisture into the shallow water equations - yes, adding moisture to water equations, but we’re using them to model the dry atmosphere and not water here! - to produce a model with both large-scale fluid dynamics and sub-grid scale microphysics that would be suitably challenging for testing time-stepping schemes.
In this talk I will discuss our implementation of moist shallow water models in the dynamical core toolkit Gusto, which mimics the properties of the next-generation Met Office model, LFRic. I will also speak about the suite of test cases that we are proposing for moist shallow water in the style of existing shallow water tests.
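The "switch-like" behaviour of moist physics can be illustrated with a toy saturation-adjustment step (a hypothetical sketch; the function name and latent-heating coefficient beta are invented for illustration and are not Gusto's API):

```python
import numpy as np

def saturation_adjustment(q, h, q_sat=0.02, beta=10.0):
    """Instantaneous, switch-like moist physics: vapour q above the
    saturation value q_sat condenses at once, and the released latent
    heat is represented as an increase in the shallow water depth h."""
    excess = np.maximum(q - q_sat, 0.0)  # nonzero only where supersaturated
    return q - excess, h + beta * excess

q = np.array([0.010, 0.025, 0.030])  # only the last two points exceed q_sat
h = np.full(3, 100.0)
q_new, h_new = saturation_adjustment(q, h)
print(q_new, h_new)
```

The max(., 0) switch is exactly the kind of non-smooth, fast-timescale term that stresses time-stepping schemes.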

Wednesday 1st March (Harrison 250)

Jakob Wessel

Getting the tails right: multisite precipitation modelling using GAMLSS (slides)

Weather generators are statistical models that emulate the features of local scale weather variables and link them to large scale climatic ones. They have a variety of applications, the most important one being climate model downscaling whereby the resolution of climate models is increased to generate local information about climate impacts.
Standard weather generator approaches for daily precipitation are often based on so-called Generalised Linear Models (GLMs). These models have been found to be successful in emulating the main properties of daily precipitation; however, they tend to have issues with properly representing seasonality in the tails of the rainfall distribution. Oftentimes the highest quantiles of summer rainfall are underestimated, whilst those of winter rainfall are overestimated. In the work I will present, we build on standard modelling approaches and use the framework of Generalised Additive Models for Location Scale and Shape (GAMLSS) to model daily rainfall at multiple locations.
I will introduce the GAMLSS framework for the modelling of multisite precipitation and show that it has positive effects for the representation of seasonal dependence in the tails of the distribution. I present how to do statistical model selection in the context of very high spatial dependence – unresolvable by the model itself – and finally present a new scheme that accounts for location dependence in simulating daily rainfall values, including dependence between rainfall occurrence and amounts at neighbouring stations. 
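The GAMLSS idea, distribution parameters modelled as smooth functions of covariates, can be sketched in a hand-rolled Python example fitting a gamma rainfall model whose log-scale varies seasonally (synthetic data and scipy rather than the actual gamlss machinery; all values are illustrative):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

rng = np.random.default_rng(0)

# Synthetic daily rainfall amounts whose scale varies with season
day = np.arange(3650)
phase = 2 * np.pi * day / 365.25
rain = rng.gamma(shape=2.0, scale=np.exp(1.0 + 0.5 * np.cos(phase)))

def negloglik(params):
    """Gamma likelihood in the GAMLSS spirit: the log-scale is a
    harmonic function of day of year, the shape is constant."""
    log_shape, b0, b1, b2 = params
    scale = np.exp(b0 + b1 * np.cos(phase) + b2 * np.sin(phase))
    return -gamma.logpdf(rain, a=np.exp(log_shape), scale=scale).sum()

res = minimize(negloglik, x0=[0.0, 0.0, 0.0, 0.0],
               method="Nelder-Mead", options={"maxiter": 4000})
log_shape, b0, b1, b2 = res.x
print(f"fitted shape: {np.exp(log_shape):.2f}, seasonal cos coef: {b1:.2f}")
```

Letting the scale (and, in full GAMLSS, the shape) vary with season is what allows the model to capture seasonally varying tails, the failure mode of the standard GLM approach described above.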

Wednesday 8th March (Harrison 250)

Velizar Kirkow

Modelling instabilities in Astrophysical Fluid Dynamics (slides)

Hydrodynamic stability theory can be used to understand under what conditions laminar flows become unstable. Electrically conducting fluids such as plasmas are no different; the difference is that a magnetic field now provides additional forcing to the system through the Lorentz force. The talk is intended to introduce the audience to the theory of linear stability analysis, building up from hydrodynamic stability to hydrodynamic-convective instability, before culminating in some preliminary findings for the case of a uniform magnetic field using MHD. While the majority of the talk will present numerical results, I will also cover analytical methods in the form of asymptotic expansions used to isolate and examine long-wave-driven instability, and demonstrate how these corroborate the numerical findings. I will finish the talk with a brief discussion of future directions of my research and its potential applications to the solar dynamo.
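The core idea of linear stability analysis can be sketched in a few lines: linearise about an equilibrium and inspect the eigenvalues of the Jacobian (a generic toy example, unrelated to the specific MHD system in the talk):

```python
import numpy as np

def growth_rates(J):
    """Normal-mode stability: perturbations about an equilibrium grow
    like exp(lambda * t), where lambda are the eigenvalues of the
    Jacobian J. Instability <=> some eigenvalue has positive real part."""
    return np.linalg.eigvals(J).real

# Toy 2x2 examples: weak damping (stable) vs weak forcing (unstable)
stable = np.array([[-0.1, 1.0], [-1.0, -0.1]])
unstable = np.array([[0.1, 1.0], [-1.0, 0.1]])
print(growth_rates(stable).max(), growth_rates(unstable).max())
```

In the fluid setting the same principle applies mode by mode: each wavenumber yields an eigenvalue problem whose leading growth rate determines stability.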

Wednesday 15th March (Harrison 250)

Kate Nechyporenko

Neuronal Network Model of the Posterodorsal Medial Amygdala (slides)

Kisspeptin is a hormone considered a main regulator of reproductive function in mammals. While the role of kisspeptin has been extensively studied in the hypothalamic region of the brain, little is known about its importance in other parts. In particular, a large population of kisspeptin receptors has been found in the posterodorsal medial amygdala (MePD), where kisspeptin acts as an upstream input to sub-populations of neurons releasing the neurotransmitters gamma-aminobutyric acid (GABA) and glutamate. We propose a coarse-grained model that captures the cooperative and competitive dynamics between these sub-populations using the Wilson-Cowan framework. We use bifurcation analysis to study the connectivity strengths between the sub-populations and the role of the afferent kisspeptin input. Our model demonstrates the dynamical changes in the MePD output for different levels of kisspeptin, providing insight into the functional interactions between GABA and glutamate.
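A minimal sketch of the Wilson-Cowan framework for an excitatory/inhibitory pair (the weights and inputs below are illustrative assumptions, not the model's fitted parameters):

```python
import numpy as np
from scipy.integrate import solve_ivp

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def wilson_cowan(t, y, w_ee, w_ei, w_ie, w_ii, P_e, P_i):
    """Wilson-Cowan rate equations for an excitatory (glutamatergic)
    and an inhibitory (GABAergic) population; P_e and P_i stand in
    for the kisspeptin-modulated afferent drive."""
    E, I = y
    dE = -E + sigmoid(w_ee * E - w_ei * I + P_e)
    dI = -I + sigmoid(w_ie * E - w_ii * I + P_i)
    return [dE, dI]

sol = solve_ivp(wilson_cowan, (0.0, 50.0), [0.1, 0.1],
                args=(10.0, 8.0, 9.0, 3.0, -2.0, -4.0), rtol=1e-8)
E_end, I_end = sol.y[:, -1]
print(f"final rates: E={E_end:.3f}, I={I_end:.3f}")
```

Varying the afferent drive (here P_e, P_i) and tracing how the attractors change is the essence of the bifurcation analysis described in the abstract.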

Wednesday 22nd March (Harrison 250)

Madhuparna Das

Selberg's Central limit theorem for families of L-functions (slides)

The main aim of this talk is to give a simple proof of Selberg’s Central Limit Theorem (SCLT) for appropriate families of L-functions. All the families of L-functions we consider in this talk belong to the “Selberg Class”.
We start with the proof of Selberg’s central limit theorem for classical automorphic L-functions of degree 2 associated with holomorphic cusp forms, in the t-aspect. Further, we give a brief overview of the proof of SCLT for GL(3) Dirichlet L-functions in the q-aspect.
Finally, we conclude the talk by giving a brief overview of the independence of the families of automorphic L-functions attached to a sequence of primitive holomorphic cusp forms.
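For reference, in the classical case of the Riemann zeta function the theorem states (a standard formulation, not specific to the families in the talk):

```latex
\frac{\log\bigl|\zeta\bigl(\tfrac{1}{2} + it\bigr)\bigr|}{\sqrt{\tfrac{1}{2}\log\log T}}
\;\xrightarrow{\;d\;}\; \mathcal{N}(0,1) \qquad \text{as } T \to \infty,
```

where t is sampled uniformly from [T, 2T]; the families in the talk satisfy analogous statements in their respective aspects.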

Wednesday 29th March (Harrison 250) 15:30-16:30

Michael Dunne

Uncertainty Quantification: How best to efficiently analyse complex numerical models  (slides)

Real-world phenomena are often modelled by numerical models, with more advanced features of these phenomena being accounted for as computing has become more accessible. As a result, these models take longer to run, and analyses (such as calibration or sensitivity analysis) become infeasible due to the number of model runs they require and the computational cost of each run. This is where Uncertainty Quantification comes in.
Uncertainty Quantification (UQ) refers to the statistical analysis of these complex models (also known as simulators or digital twins), in this case via a Bayesian framework. Two facets of UQ will be presented:
1) An emulator (also known as a surrogate model), which ‘models the model’. It is a statistical representation of the simulator, trained on a set of simulator runs. For a given input, the emulator produces a mean prediction and an uncertainty, mimicking what the simulator would produce if it evaluated the same input. The advantage of an emulator is that it runs almost instantaneously compared to the simulator, meaning we can perform the aforementioned analyses on the emulator using several orders of magnitude less computational resource than using the simulator directly.
2) History matching: a type of inverse modelling. It is the process of sequentially ruling out regions of input space that are inconsistent with a desired output (such as an observation), until we are left with the combinations of inputs that could ‘feasibly’ evaluate to give the desired output. We use the emulator to conduct this method.
We present these facets and apply them to a COVID model used to estimate cases and deaths given a set of initial conditions and parameter settings.
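Both facets can be sketched together in a toy one-dimensional example (the simulator, kernel length-scale and observation error below are all invented for illustration):

```python
import numpy as np

def rbf(a, b, ell=0.4):
    """Squared-exponential covariance between 1-D input sets a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def simulator(x):
    """Stand-in for an expensive model run."""
    return np.sin(3.0 * x) + x

# Facet 1: train a simple GP emulator on a handful of simulator runs
X = np.linspace(0.0, 1.0, 8)
y = simulator(X)
K = rbf(X, X) + 1e-8 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

def emulate(x):
    """Emulator mean prediction and uncertainty (variance) at new inputs."""
    k = rbf(x, X)
    mean = k @ alpha
    var = 1.0 - np.einsum("ij,ij->i", k, np.linalg.solve(K, k.T).T)
    return mean, np.maximum(var, 1e-12)

# Facet 2: history matching, ruling out inputs whose implausibility
# I(x) = |z - mean(x)| / sqrt(var(x) + obs_var) exceeds 3
z, obs_var = simulator(0.7), 0.05**2      # pretend observation at x = 0.7
x_grid = np.linspace(0.0, 1.0, 200)
mean, var = emulate(x_grid)
implausibility = np.abs(z - mean) / np.sqrt(var + obs_var)
not_ruled_out = x_grid[implausibility < 3.0]
print(f"{len(not_ruled_out)} of {len(x_grid)} inputs remain plausible")
```

In practice history matching proceeds in waves, refitting the emulator with new simulator runs inside the not-ruled-out region; only a single wave is shown here.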