Seasonal-to-centennial predictability of the Euro-Atlantic circulation.
The impact of including stochastic parameterizations in climate models.
These themes often overlap. Below are some slightly more specific topics I work on.
The signal-to-noise paradox
Much of my research is motivated by the so-called `signal-to-noise paradox' of forecasts of the North Atlantic Oscillation (NAO). The paradox is that seasonal-to-decadal forecasts can be skilful, but the amplitude of the predictable signal is underestimated in forecast models by a factor of up to 5. This suggests that teleconnections are responsible for the skill, but that forecast models misrepresent these teleconnections in some way. I spend a lot of time trying to unravel the complex `tug-of-war' between the various proposed teleconnections (ENSO, Arctic sea-ice, the stratosphere, ...) and the biases forecast models exhibit in these.
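To make the paradox concrete, here is a minimal sketch of the `ratio of predictable components' diagnostic commonly used to quantify it (in the spirit of Eade et al. 2014). The synthetic data and variable names are purely illustrative assumptions, not real hindcasts; an RPC well above 1 is the signature of the paradox.

```python
# Minimal sketch of the ratio-of-predictable-components (RPC) diagnostic.
# All data here are synthetic placeholders standing in for NAO hindcasts.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_members = 30, 40

signal = rng.normal(size=n_years)                        # shared predictable signal
hindcast = 0.3 * signal[:, None] + rng.normal(size=(n_years, n_members))
obs = signal + rng.normal(size=n_years)                  # nature carries a larger signal

ens_mean = hindcast.mean(axis=1)
r = np.corrcoef(obs, ens_mean)[0, 1]                     # ensemble-mean skill

# predictable fraction of model variance: var(ensemble mean) / total member variance
sigma_sig2 = ens_mean.var(ddof=1)
sigma_tot2 = hindcast.var(ddof=1)
rpc = r / np.sqrt(sigma_sig2 / sigma_tot2)

print(f"skill r = {r:.2f}, RPC = {rpc:.2f}")             # RPC >> 1 indicates the paradox
```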
Weather regimes and the eddy-driven jet
I have gotten considerable mileage out of taking Tim Woollings' jet decomposition deadly seriously. This involves decomposing the jet variability into a jet speed component and a jet latitude component. These are orthogonal (i.e. uncorrelated) on daily timescales and behave very differently. On seasonal timescales the jet variability is dominated by the jet latitude, which is predictable and visibly multimodal; the jet speed is, by contrast, mostly unpredictable. On decadal timescales the variability is dominated by slow variations in the (approximately Gaussian) jet speed, with the jet latitude varying chaotically around its mean position. The jet speed and jet latitude also respond differently to imposed thermal forcing. A rough sketch of how the two indices are computed is shown below.
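The sketch below computes Woollings-style jet latitude and jet speed indices from daily low-level zonal wind. The file name, variable name, sector bounds and the use of a simple running mean (rather than the usual Lanczos low-pass filter) are all assumptions for illustration.

```python
# Rough sketch of the jet latitude/speed decomposition from daily u850.
# Assumes a file "u850_daily.nc" with dims (time, lat, lon), longitudes in
# -180..180 and latitudes increasing northwards; adjust slices otherwise.
import xarray as xr

u = xr.open_dataset("u850_daily.nc")["u"]

# zonal-mean zonal wind over the North Atlantic sector (here 60W-0E, 15-75N)
sector = u.sel(lon=slice(-60, 0), lat=slice(15, 75)).mean("lon")

# 10-day running mean as a simple stand-in for the usual low-pass filter
smooth = sector.rolling(time=10, center=True).mean()

jet_lat = smooth.idxmax("lat")     # latitude of the wind maximum: jet latitude index
jet_speed = smooth.max("lat")      # wind speed at that latitude: jet speed index
```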
A lot of my work is concerned with using this information to improve upon analyses based on single indices (like the NAO index), which mix all these components together and therefore make everything more confusing. The multimodality of the jet latitude suggests its behaviour is best understood within a framework of weather regimes, modelled using Markov chains. I think a lot about how to understand predictability in these terms.
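As a toy illustration of the Markov-chain view, the sketch below estimates a regime transition matrix and its stationary distribution from a daily sequence of regime labels. The labels are random placeholders here; in practice they would come from some prior clustering of, say, jet latitude or Z500 anomalies.

```python
# Minimal sketch: maximum-likelihood transition matrix for a regime sequence.
import numpy as np

def transition_matrix(labels, n_regimes):
    """Count regime-to-regime transitions and normalise each row."""
    counts = np.zeros((n_regimes, n_regimes))
    for a, b in zip(labels[:-1], labels[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

# illustrative three-regime example (e.g. 0 = southern, 1 = central, 2 = northern jet)
labels = np.random.default_rng(1).integers(0, 3, size=1000)
P = transition_matrix(labels, 3)

# stationary distribution: leading left eigenvector of P, normalised to sum to 1
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()
print(P.round(2), pi.round(2))
```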
Topological approaches to studying the climate
One of the biggest problems when trying to study the climate system is that (a) the dimensionality of the system is enormous relative to the amount of observational data we have, and (b) most computational tools scale horribly with dimension (the `curse of dimensionality'). Recently I've tried to understand how tools from topological data analysis can be used to study climate variability. In particular, persistent homology turns out to be an extremely powerful tool for detecting topological features (connected components, holes/loops, voids, etc.) in dynamical systems. After any initial pre-processing, the algorithms involved scale at worst linearly with dimension! Hence you can keep adding more atmospheric variables to your dataset without being killed by the compute cost. It turns out that the topological features of a dynamical system often correspond to dynamically important structures, such as regimes. Can we compute the `shape' of the atmosphere in this way?
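Here is a small sketch of the basic persistent homology computation, using the ripser.py package (an assumption about tooling; any persistent homology library would do). A noisy circle stands in for a low-dimensional projection of atmospheric states: the single long-lived H1 feature is the `loop' that persistence detects.

```python
# Persistent homology of a noisy circle as a toy stand-in for climate data.
import numpy as np
from ripser import ripser

rng = np.random.default_rng(2)
theta = rng.uniform(0, 2 * np.pi, 400)
points = np.c_[np.cos(theta), np.sin(theta)] + 0.1 * rng.normal(size=(400, 2))

dgms = ripser(points, maxdim=1)["dgms"]      # persistence diagrams for H0 and H1
h1 = dgms[1]
lifetimes = h1[:, 1] - h1[:, 0]              # long-lived features = robust loops
print("most persistent loop lifetime:", lifetimes.max())
```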
Impacts of stochasticity on global warming projections
It turns out that adding stochastic parameterizations to a GCM can change both the transient and equilibrium climate sensitivity, but the reasons for this are still not well understood. It's also not clear whether the changes are `good', i.e. whether the stochastic GCM is more realistic than its non-stochastic counterpart. I'm working, on and off, on understanding this better using radiative kernel techniques.
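For context, the radiative kernel idea boils down to multiplying a modelled change in some state variable by a pre-computed kernel (the radiative response per unit change) and normalising by the global-mean surface warming to get a feedback in W m^-2 K^-1. The sketch below does this for a surface-temperature field; the kernel value and warming pattern are synthetic placeholders, not real kernel data.

```python
# Hedged sketch of a kernel-based feedback estimate with synthetic fields.
import numpy as np

nlat, nlon = 90, 180
lat = np.linspace(-89, 89, nlat)
weights = np.cos(np.deg2rad(lat))[:, None] * np.ones((nlat, nlon))
weights /= weights.sum()                                  # area weights for global means

kernel_ts = -1.8 * np.ones((nlat, nlon))                  # placeholder Ts kernel (W m^-2 K^-1)
dTs = 3.0 + np.random.default_rng(3).normal(0, 0.5, (nlat, nlon))  # warming pattern (K)

dT_global = (weights * dTs).sum()                         # global-mean surface warming
dR = (weights * kernel_ts * dTs).sum()                    # kernel-weighted radiative response
lambda_ts = dR / dT_global                                # surface-temperature feedback
print(f"lambda_Ts = {lambda_ts:.2f} W m^-2 K^-1")
```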
Figure: Animation of ERA-Interim Z500 anomalies from 2017. Courtesy of Mat Chantry.
Figure: Schematic of the Euro-Atlantic winter circulation, represented by the NAO (filled contours).
Figure: Using topological data analysis to study dynamical systems.