Stochastic Processes and Martingales by Mehmet Öz (Özyeğin University)
This minicourse will serve as an introduction to martingales, which play a central role in modern probability theory. After briefly reviewing filtrations and stopping times, we will define a martingale and discuss several stochastic processes which can be identified as martingales. We will then focus on standard results such as the martingale convergence theorem, Doob’s optional stopping theorem and Doob’s maximal inequalities. Martingale changes of measure and some applications of martingale theory to random walks will be discussed as time permits. The emphasis will be on martingales in discrete time. The course assumes some basic knowledge of probability theory; in particular, conditional expectation.
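An informal supplement (not part of the official abstract): the simple symmetric random walk is the basic example of a discrete-time martingale, and Doob's optional stopping theorem gives E[S_T] = E[S_0] for suitable stopping times T. The following minimal Python sketch, with barriers chosen purely for illustration, checks this empirically for the first hitting time of {-a, b}:

import numpy as np

rng = np.random.default_rng(0)
a, b = 5, 10            # stop when the walk first hits -a or b (illustrative barriers)
n_paths = 10_000

stopped_values = np.empty(n_paths)
for i in range(n_paths):
    s = 0
    while -a < s < b:
        s += rng.choice((-1, 1))   # one +/-1 step of the simple symmetric random walk
    stopped_values[i] = s

# Optional stopping: E[S_T] = S_0 = 0, and the gambler's-ruin probability
# P(S_T = b) = a / (a + b), both up to Monte Carlo error.
print("empirical E[S_T]  :", stopped_values.mean())
print("empirical P(S_T=b):", (stopped_values == b).mean(), " theory:", a / (a + b))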
Itô's Calculus and Stochastic Differential Equations by Mine Çağlar (Koç University)
Brownian motion will be presented as the canonical example of both a martingale and a Markov process with continuous paths. In this context, the theory of stochastic integration and Itô's calculus will be developed. The construction and properties of the stochastic integral, and integration with respect to continuous local martingales, will be covered. Itô's lemma will be proved as the chain rule of stochastic calculus. Continuous local martingales will then be revisited as stochastic integrals with respect to Brownian motion and as time-changed Brownian motions. Finally, stochastic differential equations will be introduced, and the existence and uniqueness of their solutions will be proved. Both strong and weak solutions will be discussed, with examples drawn from applications in finance and physics.
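An informal supplement (the equation and scheme below are illustrative choices, not necessarily the course's own examples): for geometric Brownian motion, dX_t = mu X_t dt + sigma X_t dW_t, Itô's lemma yields the explicit solution X_t = X_0 exp((mu - sigma^2/2) t + sigma W_t), and a basic Euler-Maruyama discretization can be checked against it in a few lines of Python:

import numpy as np

rng = np.random.default_rng(1)
mu, sigma, x0 = 0.1, 0.3, 1.0       # illustrative parameters
T, n_steps = 1.0, 1_000
dt = T / n_steps

# Euler-Maruyama for dX = mu*X dt + sigma*X dW, alongside the exact solution
# X_T = x0 * exp((mu - sigma^2/2)*T + sigma*W_T) given by Ito's lemma.
dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)
x_em, w = x0, 0.0
for dw in dW:
    x_em += mu * x_em * dt + sigma * x_em * dw
    w += dw

x_exact = x0 * np.exp((mu - sigma**2 / 2) * T + sigma * w)
print("Euler-Maruyama:", x_em, " exact:", x_exact)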
Partial Differential Equations by Havva Yoldaş (TU Delft) and Raphael Winter (Cardiff University)
This course serves as an introduction to Partial Differential Equations (PDE) theory. In the first half of the course, after some preliminary concepts and general motivation, we will introduce four important classes of PDEs: the transport equation, Laplace's equation, the heat equation, and the wave equation, together with their fundamental properties. In the second half of the course, we will focus on some linear evolution equations arising in the applied sciences. We will discuss their qualitative and quantitative properties, such as well-posedness and long-time behaviour. We will finish with some perspectives and open problems in the area.
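For reference, the four model equations listed above, in one standard notation (conventions in the lectures may differ):
\[
u_t + b \cdot \nabla u = 0 \ \ (\text{transport}), \qquad \Delta u = 0 \ \ (\text{Laplace}),
\]
\[
u_t - \Delta u = 0 \ \ (\text{heat}), \qquad u_{tt} - \Delta u = 0 \ \ (\text{wave}).
\]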
Background Material from C. Evans
Notes on Entropy in Cross-Diffusion Systems - A. Jüngel
Entropy Methods for Diffusive Partial Differential Equations - A. Jüngel (access may be restricted)
Stochastic Partial Differential Equations by Avi Mayorcas (University of Bath)
Many important physical, economic and biological phenomena are well described by PDE models of observed quantities such as temperature, velocity, angular momentum, wealth, population density, etc. However, in many cases, exact PDE models provide less information than statistical models, against which one can calibrate statistics of the same observed quantities, e.g. averages, variances and higher moments. Mathematically, these statistical models are given by stochastic perturbations of PDEs, i.e. stochastic PDEs (SPDEs). The purpose of this course is to give students an insight into the various analytic tools which can be applied to obtain basic solution theories for a large class of these equations. The main focus will be on the so-called variational formulation, which is of particular relevance as it is well suited to deriving numerical methods, for example those based on Galerkin or neural network approximations. The course assumes some basic background in functional analysis (familiarity with infinite-dimensional Banach and Hilbert spaces) and some basics of probability theory. From this base, the course will aim to give sufficient additional background to understand the solution theory for a basic class of SPDEs, as well as insight into the tools necessary to handle more complex examples.
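An informal illustration of the Galerkin idea mentioned above (the equation, truncation level and time stepping are illustrative choices, not taken from the course): for the stochastic heat equation du = u_xx dt + dW(t, x) on (0, pi) with Dirichlet boundary conditions and space-time white noise, projecting onto the sine basis reduces the dynamics, mode by mode, to independent Ornstein-Uhlenbeck processes, which can be advanced exactly in Python:

import numpy as np

rng = np.random.default_rng(2)
N, n_steps, dt = 64, 500, 1e-3          # illustrative truncation level and step size
k = np.arange(1, N + 1)                 # wavenumbers of the sine basis on (0, pi)
lam = k.astype(float) ** 2              # eigenvalues of -d^2/dx^2 with Dirichlet BC

# Galerkin truncation: the k-th coefficient of du = u_xx dt + dW satisfies the
# Ornstein-Uhlenbeck equation  d u_k = -k^2 u_k dt + d beta_k,  which we advance
# with its exact one-step update (unconditionally stable).
u_hat = np.zeros(N)
decay = np.exp(-lam * dt)
noise_std = np.sqrt((1.0 - np.exp(-2.0 * lam * dt)) / (2.0 * lam))
for _ in range(n_steps):
    u_hat = decay * u_hat + noise_std * rng.normal(size=N)

# Reconstruct the approximate solution on a spatial grid from the truncated expansion.
x = np.linspace(0.0, np.pi, 200)
u = (np.sqrt(2.0 / np.pi) * np.sin(np.outer(x, k))) @ u_hat
print("u(T, x) at a few grid points:", u[::50])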
Problem Sheet 1 - Problem Sheet 1 Solutions
Problem Sheet 2 - Problem Sheet 2 Solutions
Machine Learning by Umut Şimşekli (INRIA & École Normale Supérieure)
Many problems in machine learning can be seen as obtaining point estimates (e.g., maximum likelihood). While point estimates have proven very useful, they unfortunately do not convey any uncertainty information, which can be crucial in risk-intolerant application domains (such as self-driving vehicles). In contrast, Markov Chain Monte Carlo (MCMC) methods are able to provide uncertainty estimates along with point estimates. In this short course, we will cover a specific instance of MCMC algorithms called "Langevin Monte Carlo" (LMC). The LMC algorithm is built on the Langevin diffusion, a particular stochastic differential equation (SDE) that has been widely used across many branches of mathematics. The particularity of LMC is that it scales up to modern, large-scale machine learning problems, and it has hence recently attracted a significant amount of interest in both engineering domains and applied probability. The course will be based on the following content:
-- Formalization of the sampling problem and its link to uncertainty estimation.
-- Development of the Langevin SDE for solving the sampling problem.
-- The connections between the Langevin SDE and its associated Fokker-Planck equation, which is a linear PDE.
-- Development of the LMC algorithm as a time-discretization of the Langevin SDE, and its error analysis (a minimal illustrative sketch follows this list).
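A minimal sketch of the LMC (unadjusted Langevin) update referred to in the last item, for a toy Gaussian target; the potential, step size and run lengths are illustrative assumptions, not taken from the course:

import numpy as np

rng = np.random.default_rng(3)

def grad_U(theta):
    # Illustrative potential U(theta) = ||theta||^2 / 2, i.e. a standard
    # Gaussian target pi(theta) proportional to exp(-U(theta)).
    return theta

eta, n_iter, burn_in = 1e-2, 50_000, 5_000   # illustrative step size and run lengths
theta = np.zeros(2)
samples = np.empty((n_iter, 2))
for i in range(n_iter):
    # LMC step: Euler-Maruyama discretization of the Langevin SDE
    #   d theta_t = -grad U(theta_t) dt + sqrt(2) dW_t.
    theta = theta - eta * grad_U(theta) + np.sqrt(2 * eta) * rng.normal(size=2)
    samples[i] = theta

kept = samples[burn_in:]
print("empirical mean:", kept.mean(axis=0))        # should be close to 0
print("empirical covariance:\n", np.cov(kept.T))   # should be close to the identity

The gap between the empirical and exact moments reflects the time-discretization bias that the error analysis addresses.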
An Introduction to the Malliavin-Stein Method by Arturo Jaramillo (Centro de Investigación en Matemáticas (CIMAT))
This minicourse provides an introduction to the Malliavin-Stein method, which combines Malliavin calculus and Stein’s method for probabilistic normal approximations. These two techniques complement each other remarkably well, and their interaction has led to significant advances in central and non-central limit theorems for functionals of infinite-dimensional Gaussian fields. Originally introduced in 2007 by Giovanni Peccati and Ivan Nourdin, the Malliavin-Stein method has gained substantial relevance in recent years. The course begins with a presentation of the fundamental tools of Malliavin calculus, including derivative, divergence, and Ornstein-Uhlenbeck operators, and their interplay with isonormal Gaussian processes and Wiener chaos. The theoretical framework is then extended to Hilbert space-valued operators, multiple integrals, and the Ornstein-Uhlenbeck semigroup. Next, we explore Stein’s method, focusing on its application to one-dimensional and multidimensional normal approximations. This discussion includes Stein’s equations and their generalizations. The heart of the minicourse lies in the unification of these two approaches. We will study univariate and multivariate normal approximations, normal approximations on Wiener chaos, and chaos decompositions for central limit theorems (CLTs). Finally, we apply these tools to the Breuer-Major theorem, which provides a CLT for weakly dependent random variables. We will discuss its general statement and its application to the increments of fractional Brownian motion.
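To fix notation (a standard formulation; the lectures' conventions may differ): the Malliavin derivative D and the divergence operator \delta are adjoint to each other, and Stein's method rests on Stein's equation characterizing the standard normal law N \sim \mathcal N(0,1):
\[
\mathbb{E}\big[\langle DF, u \rangle_{\mathfrak H}\big] = \mathbb{E}\big[F\, \delta(u)\big],
\qquad
f'(x) - x f(x) = h(x) - \mathbb{E}[h(N)].
\]
Bounding the distance between a functional F and N then amounts to solving Stein's equation for f and controlling \mathbb{E}[f'(F) - F f(F)] via the Malliavin integration-by-parts formula above.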
Colloquium talk by Ali Süleyman Üstünel (Bilkent University)
Title: Some applications of variational calculus on Wiener space
Abstract: We show how to prove the regularity of the solutions of an SDE with rough coefficients with the help of variational calculus on the paths of a cylindrical Brownian motion (sheet).