Teaching:
Below is some material that I wrote for the classes I taught.
Probability Essentials for Financial Calculus:
The course provides an introduction to essential probabilistic concepts, laying the groundwork for their application in advanced stochastic analysis and financial calculus. It begins by presenting the axioms of probability theory and introducing random variables, with a primary emphasis on discrete random variables. The course then covers measure theory (integration, monotone & dominated convergence theorems) and the various notions of convergence of random variables, and concludes with a brief treatment of conditional expectation and martingales.
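To give a flavour of the measure-theoretic part, one of the convergence theorems covered in the course is the following (stated here informally, for non-negative measurable functions):

```latex
\textbf{Monotone convergence theorem.}
Let $(f_n)_{n \ge 1}$ be measurable functions with
$0 \le f_1 \le f_2 \le \dots$ and $f_n \to f$ pointwise. Then
\[
  \lim_{n \to \infty} \int f_n \, \mathrm{d}\mu \;=\; \int f \, \mathrm{d}\mu .
\]
```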
Lecture notes:
Exercises:
Exercise sheet 1 (set theory, sigma-algebras, probability measures.)
Exercise sheet 2 (basic probability, random variables, conditional probability.)
Exercise sheet 3 (distribution and expectation of a discrete random variable.)
Exercise sheet 4 (expectations of functions of discrete random variables.)
Exercise sheet 5 (monotone classes and π-systems.)
Exercise sheet 6 (measures, measurability and distributions of real-valued functions.)
Exercise sheet 7 (Borel-Cantelli lemma, integration with respect to a measure.)
Exercise sheet 8 (integration convergence theorems, density functions.)
Exercise sheet 9 (real-valued random variables.)
Exercise sheet 10 (vector-valued random variables, a proof of the strong law of large numbers under weaker assumptions.)
Exercise sheet 11 (Characteristic function, convergence of random variables.)
Some of the material is inspired by the textbook "Probability Essentials" by J. Jacod and P. Protter.
High-dimensional Probability Theory (exercises):
The course studies the non-asymptotic theory of random objects in high-dimensional spaces (random vectors and matrices, random projections, etc.) that are useful in data-science applications (machine learning, dimensionality reduction, compressive sensing, etc.). It closely follows the presentation of R. Vershynin's book "High-dimensional probability" and covers topics such as concentration inequalities, decoupling and symmetrisation techniques, the Johnson-Lindenstrauss lemma, and chaining and comparison techniques for stochastic processes. Below are some of the exercise sheets I wrote for this class, taught at RWTH Aachen University (summer semesters 2020, 2021 & 2022).
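As a small illustration of one of the topics above, a minimal sketch (not course material) of the Johnson-Lindenstrauss phenomenon: a random Gaussian projection approximately preserves pairwise distances. All dimensions and the seed below are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 50, 1000, 300          # points, ambient dimension, target dimension
X = rng.standard_normal((n, d))  # n points in R^d

# Random Gaussian projection, scaled by 1/sqrt(k) so that squared
# distances are preserved in expectation.
P = rng.standard_normal((k, d)) / np.sqrt(k)
Y = X @ P.T                      # projected points in R^k

def pairwise_dists(Z):
    # ||z_i - z_j||^2 = ||z_i||^2 + ||z_j||^2 - 2 <z_i, z_j>
    sq = (Z ** 2).sum(axis=1)
    D2 = sq[:, None] + sq[None, :] - 2 * Z @ Z.T
    return np.sqrt(np.maximum(D2, 0.0))

D_hi = pairwise_dists(X)
D_lo = pairwise_dists(Y)
mask = ~np.eye(n, dtype=bool)    # ignore the zero diagonal
ratios = D_lo[mask] / D_hi[mask]
print(ratios.min(), ratios.max())  # all ratios concentrate near 1
```

The concentration of the distance ratios around 1 is exactly what the lemma quantifies: a target dimension of order log(n)/eps^2 suffices for distortion eps, independently of the ambient dimension d.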
Exercise sheet 1 (useful inequalities, combinatorics, classical probability results.)
Exercise sheet 2 (rate of convergence of sample mean, Gaussian distribution, Hoeffding's inequality.)
Exercise sheet 3 (Chernoff's inequality & applications.)
Exercise sheet 4 (sub-gaussian random variables.)
Exercise sheet 5 (Khintchine's inequality, sub-exponential distributions.)
Exercise sheet 6 (moments & MGF, concentration of the norm, isotropy, Gaussian distributions.)
Exercise sheet 7 (sub-gaussian random vectors.)
Exercise sheet 8 (isotropic random vectors supported in finite sets.)
Exercise sheet 9 (Grothendieck's identity and inequality.)
Exercise sheet 10 (maximal-cut algorithm, isometries, covering and packing numbers.)
Exercise sheet 11 (Weyl's inequality, expected spectral norm of a random Gaussian matrix.)
Exercise sheet 13 (Gaussian concentration, the Johnson-Lindenstrauss lemma, decoupling, concentration of anisotropic random vectors.)
Many of these exercises are taken from, or inspired by, R. Vershynin's book "High-dimensional probability".
Mathematical Foundations of Machine Learning (exercises):
The aim of this class is to build a mathematical foundation for understanding the most common and classical techniques used in the ever-growing field of machine learning, and to obtain quantitative guarantees for learning algorithms. Below are some of the exercise sheets I wrote for this course, given at RWTH Aachen University (winter semesters 2019/2020 & 2020/2021).
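As a small, self-contained illustration of the optimisation side of the course (not course material), gradient descent on a least-squares objective, a convex function whose gradient is Lipschitz; the matrix sizes and seed are arbitrary choices for the demonstration.

```python
import numpy as np

# Gradient descent on f(w) = ||Aw - b||^2, a smooth convex objective.
rng = np.random.default_rng(1)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad(w):
    return 2 * A.T @ (A @ w - b)

# Constant step size 1/L, where L = 2 * lambda_max(A^T A) is the
# Lipschitz constant of the gradient.
L = 2 * np.linalg.eigvalsh(A.T @ A).max()
w = np.zeros(5)
for _ in range(500):
    w -= grad(w) / L

# Compare with the closed-form least-squares solution.
w_star = np.linalg.lstsq(A, b, rcond=None)[0]
print(np.linalg.norm(w - w_star))  # close to 0
```

With a full-rank A the objective is strongly convex, so the iterates converge linearly to the least-squares solution, which is the kind of quantitative guarantee the course makes precise.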
Exercise sheet 1 (basic probability revision, PAC-learnability.)
Exercise sheet 3 (cumulants, PAC-learnability in the presence of noise.)
Exercise sheet 5 (Gaussian width, growth function, VC-dimension.)
Exercise sheet 7 (VC-dimension, convex functions, constrained optimisation.)
Exercise sheet 8 (VC-dimension, soft SVM.)
Exercise sheet 9 (kernels.)
Exercise sheet 11 (boosting, subgradient, gradient descent.)
Exercise sheet 12 (boosting.)
Exercise sheet 13 (neural networks.)
Some of these exercises are taken from, or inspired by, the following resources:
- C.M. Bishop: Pattern recognition and machine learning.
- M. Mohri, A. Rostamizadeh and A. Talwalkar: Foundations of machine learning.
- Prof. Dr. Michael M. Wolf (TU München): webpage of the "Mathematical Foundations of Machine Learning" course.