RTG Lecture Series:

Introduction to Stein's Method

About the RTG Lecture Series

This lecture series is one of many events created as part of an NSF Research Training Group grant at the Department of Statistics and Operations Research at the University of North Carolina at Chapel Hill. The main goal of this grant is to develop a comprehensive training and mentoring program, for undergraduate and graduate students as well as postdoctoral associates, centered on the theme of theory and applications of networks. For more information, see the webpage.

Abstract

In this lecture series, I will give a gentle introduction to Stein’s method for proving distributional convergence. In 1972, Charles Stein combined Gaussian integration by parts with a certain “noise robustness” property of a sequence of random variables to derive a Central Limit Theorem (CLT) with an explicit bound on the rate of convergence. In particular, this gave a completely new way of proving the CLT by converting the problem into a question about the local behavior of the model. Over the last 50 years, this idea has been generalized and extended to a plethora of settings, such as different limiting distributions (Poisson, exponential, geometric, etc.), various dependency structures, Malliavin calculus, concentration inequalities, and more. More recently, Stein’s method has led to advances in computational statistics, machine learning, and sample quality testing. I will focus on the fundamentals of Stein’s method and present several approaches for establishing Berry-Esseen type theorems for sums of random variables under various dependency assumptions.
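
For orientation, here is a brief sketch of the standard identities at the heart of the method (the lectures develop these in detail). Stein’s lemma states that $Z \sim N(0,1)$ if and only if

\[
\mathbb{E}\bigl[f'(Z) - Z f(Z)\bigr] = 0
\]

for every absolutely continuous $f$ with $\mathbb{E}|f'(Z)| < \infty$. Given a test function $h$, one solves the Stein equation

\[
f_h'(x) - x f_h(x) = h(x) - \mathbb{E}\,h(Z),
\]

so that for any random variable $W$,

\[
\bigl|\mathbb{E}\,h(W) - \mathbb{E}\,h(Z)\bigr| = \bigl|\mathbb{E}\bigl[f_h'(W) - W f_h(W)\bigr]\bigr|.
\]

The right-hand side involves only $W$ itself, so the distance to the normal distribution can be bounded by exploiting the local dependency structure of the model; this is the route to Berry-Esseen type rates.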

Lecture notes

Here you can find my lecture notes (so far incomplete and containing some typos).

Collection of exercises for Stein’s Method 

Partha S. Dey and I are in the process of creating a large collection of exercises for Stein’s method. We hope that this will be of use to anyone interested in learning or teaching this subject. If you have an exercise or suggestion in mind, do not hesitate to contact me.

The current version of the collection is available here [last update: September 10, 2023].

So far we have collected 74 exercises.

I update this file regularly, so come back from time to time to see the newest version.