Drift Method - From Stochastic Networks to Machine Learning
(OIT 611)

Winter 2022, Instructor: Kuang Xu

Attention: If you are interested in taking this course in Winter 2022, please indicate your interest here. This will help us get a sense of the students' backgrounds and the class size.

Summary: This course is an exploration of the drift method: a family of simple, yet surprisingly powerful, meta-algorithms that in each step greedily and incrementally minimize a certain potential function. The drift method powers the design and analysis behind some of the most popular algorithmic paradigms in queueing networks, optimization, and machine learning, including MaxWeight, Back Pressure, EXP3, and Information-Directed Sampling, to name a few. Using the drift method as a unifying theme, we will explore major developments in these areas to understand what features explain the method's effectiveness, how we can evaluate its performance, and some of the emerging research topics.
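To make the drift idea concrete, here is a minimal, purely illustrative Python sketch (not taken from the lecture notes) of the principle behind MaxWeight: with the quadratic potential V(q) = 0.5 * sum_i q_i^2, the schedule that greedily minimizes the expected one-step drift of V is the one that maximizes the queue-weighted service rate q . s over the feasible service vectors. The function name and the toy two-queue example below are our own illustrations.

    import numpy as np

    def maxweight_schedule(queues, feasible_schedules):
        # Hypothetical helper: pick the feasible service vector s that maximizes
        # the weight q . s. With potential V(q) = 0.5 * sum_i q_i^2, this choice
        # greedily minimizes the expected one-step drift of V (up to terms that
        # do not depend on s), which is exactly the MaxWeight rule.
        queues = np.asarray(queues, dtype=float)
        weights = [queues @ np.asarray(s, dtype=float) for s in feasible_schedules]
        return feasible_schedules[int(np.argmax(weights))]

    # Toy example: two queues, only one may be served in a given time slot.
    q = [5, 2]
    schedules = [(1, 0), (0, 1)]
    print(maxweight_schedule(q, schedules))  # -> (1, 0): serve the longer queue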

We will develop rigorous probabilistic and optimization methodologies for answering these questions, such as Lyapunov stability theory, state-space collapse, and weak convergence. Applications to be covered include dynamic control and scheduling in queueing networks, delay and stability analysis of stochastic networks, stochastic approximation, online learning and multi-armed bandits. The course will be primarily taught in a lecture format, along with some guest lectures and student project presentations.

Objective: For students to acquire fundamental methodologies that they can apply in pursuing research topics in theoretical or applied areas.

Target Audience: The course is intended for PhD students in Business, Engineering and Economics. The students should have a good background in probability and stochastic processes (e.g., Stat 310A / MS&E 321). Most topics will be self-contained.

Meeting Time: Wednesdays, 1:30-4:30PM (with a 10 min break)

Lecture Notes
The lecture notes will be continually updated. If you are feeling generous, please report any errors and typos here.

Tentative topic outline:

  • Lecture 1: Introduction. Chapter 1.

  • Lecture 2: Probability and stochastic processes. Chapter 2.

  • Lecture 3: Introduction to stochastic networks. Chapter 3. [Python Notebook for M/M/1 example; a minimal simulation sketch also appears after this outline]

  • Lecture 4: Maximum stability. Chapter 4, through Section 4.3.

  • Lecture 5: The Foster-Lyapunov theorem. Chapter 5.

  • Lecture 6: Drift methods in complex systems via fluid limits. Chapter 6.

  • Lecture 7: Introduction to state-space collapse. Chapter 7.

  • Lecture 8: Finite-time state-space collapse. Chapter 7.

  • Lecture 9: Stationary Lyapunov analysis with state-space collapse (SSC). Chapter 8.

  • Lecture 10: Introduction to drift methods in learning. Chapter 9.

  • Lecture 11: A unified view via Blackwell approachability. Chapter 9.

  • Lecture 12: Bayesian bandit analysis via the information ratio, Part I. Chapter 10.

  • Lecture 13: Bayesian bandit analysis via the information ratio, Part II. Chapter 10.

  • Guest Lectures + Project presentations
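As a companion to the M/M/1 example referenced under Lecture 3, here is a minimal simulation sketch (our own illustration, not the course's Python notebook): it simulates the M/M/1 queue as a continuous-time Markov chain and compares the time-average number of jobs in the system against the steady-state value rho / (1 - rho).

    import numpy as np

    def simulate_mm1(lam=0.8, mu=1.0, horizon=200_000.0, seed=0):
        # Simulate an M/M/1 queue (Poisson arrivals at rate lam, exponential
        # services at rate mu) and return the time-average number in system.
        rng = np.random.default_rng(seed)
        t, q, area = 0.0, 0, 0.0  # current time, queue length, integral of q(t) dt
        while t < horizon:
            rate = lam + (mu if q > 0 else 0.0)
            dt = rng.exponential(1.0 / rate)
            area += q * dt
            t += dt
            # The next event is an arrival with probability lam / rate, else a departure.
            if rng.random() < lam / rate:
                q += 1
            else:
                q -= 1
        return area / t

    rho = 0.8
    print(simulate_mm1(lam=0.8, mu=1.0))  # should be close to rho / (1 - rho) = 4.0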

Guest Lectures:

  • Professor Yash Kanoria, Columbia University, Topic: TBD

  • Professor Benjamin Van Roy, Stanford University, Topic: TBD