Syllabus

Grades

Final project, 50%: The final project should involve reading one or more research papers related to the course, identifying an open research problem, and making progress towards resolving it. Around the middle of the semester, a one-page project proposal describing the problem, together with all relevant references, will be due.

Homework, 40%: There will be three homework sets over the semester. You are encouraged to work with others on the problems, but you must write up your solutions on your own.

Scribing and participation, 10%: Part of the grade will consist of typing up the lecture notes for one or two lectures so that they can be posted on the course website. Here is the LaTeX template for these notes.

List of Lectures

  1. Course overview

I. Undirected graphical models

  1. Definition: factorization and Markov properties

  2. Equivalence of definitions; Hammersley-Clifford theorem

  3. Faithfulness; semi-graphoids and graphoids

  4. Maximum likelihood estimation: Gaussian graphical models

  5. Maximum likelihood estimation: discrete graphical models

  6. The sum-product and junction-tree algorithms

  7. Algorithms for learning the graph structure I

  8. Algorithms for learning the graph structure II

II. Causal models

  1. Directed graphical models: factorization and Markov properties

  2. Equivalence of definitions; Markov equivalent graphs

  3. Algorithms for learning the graph structure I

  4. Algorithms for learning the graph structure II

  5. Structural equation models, interventions, counterfactuals

  6. Linear causal models: Gaussian and non-Gaussian

  7. Additive noise models

  8. Learning the graph structure using interventions

  9. Potential outcomes

  10. Hidden variable models: Simpson's paradox, instrumental variables

  11. Hidden variable models: Markov properties, m-separation, trek-separation

  12. Hidden variable models: other constraints; tensor decomposition

  13. Time series