Syllabus
Grades
Final project, 50%: The final project should involve reading one or more research papers related to the course, identifying an open research problem, and making progress toward resolving it. Around the middle of the semester, a one-page project proposal describing the problem, together with references, will be due.
Homework, 40%: There will be three homework sets over the course of the semester. You are encouraged to work with others on the problems, but you must write up the solutions on your own.
Scribing and participation, 10%: Part of the grade will consist of typing up the lecture notes for one or two lectures so that they can be posted on the course website. Here is the LaTeX template for these.
List of Lectures
Course overview
I. Undirected graphical models
Definition: factorization and Markov properties
Equivalence of definitions, Hammersley-Clifford Theorem
Faithfulness; semi-graphoids and graphoids
Maximum likelihood estimation: Gaussian graphical models
Maximum likelihood estimation: discrete graphical models
Sum-product and junction-tree algorithms
Algorithms for learning the graph structure I
Algorithms for learning the graph structure II
II. Causal models
Directed graphical models: factorization and Markov properties
Equivalence of definitions; Markov equivalent graphs
Algorithms for learning the graph structure I
Algorithms for learning the graph structure II
Structural equation models, interventions, counterfactuals
Linear causal models: Gaussian and non-Gaussian
Additive noise models
Learning the graph structure using interventions
Potential outcomes
Hidden variable models: Simpson's paradox, instrumental variables
Hidden variable models: Markov properties, m-separation, trek-separation
Hidden variable models: other constraints; tensor decomposition
Time series