This course provides a systematic view of a range of contemporary machine learning algorithms, as well as an introduction to the theoretical aspects of the subject. Topics covered include the statistical learning framework, estimation theory, model complexity, ensemble methods, mixture models, multilayer neural networks and deep learning, nonparametric methods, and active learning.
Appropriate for graduate students who have taken CMSC 25300/35300 (Mathematical Foundations of Machine Learning) or an equivalent course (e.g., the material covered in Part I of Mathematics for Machine Learning).
(updated 03/29/22) We are at capacity as of 03/28; please sign up for the waitlist here.
Instructor: Yuxin Chen <chenyuxin@uchicago.edu>
Teaching staff: Jingwen Jiang (Head TA) <jingwen@uchicago.edu>; Sue Parkinson <sueparkinson@uchicago.edu>; Huanlin Zhou <hlzhou@uchicago.edu>; Fuwei Yu <fwy@uchicago.edu>
Format: Tu/Th, 9:30-10:50am CT @ STU 102. Class will meet in person starting 03/29/2022. A Zoom option is available; please retrieve the meeting links on Canvas.
Office hours (Zoom link provided on Canvas):
M: 9:30am-10:30am (Sue)
Tu: 5pm-6pm (Huanlin)
W: 9:30am-10:30am (Sue)
Th: 8am-9am (Yuxin)
F: 10am-11am (Jingwen)
Sa: 11am-12pm (Huanlin)
Discussion and Q&A: Via Ed Discussion (link provided on Canvas). The platform is designed to get you help quickly and efficiently from classmates, the TAs, and the instructors. Rather than emailing questions to the teaching staff, we encourage you to post them on Ed Discussion. New users can consult the quick start guide: https://edstem.org/quickstart/ed-discussion.pdf
Assignment & Grading: Via Gradescope (link provided on Canvas)
Email policy: We will prioritize questions posted to Ed Discussion over individual emails.
Tentative schedule:
Week 1: The statistical learning framework, bias-variance tradeoffs
Week 2: Point estimation (MOME, MLE, MAP)
Week 3: Model complexity
Week 4: Classification (logistic regression, naive Bayes, LDA)
Week 5: Ensemble methods (bagging, random forests, boosting)
Week 6: Mixture models and graphical models (mixtures of Gaussians)
Week 7: Multi-layer perceptrons and neural networks
Week 8: Nonparametric models (kernel ridge regression, Gaussian processes)
Week 9: Support vector machines; active learning
Textbook: Pattern Recognition and Machine Learning; by Christopher Bishop. Springer, 2006. The textbook will be supplemented with additional notes and readings.
Optional supplementary materials:
Probabilistic Machine Learning: An Introduction; by Kevin Patrick Murphy. MIT Press, 2021.
Understanding Machine Learning: From Theory to Algorithms; by Shai Shalev-Shwartz and Shai Ben-David. Cambridge University Press, 2014.
Pattern Classification; by Richard O. Duda, Peter E. Hart, and David G. Stork. Wiley, 2nd edition, 2001.
Mathematics for Machine Learning; by Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong. Cambridge University Press, 2020.
Previous offering: Spring 2020, co-taught by Prof. Rebecca Willett and Prof. Yuxin Chen.