This course provides a systematic view of a range of contemporary machine learning algorithms, as well as an introduction to the theoretical aspects of the subject. Topics covered include the statistical learning framework, estimation theory, model complexity, ensemble methods, mixture models, multilayer neural networks and deep learning, nonparametric methods, and active learning.
Appropriate for graduate students who have taken Statistics 27700 & CMSC 25300/35300 (Mathematical Foundations of Machine Learning) or equivalent.
Instructor: Yuxin Chen <chenyuxin@uchicago.edu>
Teaching staff: Lang Yu (head TA) <langyu@uchicago.edu>; Truong Son Hy <hytruongson@uchicago.edu>; Yibo Jiang <yiboj@uchicago.edu>
Format: Pre-recorded lectures plus live Zoom discussions during class time and office hours. Rather than delivering lectures live, we will use the scheduled lecture slots for Zoom sessions where you can ask the instructor questions directly.
Lecture hours: Tu/Th, 9:40-11am CT via Zoom (starting 01/12/2021). Zoom meeting links are posted on Canvas.
Office hours: M/W/F/Sa (starting 01/13/2021)
Monday 1-2pm (Yibo Jiang)
Wednesday 1:50-2:50pm (Yibo Jiang)
Friday 11am-noon (Lang Yu)
Saturday 1-3pm (Truong Son Hy)
Announcements: We use Canvas as the centralized platform for course materials and announcements. Link: https://canvas.uchicago.edu/courses/33142
Discussion and Q&A: Via Ed Discussion (link provided on Canvas). Ed Discussion is designed to get you help quickly and efficiently from classmates, the TAs, and the instructor. Rather than emailing questions to the teaching staff, we encourage you to post them on Ed Discussion. For new users, see the quick start guide: https://edstem.org/quickstart/ed-discussion.pdf
Assignments & Grading: Via Gradescope (link provided on Canvas)
Email policy: We will prioritize questions posted to Ed Discussion over individual emails.
Week 1: The statistical learning framework, bias-variance tradeoffs
Week 2: Point estimation (MOME, MLE, MAP)
Week 3: Model complexity
Week 4: Classification (logistic regression, naive Bayes, LDA)
Week 5: Ensemble methods (bagging, random forests, boosting)
Week 6: Graphical models (mixtures of Gaussians)
Week 7: Multi-layer perceptrons and neural networks
Week 8: Nonparametric models (kernel ridge regression, Gaussian processes)
Week 9: Support vector machines; active learning
Textbook: Pattern Recognition and Machine Learning; by Christopher Bishop. The textbook will be supplemented with additional notes and readings.
Optional supplementary materials:
Probabilistic Machine Learning: An Introduction; by Kevin Patrick Murphy, MIT Press, 2021.
Understanding Machine Learning; by Shai Shalev-Shwartz and Shai Ben-David
Pattern Classification; by Duda, Hart, and Stork
Mathematics for Machine Learning; by Marc Peter Deisenroth, A Aldo Faisal, and Cheng Soon Ong. Cambridge University Press, 2020.
Previous offering: Spring 2020, co-taught by Prof. Rebecca Willett and Prof. Yuxin Chen.