CS294-158-SP19
Deep Unsupervised Learning
Spring 2019
Note: the Spring 2020 offering of the course is hosted here
About: This course will cover two areas of deep learning in which labeled data is not required: Deep Generative Models and Self-supervised Learning. Recent advances in generative models have made it possible to realistically model high-dimensional raw data such as natural images, audio waveforms, and text corpora. Strides in self-supervised learning have started to close the gap between supervised and unsupervised representation learning in terms of fine-tuning performance on unseen tasks. This course will cover the theoretical foundations of these topics as well as their newly enabled applications.
Instructors: Pieter Abbeel, Peter Chen, Jonathan Ho, Aravind Srinivas
Communication: https://piazza.com/class/jr8swguu59nem (Spring 2020 offering: https://piazza.com/berkeley/spring2020/cs294158)
(Alternative: cs294-158-staff@lists.berkeley.edu)
Lectures:
When: Wednesdays 5-8pm; first lecture on 1/30
Where: Moffitt Library 145 [Cal ID required to enter the library]
Tentative list of topics (subject to change):
Generative adversarial networks, variational autoencoders, autoregressive models, flow models, energy based models, compression, self-supervised learning, semi-supervised learning.
Prerequisites: significant experience with probability, optimization, deep learning
Office Hours
(starting week of 2/4)
Jonathan: Mondays 4-5pm 734 Sutardja Dai
Aravind: Tuesdays 9-10am 734 Sutardja Dai
Pieter: Thursdays 9-10am 746 Sutardja Dai [but there will be no office hours on Thu 4/11 and Thu 4/18; instead there will be office hours Wed 4/17 11-noon]
Peter: Thursdays 6-7pm 734 Sutardja Dai
Homework
HW1: Autoregressive Models (due 2/11) HW1.PDF, HW1_template.tex, mnist-hw1.pkl, distribution.npy
HW2: Flow Models (due 2/26) HW2.PDF, HW2_template.tex, hw2_q2.pkl
HW3: Latent Variable Models (due 3/14) HW3.PDF, HW3_template.tex, hw3_q2.pkl
HW4: Implicit Models (due 4/9) HW4.PDF, HW4_template.tex
Tentative Schedule / Syllabus
Week 1 (1/30) [youtube]
Lecture 1c: Likelihood-based Models Part I: Autoregressive Models
Week 2 (2/6) [youtube]
Lecture 2a: Likelihood-based Models Part I: Autoregressive Models (ctd) (same slides as week 1)
Lecture 2b: Lossless Compression
Lecture 2c: Likelihood-based Models Part II: Flow Models
Week 3 (2/13) [youtube]
Lecture 3a: Likelihood-based Models Part II: Flow Models (ctd) (same slides as week 2)
Lecture 3b: Latent Variable Models - part 1
Week 4 (2/20) [youtube]
Lecture 4a: Latent Variable Models - part 2
Week 5 (2/27) [youtube]
Lecture 5a: Latent Variable Models - wrap-up (same slides as Latent Variable Models - part 2)
Lecture 5b: ANS coding (same slides as bits-back coding)
Lecture 5c: Implicit Models / Generative Adversarial Networks
Week X (3/6)
Final Project Discussion
Week 6 (3/13) [youtube]
Lecture 6a: Implicit Models / Generative Adversarial Networks (ctd) (same slides as 5c)
Lecture 6b: Non-Generative Representation Learning [UPDATED 3/24]
Week 7 (3/20) [youtube]
Lecture 7: Non-Generative Representation Learning (same slides as 6b)
Spring Break Week (3/27)
you are on your own :)
Week 8 (4/3)
Lecture 8a: Strengths/Weaknesses of Unsupervised Learning Methods Covered Thus Far
Lecture 8b: Semi-Supervised Learning
Lecture 8c: Guest Lecture by Ilya Sutskever
Week 9 (4/10)
Lecture 9a: Unsupervised Distribution Alignment
Lecture 9b: Guest Lecture by Alyosha Efros
Week 10 (4/17) [youtube]
Lecture 10: Language Models (Alec Radford)
Week 11 (4/24) [youtube]
Lecture 11: Representation Learning in Reinforcement Learning
Week 12 (5/1) [youtube]
Lecture 12: Guest Lecture by Aaron van den Oord [slides not available]
Week 13 (5/8)
RRR week: no lecture
Week 14 (5/15)
Final Project Presentations
FAQ
Q: How do I get into this course?
A: Please fill out this survey, which we will use for admissions. (deadline passed)
Q: Can undergraduates take this course?
A: This course is targeted towards a PhD-level audience. That said, exceptional undergraduates can certainly be good fits too: your ability to take this course is determined not by your grad/undergrad student status but by the factors we measure in the survey and by your performance on Homework 1.
Q: Is this a real course or a seminar?
A: This is a real course. Instructors will give most of the lectures. There will be substantial homework. There will be a substantial final project.
Q: How will grading work?
A: Details are still to be determined, but we expect grades to be based largely on 3-5 substantial homeworks plus a final project.
Q: I already want to start learning now, what can I do, can you point me to some research papers maybe?
A: Certainly. Here is a zip file with about 100 papers highly on-topic for this course. Happy reading!