Artificial Neural Networks
May 17 (Finals Week)
Reminder: the Final Exam is scheduled for 5:00pm Monday, a half-hour earlier than our regular class meeting.
Tasks
Work with the group in your breakout room on the in-class exercise:
Group Final Exam Review
May 10 / May 12 (Week 16)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 8.5 - Visualization and Unsupervised Learning
Section 8.6 - Applications of Convolutional Networks
Section 10.4 - Generative Adversarial Networks (GANs)
If you'd like to learn more about Game Theory...
Read David K. Levine's article What is Game Theory?
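The game-theory connection is not incidental: Section 10.4 frames GAN training as a two-player minimax game over a single value function. As a toy illustration (a sketch with made-up discriminator scores, not code from the textbook), here is that value function in NumPy:

```python
import numpy as np

# GAN value function: V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))].
# The discriminator D plays to maximize V; the generator G plays to minimize it.
# The probabilities below are invented for illustration.

d_real = np.array([0.90, 0.80, 0.95])  # D's scores on real samples (D wants these near 1)
d_fake = np.array([0.10, 0.20, 0.05])  # D's scores on fakes (D wants these near 0)

value = np.mean(np.log(d_real)) + np.mean(np.log(1.0 - d_fake))
print(f"V(D, G) = {value:.3f}")

# At the game's equilibrium, D outputs 0.5 everywhere and V = 2 * log(0.5):
print(f"equilibrium value = {2 * np.log(0.5):.3f}")
```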
Tasks
Submit Project 7 by Wednesday.
Study the following for the Final:
The Reading sections listed for weeks 8-16
The following exercises:
Section 4.13 - Exercises 1, 6, 8, 9, 10, 11
Section 7.10 - Exercises 4, 5, 6, 8
Section 8.9 - Exercises 1, 2, 3, 4, 5, 7, 8
Complete the Final Exam before class next Monday.
The exam will be available beginning Wednesday evening.
Even if you plan to wait until the last moment to take the exam, be sure to read the instructions posted to Canvas as soon as they are available.
May 3 / May 5 (Week 15)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 8.2 - The Basic Structure of a Convolutional Network
Section 8.3 - Training a Convolutional Network
Section 8.4 - Case Studies of Convolutional Architectures
Tasks
Download ImagePlay and experiment with the 2D Convolution filter.
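If ImagePlay isn't convenient on your platform, a few lines of NumPy give the same intuition. Below is a minimal sketch of the valid-mode (no padding, stride 1) 2D convolution such a filter applies; the image and kernel are invented for illustration:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D convolution: slide the flipped kernel over the image."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    flipped = kernel[::-1, ::-1]  # true convolution flips the kernel;
                                  # many image tools skip this and correlate
    out = np.empty((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i+kh, j:j+kw] * flipped)
    return out

# A toy 6x6 "image": dark left half, bright right half.
image = np.zeros((6, 6))
image[:, 3:] = 1.0

# A vertical-edge kernel: responds where brightness changes left to right.
kernel = np.array([[-1.0, 0.0, 1.0],
                   [-1.0, 0.0, 1.0],
                   [-1.0, 0.0, 1.0]])

print(conv2d(image, kernel))  # large-magnitude responses along the edge
```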
April 26 / April 28 (Week 14)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 7.6 - Gated Recurrent Units (GRUs)
Section 7.7 - Applications of Recurrent Neural Networks
Section 8.1 - Introduction
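If you'd like a concrete reference point for Section 7.6, here is one GRU step in NumPy. It follows the standard gating equations; the dimensions and initialization are made up, and bias terms are omitted to keep the sketch short:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GRU step (biases omitted for brevity)."""
    z = sigmoid(Wz @ x + Uz @ h_prev)              # update gate: how much to overwrite
    r = sigmoid(Wr @ x + Ur @ h_prev)              # reset gate: how much past to use
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h_prev))  # candidate state
    return (1.0 - z) * h_prev + z * h_tilde        # blend old state with candidate

rng = np.random.default_rng(0)
d_in, d_h = 4, 3
Wz, Wr, Wh = [rng.normal(scale=0.1, size=(d_h, d_in)) for _ in range(3)]
Uz, Ur, Uh = [rng.normal(scale=0.1, size=(d_h, d_h)) for _ in range(3)]
x, h = rng.normal(size=d_in), np.zeros(d_h)
print(gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh))
```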
Tasks
Submit Project 6 by Wednesday.
Begin work on Project 7, due May 12.
April 19 / April 21 (Week 13)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 7.1 - Introduction
Section 7.2 - The Architecture of Recurrent Neural Networks
Section 7.5 - Long Short-Term Memory (LSTM)
Read the following article by Christopher Olah:
Understanding LSTM Networks
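As a companion to Olah's diagrams, here is a single LSTM step in NumPy. This is a minimal sketch of the standard cell (biases omitted, dimensions invented), not code from the article or the textbook:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U):
    """One LSTM step. W maps the input, U maps the previous hidden state;
    their rows are stacked [forget; input; output; candidate] for brevity."""
    d = h_prev.size
    pre = W @ x + U @ h_prev
    f = sigmoid(pre[0:d])      # forget gate: what to erase from the cell state
    i = sigmoid(pre[d:2*d])    # input gate: what to write
    o = sigmoid(pre[2*d:3*d])  # output gate: what to reveal
    g = np.tanh(pre[3*d:4*d])  # candidate values
    c = f * c_prev + i * g     # cell state: the "conveyor belt"
    h = o * np.tanh(c)         # hidden state exposed to the next layer
    return h, c

rng = np.random.default_rng(1)
d_in, d_h = 4, 3
W = rng.normal(scale=0.1, size=(4 * d_h, d_in))
U = rng.normal(scale=0.1, size=(4 * d_h, d_h))
x, h, c = rng.normal(size=d_in), np.zeros(d_h), np.zeros(d_h)
h, c = lstm_step(x, h, c, W, U)
print(h, c)
```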
April 12 / April 14 (Week 12)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 6.1 - Introduction
Section 6.2 - Hopfield Networks
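To see why Section 6.2 treats these networks as associative memories, try this minimal sketch: store one bipolar pattern with the Hebbian outer-product rule, corrupt two bits, and watch the update rule recover it. The pattern is invented; this is not the textbook's code:

```python
import numpy as np

# Store one bipolar pattern via the Hebbian outer-product rule.
pattern = np.array([1, -1, 1, -1, 1, -1])
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0.0)  # no self-connections

# Corrupt two bits, then apply synchronous updates until the state is stable.
state = pattern.copy()
state[0] *= -1
state[3] *= -1
for _ in range(10):
    new_state = np.where(W @ state >= 0, 1, -1)
    if np.array_equal(new_state, state):
        break
    state = new_state

print(state)                           # the stored pattern, recovered
print(np.array_equal(state, pattern))  # True
```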
Tasks
Submit Project 5 by Wednesday.
Begin work on Project 6, due April 28.
April 5 / April 7 (Week 11)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 2.5 - Matrix Factorization with Autoencoders
Section 4.7 - Unsupervised Pretraining
Section 4.10 - Regularization in Unsupervised Applications
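Section 2.5's central claim, that a linear autoencoder performs matrix factorization, can be checked numerically. The sketch below skips training and uses the SVD directly, since that is the solution a linear autoencoder with squared loss converges to (up to a rotation of the hidden units); the matrix and rank are invented:

```python
import numpy as np

rng = np.random.default_rng(2)
D = rng.normal(size=(20, 8))  # a made-up data matrix
k = 3                         # hidden-layer width of the hypothetical autoencoder

# Best rank-k approximation of D, via truncated SVD.
U, s, Vt = np.linalg.svd(D, full_matrices=False)
D_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Encoder/decoder view: project onto the top-k right singular vectors and back.
encode = lambda X: X @ Vt[:k, :].T
decode = lambda Z: Z @ Vt[:k, :]
print(np.allclose(decode(encode(D)), D_k))  # True: encode/decode = rank-k factorization
print(f"rank-{k} reconstruction error: {np.linalg.norm(D - D_k):.3f}")
```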
Tasks
Submit Project 4 by Wednesday.
March 29 / March 31 (Week 10)
Spring Break - No class.
If you would like to get a head start on the rest of the semester...
Read the following sections of Neural Networks and Deep Learning: A Textbook:
April 5 / April 7 (Week 11)
Section 2.5 - Matrix Factorization with Autoencoders
Section 4.7 - Unsupervised Pretraining
Section 4.10 - Regularization in Unsupervised Applications
April 12 / April 14 (Week 12)
Section 6.1 - Introduction
Section 6.2 - Hopfield Networks
April 19 / April 21 (Week 13)
Section 7.1 - Introduction
Section 7.2 - The Architecture of Recurrent Neural Networks
Section 7.5 - Long Short-Term Memory (LSTM)
April 26 / April 28 (Week 14)
Section 7.6 - Gated Recurrent Units (GRUs)
Section 7.7 - Applications of Recurrent Neural Networks
Section 8.1 - Introduction
May 3 / May 5 (Week 15)
Section 8.2 - The Basic Structure of a Convolutional Network
Section 8.3 - Training a Convolutional Network
Section 8.4 - Case Studies of Convolutional Architectures
May 10 / May 12 (Week 16)
Section 8.5 - Visualization and Unsupervised Learning
Section 8.6 - Applications of Convolutional Networks
Section 10.4 - Generative Adversarial Networks (GANs)
March 22 (Week 9)
Tasks
Complete the Midterm Exam before class.
Work with a group on the in-class exercise:
Group Midterm Exam Review
March 24 (Week 9)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 3.6 - Batch Normalization
Section 4.1 - Introduction
Section 4.2 - The Bias-Variance Trade-Off
Section 4.3 - Generalization Issues in Model Tuning and Evaluation
Section 4.5 - Ensemble Methods
Section 4.6 - Early Stopping
If you're interested in what Batch Normalization is really doing...
Shibani Santurkar, Dimitris Tsipras, Andrew Ilyas, and Aleksander Mądry. 2018. How does batch normalization help optimization? In Proceedings of the 32nd International Conference on Neural Information Processing Systems (NIPS'18). Curran Associates Inc., Red Hook, NY, USA, 2488–2498.
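Before tackling that paper, it helps to have the mechanics of Section 3.6 down cold. A minimal sketch of the training-time batch-normalization forward pass (the variable names and epsilon are conventional choices, not the textbook's code):

```python
import numpy as np

def batch_norm_forward(X, gamma, beta, eps=1e-5):
    """Training-time batch norm: standardize each feature over the mini-batch,
    then let the learnable gamma and beta restore any useful scale and shift."""
    mu = X.mean(axis=0)   # per-feature mean over the batch
    var = X.var(axis=0)   # per-feature variance over the batch
    X_hat = (X - mu) / np.sqrt(var + eps)
    # (At inference time, running averages of mu and var replace batch statistics.)
    return gamma * X_hat + beta

rng = np.random.default_rng(3)
X = rng.normal(loc=5.0, scale=3.0, size=(32, 4))  # a badly scaled mini-batch
out = batch_norm_forward(X, gamma=np.ones(4), beta=np.zeros(4))
print(out.mean(axis=0).round(6))  # ~0 per feature
print(out.std(axis=0).round(3))   # ~1 per feature
```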
Tasks
Begin work on Project 5, due April 14.
March 15 / March 17 (Week 8)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 3.3 - Setup and Initialization Issues
Section 3.4 - The Vanishing and Exploding Gradient Problems
Section 3.5 - Gradient-Descent Strategies
Section 3.5.1 - Learning Rate Decay
Section 3.5.2 - Momentum-Based Learning
Section 3.5.3 - Parameter-Specific Learning Rates
Read the following article by Matt Mazur:
A Step by Step Backpropagation Example
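Sections 3.5.1 and 3.5.2 are easier to absorb with the update rules running in front of you. Here is a minimal sketch of gradient descent with momentum and exponential learning-rate decay on a toy quadratic; every hyperparameter value is invented for illustration:

```python
import numpy as np

# Minimize f(w) = 0.5 * w^T A w, a badly conditioned quadratic bowl.
A = np.diag([1.0, 25.0])           # condition number 25: a narrow valley
w = np.array([2.0, 2.0])
v = np.zeros_like(w)               # momentum "velocity"

lr0, decay, beta = 0.03, 0.01, 0.9
for t in range(200):
    grad = A @ w                   # gradient of the quadratic
    lr = lr0 * np.exp(-decay * t)  # Section 3.5.1: exponential learning-rate decay
    v = beta * v - lr * grad       # Section 3.5.2: momentum accumulates past gradients
    w = w + v

print(w.round(4))                  # close to the minimum at the origin
```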
Tasks
Submit Project 3 by Wednesday.
Study the following for the Midterm:
The Reading sections listed for weeks 1-7
The following exercises:
Section 1.11 - Exercises 1, 4, 5, 9
Section 3.10 - Exercises 1, 2, 3, 4
Complete the Midterm Exam before class next Monday.
The exam will be available beginning Wednesday evening.
Even if you plan to wait until the last moment to take the exam, be sure to read the instructions posted to Canvas as soon as they are available.
Begin work on Project 4, due April 7.
March 8 / March 10 (Week 7)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 3.1 - Introduction
Section 3.2 - Backpropagation: The Gory Details
Read the following article by Christopher Olah:
If you'd like to learn more about Information Theory...
Read the following section of Pattern Recognition and Machine Learning by Christopher Bishop:
Section 1.6 - Information Theory
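The "gory details" of Section 3.2 reduce to repeated applications of the chain rule. The sketch below runs backpropagation by hand through a two-weight network and confirms the result with a finite-difference check; the architecture and numbers are invented for illustration:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, w1, w2):
    """Tiny network: x -> sigmoid(w1 * x) -> w2 * h -> squared loss vs target 1."""
    h = sigmoid(w1 * x)
    y = w2 * h
    loss = 0.5 * (y - 1.0) ** 2
    return h, y, loss

x, w1, w2 = 0.5, 0.8, -1.2

# Backward pass: the chain rule, applied layer by layer.
h, y, loss = forward(x, w1, w2)
dy = y - 1.0                  # dL/dy
dw2 = dy * h                  # dL/dw2 = dL/dy * dy/dw2
dh = dy * w2                  # dL/dh
dw1 = dh * h * (1.0 - h) * x  # dL/dw1, using sigmoid'(z) = h * (1 - h)

# Finite-difference check on w1: should match dw1 closely.
eps = 1e-6
num = (forward(x, w1 + eps, w2)[2] - forward(x, w1 - eps, w2)[2]) / (2 * eps)
print(f"analytic dw1 = {dw1:.8f}, numeric dw1 = {num:.8f}")
```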
March 1 / March 3 (Week 6)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 2.1 - Introduction
Section 2.2 - Neural Architectures for Binary Classification Models
Section 2.3 - Neural Architectures for Multiclass Models
If you'd like to learn more about the Fisher discriminant...
Read the following section of Pattern Recognition and Machine Learning by Christopher Bishop:
Section 4.1.4 - Fisher's linear discriminant
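The jump from Section 2.2's binary models to Section 2.3's multiclass models is mostly a change of output layer: the sigmoid becomes a softmax. A minimal sketch of a softmax output with its cross-entropy loss (the logits and class count are made up):

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax: shift by the max so exponentials can't overflow."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Made-up logits for a 4-class problem; the true class is index 2.
logits = np.array([1.0, -0.5, 2.0, 0.3])
probs = softmax(logits)
loss = -np.log(probs[2])            # cross-entropy with a one-hot target

print(probs.round(3), probs.sum())  # a valid distribution: sums to 1
print(f"cross-entropy loss = {loss:.3f}")
```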
February 22 / February 24 (Week 5)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.5 - The Secrets to the Power of Function Composition
Section 1.6 - Common Neural Architectures
Section 1.7 - Advanced Topics
Section 1.7.3 - Generative Adversarial Networks
Section 1.8 - Two Notable Benchmarks
Tasks
Take a look at The Asimov Institute's Neural Network Zoo.
February 15 (Week 4)
Presidents Day - No class.
February 17 (Week 4)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.4 - Practical Issues in Neural Network Training
Tasks
Submit Project 1 by Wednesday.
Begin work on Project 2, due March 3.
February 8 / February 10 (Week 3)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.2.2 - Multilayer Neural Networks
Section 1.2.3 - The Multilayer Network as a Computational Graph
Section 1.3 - Training a Neural Network with Backpropagation
If you're interested in the research comparing the power of deep networks to shallow networks...
Hrushikesh Mhaskar, Qianli Liao, and Tomaso Poggio. 2017. When and why are deep networks better than shallow ones? In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI'17). AAAI Press, 2343–2349.
Ronen Eldan and Ohad Shamir. 2016. The power of depth for feedforward neural networks. In Proceedings of the 29th Annual Conference on Learning Theory (COLT '16). JMLR Workshop and Conference Proceedings, vol. 49, 1-34.
Tasks
Bookmark the following section of Neural Networks and Deep Learning: A Textbook for future reference:
Section 1.2.1.6 - Some Useful Derivatives of Activation Functions
Consider signing up to attend the Zoom webinar for Geoffrey Hinton's lecture at the National Science Foundation on Thursday morning, 2/11.
February 1 / February 3 (Week 2)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.2.1 - Single Computational Layer: The Perceptron
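The perceptron of Section 1.2.1 is the one model in the course whose entire training algorithm fits in a few lines. A minimal sketch of the classic mistake-driven update rule on an invented, linearly separable dataset:

```python
import numpy as np

# Toy 2D points with labels in {-1, +1}; the last input component is a
# constant 1 so the bias lives inside the weight vector.
X = np.array([[ 2.0,  1.0, 1.0],
              [ 1.0,  3.0, 1.0],
              [-1.0, -1.0, 1.0],
              [-2.0,  1.0, 1.0]])
y = np.array([1, 1, -1, -1])
w = np.zeros(3)

# Classic perceptron rule: update only on mistakes.
for epoch in range(20):
    mistakes = 0
    for xi, yi in zip(X, y):
        if yi * (w @ xi) <= 0:  # misclassified (or on the boundary)
            w += yi * xi        # nudge the boundary toward the example
            mistakes += 1
    if mistakes == 0:
        break                   # converged: the data is linearly separable

print(w, np.sign(X @ w) == y)   # all True once converged
```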
Tasks
Begin work on Project 1, due February 17.
January 25 (Week 1)
Reading
Read the Syllabus.
Read the following articles from blogs.nvidia.com:
Tasks
Participate in the breakout room icebreaker:
Introduce yourself. Tell everyone your name, and tell them about your previous experience with machine learning and neural networks.
January 27 (Week 1)
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.1 - Introduction
Tasks
Work with the group in your breakout room on the in-class exercise:
Syllabus Questions