Artificial Neural Networks
May 15 and May 17 - Finals Week
No class
Tasks
Submit the Research Paper by May 20.
May 8 and May 10 - Week 16
Reading
Chapter 7 - Graph Neural Networks
May 1 and May 3 - Week 15
Reading
Chapter 9 - Generative Adversarial Networks
9.7 - GAN Training
9.8 - GAN Losses
9.9 - GAN Architectures
9.10 - Evaluation
9.11 - Applications
9.12 - Software Libraries, Benchmarks, and Visualization
Tasks
Submit discussion questions for Week 15.
Complete the team evaluation survey for Project 2.
Begin work on the Group Project 2 - Review assignment.
Begin work on the Research Paper.
If you'd like to experiment with a GAN...
Try GAN Lab, as seen in Section 9.12 of the textbook.
April 24 and April 26 - Week 14
Reading
Chapter 9 - Generative Adversarial Networks
9.1 - Introduction
9.2 - Minimax Optimization
9.3 - Divergence between Distributions
9.4 - Optimal Objective Value
9.5 - Gradient Descent Ascent
9.6 - Optimistic Gradient Descent Ascent
Tasks
Submit discussion questions for Week 14.
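If you want to see the difference between Sections 9.5 and 9.6 concretely, below is a minimal Python sketch (function names and step sizes are mine, not the textbook's) comparing gradient descent ascent with its optimistic variant on the bilinear game f(x, y) = x·y, whose unique equilibrium is (0, 0).

```python
# Toy comparison of gradient descent ascent (GDA) vs. optimistic GDA (OGDA)
# on f(x, y) = x * y: x descends, y ascends. Equilibrium at (0, 0).
# Illustrative sketch only; step size and iteration count are arbitrary.

def gda(x, y, lr=0.1, steps=100):
    for _ in range(steps):
        gx, gy = y, x                        # df/dx = y, df/dy = x
        x, y = x - lr * gx, y + lr * gy      # simultaneous updates
    return x, y

def ogda(x, y, lr=0.1, steps=100):
    # Optimistic GDA: extrapolate using the previous gradient,
    # i.e. step with (2 * g_t - g_{t-1}) instead of g_t.
    gx_prev, gy_prev = 0.0, 0.0
    for _ in range(steps):
        gx, gy = y, x
        x_new = x - lr * (2 * gx - gx_prev)
        y_new = y + lr * (2 * gy - gy_prev)
        x, y = x_new, y_new
        gx_prev, gy_prev = gx, gy
    return x, y

print(gda(1.0, 1.0))    # spirals away from (0, 0)
print(ogda(1.0, 1.0))   # moves toward (0, 0)
```

Plain GDA cycles outward on this game no matter how small the step size, while the optimistic correction pulls the iterates toward the equilibrium.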
April 17 and April 19 - Week 13
Reading
Read the following sections of The Science of Deep Learning:
Chapter 8 - Transformers
Tasks
Submit discussion questions for Week 13.
Review Problems 89 - 96 of Exercises and Solutions.
Note: This is listed as Chapter 10 in Exercises and Solutions rather than as Chapter 8.
For additional diagrams of Transformer architectures...
See the following:
Figures 1 and 2 of the original Transformers paper: Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. Attention is all you need. Advances in Neural Information Processing Systems 30 (2017).
Chapter 11 of Deep Learning with Python, Second Edition by François Chollet
Chapter 20 of Deep Learning: A Visual Approach by Andrew Glassner
Note: If the last two links redirect you to a page asking you to "Start your free trial," visit https://libraryguides.fullerton.edu/oreilly first, log in with your student email address, then try again.
April 10 and April 12 - Week 12
Reading
Read the following sections of The Science of Deep Learning:
Chapter 6 - Sequence Models
6.4 - Gated Recurrent Unit
6.5 - Long Short-Term Memory
Tasks
Submit discussion questions for Week 12.
Work through Problems 47, 50, and 51 of Exercises and Solutions.
Contact the members of your group for Group Project 2.
Begin work on Group Project 2.
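As a reference point while reading Section 6.5, here is a minimal NumPy sketch of a single LSTM step. The gate names are the standard ones; the parameter layout and the random toy values are illustrative choices of mine, not the textbook's notation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell(x, h_prev, c_prev, params):
    """One LSTM time step: gates read [h_{t-1}; x_t], update cell state c."""
    Wf, Wi, Wo, Wc, bf, bi, bo, bc = params
    z = np.concatenate([h_prev, x])        # stacked input [h_{t-1}; x_t]
    f = sigmoid(Wf @ z + bf)               # forget gate
    i = sigmoid(Wi @ z + bi)               # input gate
    o = sigmoid(Wo @ z + bo)               # output gate
    c_tilde = np.tanh(Wc @ z + bc)         # candidate cell state
    c = f * c_prev + i * c_tilde           # new cell state
    h = o * np.tanh(c)                     # new hidden state
    return h, c

# Tiny example: hidden size 2, input size 3, random weights, zero biases.
rng = np.random.default_rng(0)
H, X = 2, 3
params = [rng.standard_normal((H, H + X)) for _ in range(4)] + \
         [np.zeros(H) for _ in range(4)]
h, c = lstm_cell(rng.standard_normal(X), np.zeros(H), np.zeros(H), params)
print(h, c)
```

Tracing how f and i mix the old cell state with the candidate is a good warm-up for the gating questions in the exercises.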
April 3 and April 5 - Week 11
Reading
Read the following sections of The Science of Deep Learning:
Chapter 6 - Sequence Models
6.3 - Recurrent Neural Network
6.6 - Sequence to Sequence
6.7 - Attention
6.9 - Introduction to Transformers
Tasks
Complete the team evaluation survey for Project 1.
Begin work on the Group Project 1 - Review assignment.
Submit discussion questions for Week 11.
March 27 and March 29 - Week 10
Spring Break - No class
March 20 and March 22 - Week 9
Reading
Read the following sections of The Science of Deep Learning:
Chapter 6 - Sequence Models
6.1 - Introduction
6.2 - Natural Language Models
6.8 - Embeddings
Chapter 10 - Variational Autoencoders
10.3.1 - Autoencoder
Tasks
Work through Problems 49, 52, and 53 of Exercises and Solutions.
Examine the code for Problems 50 - 54 and 68 in the Programming Exercises and Sample Code.
Submit discussion questions for Week 9.
March 13 and March 15 - Week 8
Reading
Read the following sections of The Science of Deep Learning:
Chapter 5 - Convolutional Neural Networks
5.4 - Example
5.5 - Architectures
5.6 - Applications
Tasks
Submit discussion questions for Week 8.
For more on CNN architectures...
See Cong, S., Zhou, Y. A review of convolutional neural network architectures and their optimizations. Artif Intell Rev 56, 1905–1969 (2023). https://doi.org/10.1007/s10462-022-10213-5
March 6 and March 8 - Week 7
Reading
Read the following sections of The Science of Deep Learning:
Chapter 5 - Convolutional Neural Networks
5.1 - Introduction
5.2 - Convolution
5.3 - Layers
Tasks
Download ImagePlay and experiment with the 2D Convolution filter.
Submit discussion questions for Week 7.
Contact the members of your group for Group Project 1.
Begin work on Group Project 1.
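If you would rather experiment in code than (or in addition to) ImagePlay, here is a minimal sketch of a 2D convolution (strictly, cross-correlation, as is conventional in deep learning) applied with a vertical edge-detection kernel. The image and kernel values are illustrative, not taken from the textbook.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation (what deep learning calls convolution)."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Element-wise product of the kernel with each image patch.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A vertical edge: left half dark (0), right half bright (1).
image = np.zeros((5, 6))
image[:, 3:] = 1.0

# Sobel-style vertical edge detector.
kernel = np.array([[-1, 0, 1],
                   [-2, 0, 2],
                   [-1, 0, 1]])

result = conv2d(image, kernel)
print(result)   # strong response in the columns straddling the edge
```

Swapping in other 3x3 kernels (blur, sharpen, horizontal edges) mirrors what the ImagePlay filter does interactively.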
February 27 and February 29 - Week 6
Reading
Read the following sections of The Science of Deep Learning:
Chapter 4 - Regularization
Tasks
Compare the descriptions of the Momentum, Adagrad, and Adam optimizers in lecture with:
The equations in Sections 3.3.6, 3.3.7, and 3.3.8
The code for Problems 14, 15, and 16 in the Programming Exercises and Sample Code.
Submit discussion questions for Week 6.
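As a companion to that comparison, here is a minimal sketch of the three update rules minimizing the 1-D quadratic f(x) = x². The hyperparameter values are common defaults and the variable names are mine; check each line against the equations in Sections 3.3.6-3.3.8.

```python
import math

def minimize(update, x0=5.0, lr=0.1, steps=200):
    """Minimize f(x) = x^2 (gradient 2x) with a given update rule."""
    x, state = x0, {}
    for t in range(1, steps + 1):
        g = 2 * x
        x = update(x, g, state, lr, t)
    return x

def momentum(x, g, s, lr, t, beta=0.9):
    s["v"] = beta * s.get("v", 0.0) + g            # accumulate velocity
    return x - lr * s["v"]

def adagrad(x, g, s, lr, t, eps=1e-8):
    s["G"] = s.get("G", 0.0) + g * g               # sum of squared gradients
    return x - lr * g / (math.sqrt(s["G"]) + eps)  # per-step rate decays

def adam(x, g, s, lr, t, b1=0.9, b2=0.999, eps=1e-8):
    s["m"] = b1 * s.get("m", 0.0) + (1 - b1) * g       # first moment
    s["v"] = b2 * s.get("v", 0.0) + (1 - b2) * g * g   # second moment
    m_hat = s["m"] / (1 - b1 ** t)                     # bias correction
    v_hat = s["v"] / (1 - b2 ** t)
    return x - lr * m_hat / (math.sqrt(v_hat) + eps)

for name, rule in [("momentum", momentum), ("adagrad", adagrad), ("adam", adam)]:
    print(name, minimize(rule))
```

Running it shows the qualitative differences: momentum converges quickly here, while Adagrad's ever-shrinking step size leaves it far from the minimum after the same number of steps.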
February 22 - Week 5
Monday - Presidents' Day - No class
Reading
Read the following sections of The Science of Deep Learning:
Chapter 3 - Optimization
3.1 - Introduction
3.2 - Overview
3.3 - First-Order Methods
3.5 - Evolution Strategies
Tasks
Sections 02 and 03 - Submit discussion questions for Week 5.
February 13 and February 15 - Week 4
Reading
Read the following sections of The Science of Deep Learning:
Chapter 2 - Forward and Backpropagation
2.6 - Backpropagation
2.7 - Differentiable Programming
2.8 - Computation Graph
2.9 - Derivative of Non-linear Activation Functions
2.10 - Backpropagation Algorithm
2.12 - Gradient of Loss Function
2.13 - Gradient Descent
2.14 - Initialization and Normalization
2.15 - Software Libraries and Platforms
Tasks
Download Exercises and Solutions from the webpage for the textbook and work through Problems 4, 5, and 15.
Submit discussion questions for Week 4.
Note: School is off next Monday for Presidents' Day, so the due date for Section 01 is 2/26 and the due date for Section 02 is 2/21.
Begin work on the Introductory Project.
February 6 and February 8 - Week 3
Reading
Read the following sections of The Science of Deep Learning:
Chapter 2 - Forward and Backpropagation
2.4 - Non-linear Activation Functions
2.5 - Loss Functions
2.11 - Chain Rule for Differentiation
Tasks
Bookmark The Matrix Calculus You Need For Deep Learning by Terence Parr and Jeremy Howard and refer to it as necessary.
Download Programming Exercises and Sample Code from the resources for the textbook.
Submit discussion questions for Week 3.
January 30 and February 1 - Week 2
Reading
Read the following sections of The Science of Deep Learning:
Chapter 2 - Forward and Backpropagation
2.1 - Introduction
2.2 - Fully Connected Neural Network
2.3 - Forward Propagation
Tasks
Work through Section 2.3.2 - Example by hand, on paper, until you are certain that you understand how the output y was derived from the input vector x.
Submit discussion questions for Week 2.
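Once you have done the computation on paper, a short script is a handy way to check your arithmetic. The weights, biases, and activations below are placeholders, not the actual values from Section 2.3.2; substitute the numbers from the textbook's example.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

def forward(x, layers):
    """One forward pass through a fully connected network.
    layers is a list of (W, b, activation) triples."""
    a = x
    for W, b, act in layers:
        a = act(W @ a + b)   # affine map, then non-linearity
    return a

# Placeholder parameters -- replace with the values from Section 2.3.2.
W1 = np.array([[1.0, -1.0],
               [0.5,  2.0]])
b1 = np.array([0.0, 1.0])
W2 = np.array([[1.0, 1.0]])
b2 = np.array([-0.5])

x = np.array([1.0, 2.0])
y = forward(x, [(W1, b1, relu), (W2, b2, lambda z: z)])
print(y)   # prints [5.]
```

Printing the intermediate activations inside the loop lets you compare each layer's output against your hand calculation line by line.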
January 23 and January 25 - Week 1
Reading
Read the Syllabus.
Read the following sections of The Science of Deep Learning:
Chapter 1 - Introduction
Tasks
Submit discussion questions for Week 1.