Artificial Neural Networks
Check the Grading Queue to see if marking is finished.
May 16 / May 18 - Finals Week
Tasks
Submit the following by 9:45 PM on Friday:
Project 5
Peer evaluation for Project 5
May 9 / May 11 - Week 16
Reading
Read the following articles:
An Introduction to Adversarial Examples in Deep Learning by Sayak Paul
Read the following papers:
Alex Serban, Erik Poll, and Joost Visser. 2020. Adversarial Examples on Object Recognition: A Comprehensive Survey. ACM Comput. Surv. 53, 3, Article 66 (May 2021), 38 pages.
Tasks
Experiment with Kenny Song's adversarial.js.
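If you want to peek under the hood of adversarial.js, the attack it demonstrates is the fast gradient sign method (FGSM). Here is a minimal pure-Python sketch of FGSM against a toy logistic-regression "model" (the weights, input, and epsilon below are illustrative values, not taken from adversarial.js):

```python
import math

# Toy logistic-regression classifier: p(y=1|x) = sigmoid(w . x + b)
w = [2.0, -1.5]   # fixed, pre-"trained" weights (illustrative)
b = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(x):
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def fgsm(x, y_true, epsilon=0.25):
    """Fast gradient sign method: perturb x by epsilon in the direction
    that increases the loss.  For cross-entropy with a sigmoid output,
    dL/dx_i = (p - y_true) * w_i."""
    p = predict(x)
    grad_x = [(p - y_true) * wi for wi in w]
    # Step each input feature by +/- epsilon along the gradient sign.
    return [xi + epsilon * math.copysign(1.0, g) for xi, g in zip(x, grad_x)]

x = [1.0, 0.5]            # clean input with true label 1
x_adv = fgsm(x, y_true=1)
print(predict(x), predict(x_adv))  # confidence in class 1 drops after the attack
```

The same one-step perturbation, applied pixel-wise to an image classifier's input gradient, is what produces the adversarial images in the demo.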
May 2 / May 4 - Week 15
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 4.10.4 - Hidden Probabilistic Structure: Variational Autoencoders
Section 8.5.2 - Convolutional Autoencoders
Section 10.4 - Generative Adversarial Networks (GANs)
Read the following papers:
Yongjun Hong, Uiwon Hwang, Jaeyoon Yoo, and Sungroh Yoon. 2019. How Generative Adversarial Networks and Their Variants Work: An Overview. ACM Comput. Surv. 52, 1, Article 10 (January 2020), 43 pages.
Tasks
Begin work on Project 5.
Submit peer evaluation for Project 4.
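The key trick in Section 4.10.4's variational autoencoders is reparameterization: sampling the latent code in a way that keeps gradients flowing. A small sketch of just that step (the mu and log-variance values are arbitrary stand-ins for an encoder's outputs):

```python
import math
import random

random.seed(0)

def reparameterize(mu, log_var):
    """Reparameterization trick used in VAEs: sample z ~ N(mu, sigma^2)
    as z = mu + sigma * eps with eps ~ N(0, 1), so the randomness is
    isolated in eps and gradients can flow through mu and log_var."""
    eps = random.gauss(0.0, 1.0)
    sigma = math.exp(0.5 * log_var)
    return mu + sigma * eps

# Draw many samples to confirm the mean and spread match (mu, sigma).
samples = [reparameterize(mu=2.0, log_var=0.0) for _ in range(10_000)]
mean = sum(samples) / len(samples)
print(mean)  # approximately 2.0
```

In a real VAE, `mu` and `log_var` come from the encoder network and `z` feeds the decoder; everything else about training is ordinary backpropagation.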
April 24 / April 26 - Week 14
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 10.1 - Introduction
Section 10.2 - Attention Mechanisms
Section 10.2.2 - Attention Mechanisms for Machine Translation
Read the following papers:
Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. 2017. Attention is all you need. In Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS'17). Curran Associates Inc., Red Hook, NY, USA, 6000–6010.
Ross Gruetzemacher and David Paradice. 2021. Deep Transfer Learning & Beyond: Transformer Language Models in Information Systems Research. ACM Comput. Surv. Just Accepted (December 2021).
Note: The link above is to the preprint on arXiv. The version in the ACM Digital Library is undergoing revision.
If you are interested in learning more about transformers...
See also Yi Tay, Mostafa Dehghani, Dara Bahri, and Donald Metzler. 2022. Efficient Transformers: A Survey. ACM Comput. Surv. Just Accepted (April 2022).
Tasks
Complete and submit the Transformer paper response.
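The core of Vaswani et al.'s architecture is scaled dot-product attention: Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V. As a study aid for the paper response, here is a pure-Python sketch of that formula (the query/key/value matrices are made-up toy values):

```python
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def attention(Q, K, V):
    """Scaled dot-product attention from Vaswani et al. (2017).
    Each query produces a weighted average of the value vectors,
    weighted by how well the query matches each key."""
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        out.append([sum(wt * v[j] for wt, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# One query attending over three key/value pairs (toy numbers).
Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
result = attention(Q, K, V)
print(result)  # a convex combination of the rows of V
```

The paper's multi-head attention runs several of these in parallel over learned linear projections of Q, K, and V; the arithmetic inside each head is exactly what this sketch computes.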
April 18 / April 20 - Week 13
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 2.5 - Matrix Factorization with Autoencoders
Section 4.7 - Unsupervised Pretraining
Section 4.10 - Regularization in Unsupervised Applications
Read the following paper through the end of Section 3.3 (p. 5:16):
Shuai Zhang, Lina Yao, Aixin Sun, and Yi Tay. 2019. Deep Learning Based Recommender System: A Survey and New Perspectives. ACM Comput. Surv. 52, 1, Article 5 (January 2020).
Tasks
Participate in the online reflection for Week 13.
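Section 2.5 ties autoencoders back to matrix factorization, the classic recommender-system technique underlying much of the Zhang et al. survey. Here is a toy sketch of factorizing a small ratings matrix with plain SGD (the matrix, rank, learning rate, and regularization constant are all illustrative choices):

```python
import random

random.seed(1)

# Toy ratings matrix (users x items); 0 marks a missing rating.
R = [[5, 3, 0],
     [4, 0, 1],
     [1, 1, 5]]
n_users, n_items, k = len(R), len(R[0]), 2

# Latent factor matrices, initialized with small random values.
U = [[random.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_users)]
V = [[random.uniform(-0.1, 0.1) for _ in range(k)] for _ in range(n_items)]

def pred(u, i):
    """Predicted rating: dot product of user and item factors."""
    return sum(U[u][f] * V[i][f] for f in range(k))

# SGD on squared error over the observed entries only.
lr, reg = 0.05, 0.01
for _ in range(2000):
    for u in range(n_users):
        for i in range(n_items):
            if R[u][i] == 0:
                continue
            err = R[u][i] - pred(u, i)
            for f in range(k):
                U[u][f] += lr * (err * V[i][f] - reg * U[u][f])
                V[i][f] += lr * (err * U[u][f] - reg * V[i][f])

print(round(pred(0, 0), 1))  # close to the observed rating of 5
```

The predictions at the zero entries are the "recommendations"; the deep models in the survey replace the linear dot product with learned nonlinear interactions.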
April 11 / April 13 - Week 12
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 7.6 - Gated Recurrent Units (GRUs)
Section 7.7 - Applications of Recurrent Neural Networks
Section 8.1 - Introduction
Section 8.2 - The Basic Structure of a Convolutional Network
Section 8.3 - Training a Convolutional Network
Section 8.4 - Case Studies of Convolutional Architectures
Section 8.5 - Visualization and Unsupervised Learning
Omit Section 8.5.2 - Convolutional Autoencoders
Section 8.6 - Applications of Convolutional Networks
Tasks
Download ImagePlay and experiment with the 2D Convolution filter.
Participate in the online reflection for Week 12.
Contact me if you don't yet have a team for Project 4.
Submit peer evaluation for Project 3.
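Before (or alongside) experimenting in ImagePlay, you can reproduce what a 2D Convolution filter computes in a few lines of Python. This sketch implements "valid" convolution as deep-learning libraries define it (strictly, cross-correlation, per Section 8.2); the image and kernel are toy values:

```python
def conv2d(image, kernel):
    """'Valid' 2D cross-correlation: slide the kernel over the image and
    take the elementwise-product sum at each position.  This is the
    operation deep-learning libraries call convolution."""
    kh, kw = len(kernel), len(kernel[0])
    oh = len(image) - kh + 1
    ow = len(image[0]) - kw + 1
    out = [[0.0] * ow for _ in range(oh)]
    for r in range(oh):
        for c in range(ow):
            out[r][c] = sum(image[r + i][c + j] * kernel[i][j]
                            for i in range(kh) for j in range(kw))
    return out

# A vertical edge between a dark left half and a bright right half...
image = [[0, 0, 10, 10],
         [0, 0, 10, 10],
         [0, 0, 10, 10]]
# ...responds strongly to a horizontal-gradient kernel.
kernel = [[-1, 0, 1],
          [-1, 0, 1]]
print(conv2d(image, kernel))
```

Try swapping in blur or sharpen kernels to see how the output changes; that is essentially what the ImagePlay filter lets you do interactively.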
April 4 / April 6 - Week 11
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 7.1 - Introduction
Section 7.2 - The Architecture of Recurrent Neural Networks
Section 7.5 - Long Short-Term Memory (LSTM)
Section 7.6 - Gated Recurrent Units (GRUs)
Read the following article by Christopher Olah:
Tasks
Participate in the online reflection for Week 11.
Begin work on Project 4.
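To make Section 7.6 concrete, here is one time step of a scalar (single-unit) GRU in pure Python. The weights are illustrative values, not trained, and the update convention matches the "interpolate between old state and candidate" form in the textbook:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def gru_step(x, h_prev, p):
    """One step of a scalar GRU.  p holds the weights (illustrative)."""
    z = sigmoid(p['wz'] * x + p['uz'] * h_prev + p['bz'])  # update gate
    r = sigmoid(p['wr'] * x + p['ur'] * h_prev + p['br'])  # reset gate
    h_tilde = math.tanh(p['wh'] * x + p['uh'] * (r * h_prev) + p['bh'])
    # Interpolate between the old state and the candidate state.
    return (1.0 - z) * h_prev + z * h_tilde

params = {'wz': 0.5, 'uz': 0.5, 'bz': 0.0,
          'wr': 0.5, 'ur': 0.5, 'br': 0.0,
          'wh': 1.0, 'uh': 1.0, 'bh': 0.0}

h = 0.0
for x in [1.0, -1.0, 0.5]:   # a short input sequence
    h = gru_step(x, h, params)
print(h)  # the hidden state stays in (-1, 1) by construction
```

A real GRU layer does the same thing with vectors and weight matrices; the gating logic per unit is identical.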
March 28 / March 30 - Week 10
Spring Break - No class.
March 21 / March 23 - Week 9
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 3.6 - Batch Normalization
Section 4.1 - Introduction
Section 4.2 - The Bias-Variance Trade-Off
Section 4.4 - Generalization Issues in Model Tuning and Evaluation
Section 4.5 - Ensemble Methods
Section 4.6 - Early Stopping
If you're interested in what Batch Normalization is really doing...
Shibani Santurkar, Dimitris Tsipras, Andrew Ilyas, and Aleksander Mądry. 2018. How does batch normalization help optimization? In Proceedings of the 32nd International Conference on Neural Information Processing Systems (NIPS'18). Curran Associates Inc., Red Hook, NY, USA, 2488–2498.
Tasks
Bookmark this Twitter thread by François Chollet for help speeding up training on Project 3.
Participate in the online reflection for Week 9.
Submit peer evaluation for Project 2.
Contact me if you don't yet have a team for Project 3.
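The forward pass of Section 3.6's batch normalization fits in a few lines: standardize each feature over the mini-batch, then apply a learnable scale and shift. A sketch for a single feature (gamma and beta are fixed at their initial values of 1 and 0 here rather than learned):

```python
import math

def batch_norm(xs, gamma=1.0, beta=0.0, eps=1e-5):
    """Batch normalization over one feature of a mini-batch:
    standardize to zero mean / unit variance, then rescale by gamma
    and shift by beta.  eps guards against division by zero."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return [gamma * (x - mean) / math.sqrt(var + eps) + beta for x in xs]

batch = [2.0, 4.0, 6.0, 8.0]
normed = batch_norm(batch)
print(normed)  # mean ~0, variance ~1
```

The Santurkar et al. paper above argues that the benefit of this operation is less about "internal covariate shift" and more about smoothing the optimization landscape.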
March 14 / March 16 - Week 8
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 3.3 - Setup and Initialization Issues
Section 3.4 - The Vanishing and Exploding Gradient Problems
Section 3.5 - Gradient-Descent Strategies
Section 3.5.1 - Learning Rate Decay
Section 3.5.2 - Momentum-Based Learning
Section 3.5.3 - Parameter-Specific Learning Rates
Section 3.6 - Batch Normalization
If you just can't get enough Calculus...
See The Matrix Calculus You Need For Deep Learning by Terence Parr and Jeremy Howard.
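Section 3.5.2's momentum method is short enough to sketch directly: keep a decaying running velocity of past gradients and step along it instead of the raw gradient. The learning rate, momentum coefficient, and test function below are illustrative choices:

```python
def minimize_with_momentum(grad, x0, lr=0.1, beta=0.9, steps=100):
    """Momentum-based gradient descent: the velocity v accumulates an
    exponentially decaying average of past gradients, which damps
    oscillation and speeds progress along consistent directions."""
    x, v = x0, 0.0
    for _ in range(steps):
        v = beta * v - lr * grad(x)
        x = x + v
    return x

# Minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3); minimum at x = 3.
x_min = minimize_with_momentum(lambda x: 2.0 * (x - 3.0), x0=0.0)
print(x_min)  # close to 3
```

The parameter-specific methods in Section 3.5.3 (AdaGrad, RMSProp, Adam) build on this same loop by additionally rescaling each coordinate's step size.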
March 7 / March 9 - Week 7
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.6 - Common Neural Architectures
Section 1.7 - Advanced Topics
Section 1.8 - Two Notable Benchmarks
Section 3.1 - Introduction
Section 3.2 - Backpropagation: The Gory Details
Read the following article by Christopher Olah:
Tasks
Participate in the online reflection for Week 7.
Take a look at Andrej Karpathy's micrograd engine for a (relatively) simple implementation of reverse-mode automatic differentiation.
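To see the idea behind an engine like micrograd at its absolute smallest, here is a micrograd-style sketch (not micrograd's actual API) of reverse-mode automatic differentiation supporting only addition and multiplication:

```python
class Value:
    """A scalar that records how it was produced, so reverse-mode
    automatic differentiation can replay the chain rule backwards."""
    def __init__(self, data, children=()):
        self.data = data
        self.grad = 0.0
        self._children = children
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def backward():
            self.grad += out.grad           # d(a+b)/da = 1
            other.grad += out.grad          # d(a+b)/db = 1
        out._backward = backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def backward():
            self.grad += other.data * out.grad   # d(a*b)/da = b
            other.grad += self.data * out.grad   # d(a*b)/db = a
        out._backward = backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for c in v._children:
                    visit(c)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

x, y = Value(2.0), Value(3.0)
z = x * y + x          # z = xy + x, so dz/dx = y + 1 and dz/dy = x
z.backward()
print(x.grad, y.grad)  # 4.0 2.0
```

This is exactly the computational-graph view of backpropagation from Section 3.2, with each node storing its own local derivative rule.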
February 28 / March 2 - Week 6
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.4 - Practical Issues in Neural Network Training
Section 1.5 - The Secrets to the Power of Function Composition
Section 1.6 - Common Neural Architectures
If you're interested in the research comparing the power of deep networks to shallow networks...
Take a look at the following papers:
Hrushikesh Mhaskar, Qianli Liao, and Tomaso Poggio. 2017. When and why are deep networks better than shallow ones? In Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence (AAAI'17). AAAI Press, 2343–2349.
Ronen Eldan and Ohad Shamir. 2016. The Power of Depth for Feedforward Neural Networks. In Proceedings of the 29th Annual Conference on Learning Theory (COLT '16). JMLR: Workshop and Conference Proceedings, vol. 49, 1–34.
Tasks
Take a look at The Asimov Institute's Neural Network Zoo.
Participate in the online reflection for Week 6.
Submit peer evaluation for Project 1.
Contact me if you don't yet have a team for Project 2.
February 21 / February 23 - Week 5
Monday - Presidents' Day - No class
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.3 - Training a Neural Network with Backpropagation
Section 1.4 - Practical Issues in Neural Network Training
Tasks
Participate in the online reflection for Week 5.
Watch the four videos in Grant Sanderson's 3blue1brown video series on Neural networks.
Begin work on Project 2.
February 14 / February 16 - Week 4
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.3 - Training a Neural Network with Backpropagation
Tasks
Bookmark the following section of Neural Networks and Deep Learning: A Textbook for future reference:
Section 1.2.1.6 - Some Useful Derivatives of Activation Functions
Participate in the online reflection for Week 4.
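The reason Section 1.2.1.6 is worth bookmarking is that several activation derivatives can be expressed in terms of the activation's own output, which backpropagation exploits. A quick sketch with a numerical sanity check:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_prime(x):
    """The sigmoid derivative in terms of the sigmoid value itself:
    sigma'(x) = sigma(x) * (1 - sigma(x))."""
    s = sigmoid(x)
    return s * (1.0 - s)

def tanh_prime(x):
    """Similarly, tanh'(x) = 1 - tanh(x)^2."""
    return 1.0 - math.tanh(x) ** 2

# Compare against a central-difference numerical derivative.
h = 1e-6
numeric = (sigmoid(0.5 + h) - sigmoid(0.5 - h)) / (2 * h)
print(sigmoid_prime(0.5), numeric)  # the two agree closely
```

During training this means the forward-pass activations can be cached and reused to compute gradients, with no need to re-evaluate the exponentials.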
February 7 / February 9 - Week 3
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.2.2 - Multilayer Neural Networks
Section 1.2.3 - The Multilayer Network as a Computational Graph
Tasks
Bookmark the figure from Neural Networks and Deep Learning: A Textbook for future reference:
Figure 1.8 - Various activation functions
Participate in the online reflection for Week 3.
Contact me if you don't yet have a team for Project 1.
January 31 / February 2 - Week 2
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.2.1 - Single Computational Layer: The Perceptron
Tasks
Participate in the online reflection for Week 2.
Begin work on Project 1.
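Section 1.2.1's perceptron learning rule is simple enough to try before class: on each misclassified example, nudge the weights toward (or away from) that example. A sketch on the AND function, which is linearly separable, so the rule is guaranteed to converge (the learning rate and epoch count are arbitrary but sufficient):

```python
def train_perceptron(data, epochs=20, lr=1.0):
    """Perceptron learning rule: on each mistake, move the weights
    in the direction of the misclassified example's label."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in data:                  # labels y are +1 or -1
            pred = 1 if (w[0]*x[0] + w[1]*x[1] + b) > 0 else -1
            if pred != y:
                w[0] += lr * y * x[0]
                w[1] += lr * y * x[1]
                b += lr * y
    return w, b

# AND gate: output is +1 only when both inputs are 1.
data = [([0, 0], -1), ([0, 1], -1), ([1, 0], -1), ([1, 1], 1)]
w, b = train_perceptron(data)
preds = [1 if (w[0]*x[0] + w[1]*x[1] + b) > 0 else -1 for x, _ in data]
print(preds)  # [-1, -1, -1, 1]
```

Try replacing the AND data with XOR to see the rule fail to converge; that limitation is what motivates the multilayer networks coming in Week 3.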
January 26 - Week 1
Reading
Read the following sections of Neural Networks and Deep Learning: A Textbook:
Section 1.1 - Introduction
Tasks
Participate in the online reflection for Week 1.