Artificial Neural Networks
May 11 (Finals Week)
Final exam Monday, 5:00 - 6:50 pm online, covering all topics from the Midterm exam and Chapters 4 - 6 of Neural Networks and Deep Learning. See Titanium for details.
May 4 / May 6 (Week 16)
Project 3 due Wednesday.
Check out the example notebook Training Keras with TensorFlow Datasets.
Finish reading Chapter 6 of Neural Networks and Deep Learning.
Read Keita Kurita's article An Intuitive Explanation of Why Batch Normalization Really Works.
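To make the article concrete, here is a minimal NumPy sketch of the batch normalization forward pass it discusses: normalize each feature over the mini-batch, then scale and shift with learnable parameters. The function name and parameter values are illustrative, not taken from any particular library.

```python
import numpy as np

def batch_norm_forward(x, gamma, beta, eps=1e-5):
    """Normalize each feature over the batch, then scale (gamma) and shift (beta)."""
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=3.0, size=(64, 10))  # a batch of 64 activations
out = batch_norm_forward(x, gamma=np.ones(10), beta=np.zeros(10))
print(out.mean(axis=0).round(6))  # each feature ~0 after normalization
print(out.std(axis=0).round(3))   # each feature ~1 after normalization
```

Whether the benefit comes from reducing "internal covariate shift" or from smoothing the loss landscape is exactly the question Kurita's article examines.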
Additional reading on ImageNet and the 2018 Turing Award
Read Dave Gershgorn's article The data that transformed AI research—and possibly the world, then compare and contrast:
Additional reading on Gabor filters
Read some articles on Gabor filters:
If you'd like to learn more about image processing:
Download ImagePlay to experiment with 2D Convolution and other algorithms
Download a copy of Digital Image Processing while Springer is making it freely accessible.
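If you want to experiment without installing anything, the two ideas above can be combined in a few lines of NumPy: build a Gabor kernel (a Gaussian envelope times a sinusoid) and slide it over an image with 2D convolution. This is a sketch for intuition, not a production implementation; the parameter values are arbitrary.

```python
import numpy as np

def gabor_kernel(size=9, sigma=2.0, theta=0.0, lam=4.0, psi=0.0):
    """Real part of a Gabor filter: a Gaussian envelope times a sinusoid."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_theta = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x_theta**2 + y**2 * 0 + (-x * np.sin(theta) + y * np.cos(theta))**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * x_theta / lam + psi)

def conv2d(image, kernel):
    """'Valid' 2D convolution (really cross-correlation, as in most DL libraries)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.empty((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

image = np.zeros((32, 32))
image[:, 16:] = 1.0                       # a vertical edge
response = conv2d(image, gabor_kernel(theta=0.0))
print(response.shape)  # (24, 24): 32 - 9 + 1 in each dimension
```

The filter responds strongly near the edge and weakly in the flat regions, which is why Gabor-like filters tend to emerge in the first layer of trained convolutional networks.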
April 27 / April 29 (Week 15)
Class canceled Wednesday.
Read the following chapter of Neural Networks and Deep Learning:
Chapter 5 - Why are deep neural networks hard to train?
Read the following chapter of Deep Learning with Python:
Chapter 5 - Deep Learning for Computer Vision
Note: This book should be available through the library's Safari database.
Download a copy of the recommended text Neural Networks and Deep Learning: A Textbook while Springer is making it freely accessible.
Also recommended by the same author: Data Mining: The Textbook and Recommender Systems: The Textbook.
Other recommendations:
The Data Science Design Manual
The Algorithm Design Manual by the same author is a common recommendation for preparing for job interviews.
Or see the entire list.
April 20 / April 22 (Week 14)
Project 2 due Wednesday.
Follow-up from Monday's discussion: Does mnist_mlp.py have one layer or two?
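The ambiguity comes down to counting conventions: diagrams show columns of neurons including the input, but one common convention counts only layers that carry trainable weights. Here is an illustrative 784-30-10 network in NumPy (the sizes are hypothetical, chosen to match the MNIST examples in the book, not taken from mnist_mlp.py itself):

```python
import numpy as np

rng = np.random.default_rng(1)

# A 784-30-10 network: two weight matrices, hence "two layers" by the
# weights-only convention, even though diagrams show three columns of neurons.
W1, b1 = rng.normal(size=(30, 784)) * 0.01, np.zeros(30)
W2, b2 = rng.normal(size=(10, 30)) * 0.01, np.zeros(10)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    hidden = sigmoid(W1 @ x + b1)      # layer 1: the only hidden layer
    return sigmoid(W2 @ hidden + b2)   # layer 2: the output layer

y = forward(rng.normal(size=784))
print(y.shape)  # (10,)
```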
Finish reading Chapter 4 of Neural Networks and Deep Learning.
Read Chapter 6 of Neural Networks and Deep Learning up through the section Introducing convolutional networks.
Start working on Project 3, due May 6.
Note: Project specifications are living documents, and may be updated with additional details. Bookmark the document and check back frequently.
April 13 / April 15 (Week 13)
Midterm exam Wednesday, 5:30 - 6:45 pm online, covering Chapters 1 - 3 of Neural Networks and Deep Learning. See Titanium for details.
Start reading Chapter 4 of Neural Networks and Deep Learning.
Consult the following sections of Deep Learning for more information on the topics covered in Chapter 3:
Section 8.3 - Basic Algorithms
Section 8.4 - Parameter Initialization Strategies
Section 8.7.1 - Batch Normalization
Consult the following sections of Deep Learning for more information on the topics covered in Chapter 4:
Section 6.4.1 - Universal Approximation Properties and Depth
April 6 / April 8 (Week 12)
Finish reading Chapter 3 of Neural Networks and Deep Learning.
Consult the following sections of Deep Learning for more information on the topics covered in Chapter 3:
Section 3.13 - Information Theory
Section 7.1 - Parameter Norm Penalties
Section 7.4 - Dataset Augmentation
Section 7.8 - Early Stopping
Section 7.11 - Bagging and Other Ensemble Methods
Section 7.12 - Dropout
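Of the regularizers above, dropout is the easiest to sketch in a few lines. This is the "inverted dropout" variant (scaling at training time so that no change is needed at test time); the function name and drop probability are illustrative.

```python
import numpy as np

def dropout(a, p_drop, rng, train=True):
    """Inverted dropout: zero each unit with probability p_drop and rescale
    the survivors so the expected activation is unchanged; identity at test time."""
    if not train:
        return a
    mask = rng.random(a.shape) >= p_drop
    return a * mask / (1.0 - p_drop)

rng = np.random.default_rng(42)
a = np.ones((1000, 100))
out = dropout(a, p_drop=0.5, rng=rng)
print(round(float(out.mean()), 2))  # close to 1.0: the rescaling preserves the mean
```

Because each forward pass samples a different mask, training can be viewed as averaging over an exponential ensemble of thinned subnetworks, which is the connection to Section 7.11 on bagging.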
Read the following chapter of Deep Learning with Python:
Chapter 3 - Getting started with neural networks
Note: This book should be available through the library's Safari database.
Start working on Project 2, due April 22.
Note: Project specifications are living documents, and may be updated with additional details. Bookmark the document and check back frequently.
March 30 / April 1 (Week 11)
Spring Break. No class.
March 23 / March 25 (Week 10)
Non-instructional day Monday. No class.
Read Chapter 3 of Neural Networks and Deep Learning up through the section on Weight initialization.
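The weight-initialization section's key point can be checked numerically: with naive unit-variance weights, a neuron's pre-activation z = w · x has standard deviation about sqrt(n_in), which saturates a sigmoid; dividing the weights by sqrt(n_in), as the chapter suggests, keeps std(z) near 1. A quick sketch (sizes arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in = 1000
x_batch = rng.normal(size=(500, n_in))  # unit-variance inputs

w_naive = rng.normal(size=n_in)                    # N(0, 1) weights
w_scaled = rng.normal(size=n_in) / np.sqrt(n_in)   # N(0, 1/n_in) weights

z_naive = x_batch @ w_naive
z_scaled = x_batch @ w_scaled

# Naive: std(z) ~ sqrt(1000) ~ 31.6, deep in sigmoid saturation.
# Scaled: std(z) ~ 1, so learning starts in the sensitive region.
print(round(float(z_naive.std()), 1), round(float(z_scaled.std()), 1))
```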
March 16 / March 18 (Week 9)
Virtual instruction begins Monday. Check your email or Titanium for Zoom meeting information.
Finish Chapter 2 of Neural Networks and Deep Learning.
For news of an alternative approach, read the Rice University news release Deep learning rethink overcomes major obstacle in AI industry.
March 9 / March 11 (Week 8)
Project 1 due Wednesday.
Read Chapter 2 of Neural Networks and Deep Learning up through the section Proof of the four fundamental equations. (Although the book marks this section as optional, it is not optional for this course.)
Read Christopher Olah's article Calculus on Computational Graphs: Backpropagation to see how backpropagation is implemented in libraries like TensorFlow.
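Olah's running example can be reproduced by hand in a few lines: build the graph forward, then sweep backward from the output, multiplying and summing derivatives along edges.

```python
# Reverse-mode differentiation on the small graph from Olah's article:
# c = a + b, d = b + 1, e = c * d, evaluated at a = 2, b = 1.
a, b = 2.0, 1.0
c = a + b          # 3
d = b + 1.0        # 2
e = c * d          # 6

# Backward pass: start from de/de = 1 and apply the chain rule edge by edge.
de_de = 1.0
de_dc = de_de * d                   # ∂e/∂c = d = 2
de_dd = de_de * c                   # ∂e/∂d = c = 3
de_da = de_dc * 1.0                 # c = a + b  →  ∂c/∂a = 1
de_db = de_dc * 1.0 + de_dd * 1.0   # b feeds both c and d, so its gradients sum

print(e, de_da, de_db)  # 6.0 2.0 5.0
```

The summing at node b is the crucial step: a single backward sweep yields the derivative of e with respect to every node, which is what makes backpropagation efficient for networks with millions of parameters.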
March 2 / March 4 (Week 7)
Class canceled Monday.
Read Chapter 2 of Neural Networks and Deep Learning up through the section on The Hadamard Product.
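The Hadamard product is just elementwise multiplication, which NumPy's `*` already performs; the chapter's notation ⊙ later appears in the output-layer error formula. A tiny sketch (the cost gradient values here are hypothetical):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    return sigmoid(z) * (1.0 - sigmoid(z))

# Hadamard (elementwise) product, written s ⊙ t in the chapter:
s = np.array([1, 2]) * np.array([3, 4])
print(s)  # [3 8]

# It shows up later in the chapter in the output error,
# delta^L = grad_a C ⊙ sigma'(z^L):
z_L = np.array([0.5, -1.0])
grad_a_C = np.array([0.2, -0.3])   # hypothetical cost gradient
delta_L = grad_a_C * sigmoid_prime(z_L)
```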
Watch the four videos in Grant Sanderson's 3blue1brown video series on Neural networks.
February 24 / February 26 (Week 6)
Class canceled Monday.
Read Chapter 4 of The Hundred-Page Machine Learning Book - "Anatomy of a Learning Algorithm".
Read the following sections of Chapter 1 of Neural Networks and Deep Learning:
Start working on Project 1, due March 11.
Note: Project specifications are living documents, and may be updated with additional details. Bookmark the document and check back frequently.
February 17 / February 19 (Week 5)
Read the following sections of Chapter 1 of Neural Networks and Deep Learning:
Read the following chapter of Deep Learning with Python:
Chapter 2 - Before we begin: the mathematical building blocks of neural networks
Note: This book should be available through the library's Safari database.
If you are new to Jupyter, NumPy, and Matplotlib
Review Chapters 1-2 and read Chapter 16 of A Whirlwind Tour of Python.
February 10 / February 12 (Week 4)
Read up through the section on Perceptrons in Chapter 1 of Neural Networks and Deep Learning.
Follow up by reading the following Wikipedia articles:
Download and run the Hebb Net example - Character recognition.
Modify the code to print the weights and bias and compare them against the results shown in the handout.
Add some additional input patterns with mistakes or missing data to the test set to see if they are classified correctly.
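The steps above can be sketched without the handout's code. Assuming the handout uses the standard Hebb rule on bipolar (+1/−1) patterns, each training pair updates the weights by w += x·t and the bias by b += t; the patterns below are made-up 3×3 "characters", not the ones from the handout.

```python
import numpy as np

def train_hebb(patterns, targets):
    """Hebb rule on bipolar patterns: w += x * t, b += t for each pair."""
    w, b = np.zeros(patterns.shape[1]), 0.0
    for x, t in zip(patterns, targets):
        w += x * t
        b += t
    return w, b

# Two hypothetical 3x3 characters flattened to length-9 bipolar vectors.
X = np.array([[ 1, 1, 1,  1, -1, 1,  1, 1, 1],    # 'O'-like, target +1
              [ 1, 1, 1, -1,  1, -1, 1, 1, 1]])   # 'X'-like, target -1
t = np.array([1, -1])

w, b = train_hebb(X, t)
pred = np.sign(X @ w + b)
print(w, b)
print(pred)  # reproduces the training targets: [ 1. -1.]
```

Note how positions where the two patterns agree cancel to zero weight; flipping one of those shared pixels (a "mistake" in a test pattern) therefore has no effect on the classification, which is worth verifying against the handout's results.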
If you are new to Python
Read at least Chapters 1-10 of A Whirlwind Tour of Python.
February 3 / February 5 (Week 3)
Class canceled Wednesday.
Read the following chapters from Deep Learning with Python:
Chapter 1 - What is deep learning?
Chapter 4 - Fundamentals of machine learning
Note: This book should be available through the library's Safari database.
Read the Hebb Net handout that will be distributed in class.
January 27 / January 29 (Week 2)
Class canceled Wednesday.
Read the following articles from blogs.nvidia.com:
Read the following chapters from The Hundred-Page Machine Learning Book:
Preface
Chapter 1 - Introduction
Chapter 2 - Notation
January 22 (Week 1)
Read the Syllabus.