Syllabus

Lecture Time:

Tuesday and Thursday, 12:30p-1:50p CSB 001

Lab hours:

Wednesday 4:00p-4:50p CSB 005
Friday 1:00p-1:50p CSB 005

TA:


Textbooks:
This course will be self-contained, but the following books are useful supplementary reading:

I. Goodfellow, Y. Bengio, and A. Courville, "Deep Learning", MIT Press, 2016.

R. Duda, P. Hart, D. Stork, "Pattern Classification", second edition, 2000.

Piazza:

Ted:


Office Hours:


Course Description:

Recent developments in deep neural network approaches have significantly boosted state-of-the-art learning systems in a wide range of domains, including machine learning, robotics, perception, computer vision, artificial intelligence, speech recognition, neural computation, medical imaging, bioinformatics, computational linguistics, and social data analysis. This course will cover the basics of neural networks as well as recent developments in deep learning, including deep belief nets, convolutional neural networks, recurrent neural networks, long short-term memory, and reinforcement learning. We will study the details of deep learning architectures with a focus on learning end-to-end models for these tasks, particularly image classification. Students will learn hands-on skills for cutting-edge research by implementing, training, and debugging neural networks.


Prerequisites:

Mathematics 20F (Linear Algebra) or Mathematics 31AH (Honors Linear Algebra); Mathematics 180A (Introduction to Probability) or ECE 109 (Engineering Probability & Statistics); and COGS 109 (Modeling and Data Analysis), COGS 108, or CSE 11 (Introduction to Computer Science & Object-Oriented Programming: Java); or consent of instructor.

Grading policy:
Assignments: 40% (Late policy: 5% reduction for the first day past due and a 10% reduction for each day afterwards).
Midterms: 40%
Final project: 20% (Late policy: 5% reduction for the first day past due and a 10% reduction for each day afterwards).
Bonus points: 5% (classroom participation and final project)