Course Syllabus:
Artificial Neuron, Neural Network and Regression
- McCulloch–Pitts Neuron model
- Network Architecture, Design and Learning
- Linear Regression
Unconstrained Optimization and LMS
- Unconstrained Optimization
- Wiener Filter
- LMS algorithm and its structure
Perceptron and Activation
- Rosenblatt’s Perceptron
- Perceptron Convergence Algorithm
- Batch Perceptron Algorithm
- Activation Functions
Multilayer Perceptron and Back Propagation
- Architecture, Batch and Online Learning
- Back Propagation Algorithm and its Attributes
- Back Propagation Heuristics & More
Pattern Separability and Regularization
- Cover's Theorem
- Interpolation Problem and Radial Basis Functions
- Tikhonov Regularization and Green's Function
DNN: Convolutional Neural Networks
- CNN Computations
- Training the CNN & Architectures
Brief Overviews of RNN, LSTM, AE and GAN
DNN Theory
- Why Do Deep Nets Work Better?
- Scattering Networks and Feature Extraction in ConvNets
Course resources:
Register on usebackpack.com, search for the course page, and join using the code shared in class.
Class Timings @ F300, EECE Building:
Monday: 08.00 am – 09.55 am
Tuesday: 12.00 noon – 12.55 pm
Office Hours:
Tuesday: 5.00 pm – 6.00 pm (Room A111, EECE Building)
Text Book:
Neural Networks and Learning Machines by Simon O. Haykin, Pearson Prentice Hall, 3rd ed., 2009.
References:
Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press, 2016.
Selected research papers on deep learning theory.