Course Syllabus:
Artificial Neuron, Neural Network and Regression
- McCulloch–Pitts Neuron model
- Network Architecture, Design and Learning
- Linear regression
Unconstrained Optimization and LMS
- Unconstrained Optimization
- Relation with Wiener Filter
- LMS algorithm and its structure
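The LMS algorithm listed above can be previewed with a minimal sketch (the synthetic data, step size, and variable names here are illustrative, not part of the course material):

```python
import numpy as np

# Minimal LMS (least mean squares) sketch: adapt weights w so that
# w @ x tracks a desired response d, using the instantaneous-gradient
# update  w <- w + eta * e * x,  where  e = d - w @ x.
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0])   # target weights (invented for illustration)
w = np.zeros(2)                  # adaptive weights
eta = 0.05                       # step size

for _ in range(2000):
    x = rng.normal(size=2)       # input sample
    d = w_true @ x               # desired response (noise-free here)
    e = d - w @ x                # instantaneous error
    w += eta * e * x             # LMS update

print(w)                         # converges toward w_true
```

With noise-free data the weight error contracts at every step, which is why the final estimate lands essentially on w_true; with noisy desired responses the same update would hover near the Wiener solution instead.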
Perceptron and Activation
- Rosenblatt’s Perceptron
- Perceptron Convergence Algorithm
- Batch Perceptron Algorithm
- Activation functions
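Rosenblatt's perceptron and its convergence behaviour can be sketched on a tiny linearly separable toy set (the data points below are invented for illustration):

```python
import numpy as np

# Perceptron learning rule sketch: labels are +/-1, and the update
# w <- w + eta * y * x is applied only to misclassified samples.
X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
Xb = np.hstack([X, np.ones((4, 1))])   # append a constant bias input
w = np.zeros(3)
eta = 1.0

for _ in range(100):                   # epoch cap; converges much sooner
    errors = 0
    for xi, yi in zip(Xb, y):
        if yi * (w @ xi) <= 0:         # misclassified (or on the boundary)
            w += eta * yi * xi         # perceptron update
            errors += 1
    if errors == 0:                    # a full pass with no mistakes: done
        break

print(np.sign(Xb @ w))                 # matches y after convergence
```

Because this toy set is linearly separable, the perceptron convergence theorem guarantees the loop stops after finitely many updates.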
Multilayer Perceptron and Back Propagation
- Architecture, batch and online learning
- Back Propagation Algorithm and its Attributes
- Back Propagation Heuristics
Pattern Separability and Regularization
- Cover's theorem
- Interpolation Problem and Radial Basis Functions
- Tikhonov Regularization and Green's function
DNN: Convolutional Neural Networks
- CNN computations
- Training the CNN
- Architectures
Other DNNs: RNN, AE and GAN
- LSTM Network
- Classical and Variational AE, Adversarial learning
- Classical GAN, cGAN, InfoGAN, BiGAN
DNN Theory
- Deep Networks - Why and When?
- Scattering Networks
- Convnets as Feature Extraction Networks
Course resources:
Register on usebackpack.com, search for the course page, and join using the code shared in class
Class Timings @ F300, EECE Building:
Monday: 08.00 am – 09.55 am, Room: F300
Tuesday: 06.00 pm – 06.55 pm, Room: F300/A102/A202
12.00 pm – 12.55 pm, Room: F300 (only for Pre-Midsem & Pre-Endsem Tests)
Office Hours:
Monday: 12.30 pm – 1.15 pm, Room: A111, EECE Building
Text Book:
Neural Networks and Learning Machines by Simon O. Haykin, Pearson Prentice Hall, 3rd ed., 2009.
References:
Deep Learning by Ian Goodfellow, Yoshua Bengio and Aaron Courville, MIT Press, 2016.
Research papers on deep learning theory