Deep Learning

Deep Learning (AI-551 / AI-1202) by Dr. Vijay Bhaskar Semwal, MANIT Bhopal

Deep Learning Syllabus:

Room No. TA-216; L-4, P-2

Prerequisite:

Linear Algebra, Calculus & Probability Theory, Pattern Recognition & Machine Learning, Neuroscience of Vision & Perception Psychology, Non-linear Optimization.

Description of Contents in brief:

1.

  • Basics: Biological neuron, idea of computational units, McCulloch–Pitts unit and thresholding logic, Rosenblatt perceptron, Perceptron Learning Algorithm (a minimal sketch follows this list), linear separability, convergence theorem for the Perceptron Learning Algorithm, shallow versus deep networks, overfitting versus underfitting, bias-variance tradeoff, loss functions.
  • Feedforward Networks: Multilayer perceptron, gradient descent, backpropagation (see the XOR sketch after this list), vanishing & exploding gradient problems, empirical risk minimization, regularization.
  • Autoencoders: Dimensionality reduction, inter- and intra-class classification, PCA and LDA, limitations of PCA, the autoencoder, and different types of autoencoders: overcomplete, undercomplete, denoising, sparse, and contractive (see the Keras sketch after this list).
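
A minimal NumPy sketch of the Perceptron Learning Algorithm on a linearly separable toy problem (logical AND); the function name and data are illustrative, not part of the syllabus:

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=1.0):
    """Learn w, b so that step(w.x + b) matches the labels y in {0, 1}."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            pred = 1 if np.dot(w, xi) + b > 0 else 0
            if pred != yi:                      # update only on mistakes
                w += lr * (yi - pred) * xi
                b += lr * (yi - pred)
                errors += 1
        if errors == 0:                         # convergence theorem: finitely
            break                               # many mistakes if separable
    return w, b

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])                      # AND is linearly separable
w, b = perceptron_train(X, y)
print(w, b)
```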
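
A minimal sketch of gradient descent with backpropagation for a one-hidden-layer MLP on XOR (which no single perceptron can solve); the layer sizes, seed, and learning rate are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR: not linearly separable

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)     # input -> hidden weights
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)     # hidden -> output weights
lr = 0.5

for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: chain rule for a mean-squared-error loss.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent parameter updates.
    W2 -= lr * (h.T @ d_out)
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * (X.T @ d_h)
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2))   # should approach [[0], [1], [1], [0]]
```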
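
A minimal sketch of an undercomplete autoencoder in Keras (the stack used in the course tutorials); the 784-32-784 sizes and the random stand-in data are illustrative assumptions:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

autoencoder = keras.Sequential([
    keras.Input(shape=(784,)),                  # e.g. flattened 28x28 images
    layers.Dense(32, activation="relu"),        # undercomplete bottleneck
    layers.Dense(784, activation="sigmoid"),    # reconstruction
])
autoencoder.compile(optimizer="adam", loss="mse")

# Reconstruction training: the input is also the target.
x = np.random.rand(256, 784).astype("float32")  # stand-in for a real dataset
autoencoder.fit(x, x, epochs=2, batch_size=32, verbose=0)
```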

2.

  • Deep Neural Networks: Difficulty of training deep neural networks, greedy layer-wise training.
  • Better Training of Neural Networks: Newer optimization methods for neural networks (AdaGrad, AdaDelta, RMSProp, Adam, NAG; an Adam sketch follows this list), second-order methods for training, the saddle-point problem in neural networks, regularization methods (dropout, DropConnect, batch normalization).
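
A minimal NumPy sketch of the Adam update rule from this unit, written out for a single parameter vector; the function name and the toy objective are illustrative, and the hyperparameters are the commonly used defaults:

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * grad        # first-moment (mean) estimate
    v = b2 * v + (1 - b2) * grad**2     # second-moment (uncentered variance)
    m_hat = m / (1 - b1**t)             # bias correction for early steps
    v_hat = v / (1 - b2**t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Usage: minimize f(theta) = ||theta||^2, whose gradient is 2*theta.
theta = np.array([1.0, -2.0])
m = v = np.zeros_like(theta)
for t in range(1, 501):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t)
print(theta)  # approaches [0, 0]
```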

3.

  • Generative Models: Restricted Boltzmann Machines (RBMs), Bayesian probability networks, Markov models, the difference between Bayesian probability networks and Markov models, directed and undirected graphs, factorization and probability distributions, RBM architecture and energy-based models, introduction to MCMC and Gibbs sampling (a CD-1 sketch follows this list), gradient computations in RBMs, Deep Boltzmann Machines.
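
A minimal NumPy sketch of contrastive-divergence (CD-1) training for a binary RBM, using one step of Gibbs sampling to approximate the gradient; the layer sizes, learning rate, and toy data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden, lr = 6, 3, 0.1
W = 0.01 * rng.normal(size=(n_visible, n_hidden))
a, b = np.zeros(n_visible), np.zeros(n_hidden)   # visible / hidden biases
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

def cd1_update(v0):
    """One CD-1 step: v0 -> h0 -> v1 -> h1, then approximate the gradient."""
    p_h0 = sigmoid(v0 @ W + b)
    h0 = (rng.random(n_hidden) < p_h0).astype(float)   # Gibbs: sample hidden
    p_v1 = sigmoid(h0 @ W.T + a)
    v1 = (rng.random(n_visible) < p_v1).astype(float)  # Gibbs: sample visible
    p_h1 = sigmoid(v1 @ W + b)
    # <v h>_data - <v h>_model approximates the log-likelihood gradient.
    return np.outer(v0, p_h0) - np.outer(v1, p_h1), v0 - v1, p_h0 - p_h1

data = (rng.random((100, n_visible)) < 0.5).astype(float)  # toy binary data
for v0 in data:
    dW, da, db = cd1_update(v0)
    W += lr * dW
    a += lr * da
    b += lr * db
```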

4.

  • Convolutional Neural Networks: CNN architecture and operations: the convolution operation with different kinds of filters (edge detection, vertical, horizontal, diagonal, sharpen, Gaussian), downsampling/pooling (max, average, sum), padding, the flatten operation, and the fully connected layer; ReLU and softmax activation functions; popular convolutional networks: LeNet, AlexNet, ZF-Net, VGGNet, GoogLeNet, ResNet (a LeNet-style Keras sketch follows this list).
  • Recurrent Neural Networks: RNN vs. CNN, backpropagation through time, Long Short-Term Memory, Gated Recurrent Units, Bidirectional LSTMs, Bidirectional RNNs (a bidirectional-LSTM sketch follows this list).
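
A minimal LeNet-style CNN sketch in Keras showing the operations listed above (convolution, ReLU, max pooling, flatten, fully connected layer, softmax); the MNIST-sized input and filter counts are illustrative:

```python
from tensorflow import keras
from tensorflow.keras import layers

cnn = keras.Sequential([
    keras.Input(shape=(28, 28, 1)),                  # e.g. MNIST-sized input
    layers.Conv2D(6, kernel_size=5, padding="same", activation="relu"),
    layers.MaxPooling2D(pool_size=2),                # downsampling / pooling
    layers.Conv2D(16, kernel_size=5, activation="relu"),
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),                                # flatten operation
    layers.Dense(120, activation="relu"),            # fully connected layer
    layers.Dense(10, activation="softmax"),          # class probabilities
])
cnn.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
            metrics=["accuracy"])
cnn.summary()
```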
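
A minimal bidirectional-LSTM sequence classifier in Keras; backpropagation through time happens inside training, and the sequence length, feature count, and unit count are illustrative:

```python
from tensorflow import keras
from tensorflow.keras import layers

rnn = keras.Sequential([
    keras.Input(shape=(50, 8)),               # 50 time steps, 8 features each
    layers.Bidirectional(layers.LSTM(32)),    # forward and backward in time
    layers.Dense(1, activation="sigmoid"),    # binary label for the sequence
])
rnn.compile(optimizer="adam", loss="binary_crossentropy")
rnn.summary()
```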

5.

  • Recent trends: Variational Autoencoders, Generative Adversarial Networks, Multi-task Deep Learning, Multi-view Deep Learning (a GAN training-step sketch follows this list).
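
A minimal sketch of one GAN training step in TensorFlow/Keras: the discriminator learns to separate real from fake, and the generator learns to fool it. The network sizes, noise dimension, and toy "real" distribution are illustrative stand-ins:

```python
import tensorflow as tf
from tensorflow.keras import layers

noise_dim, data_dim, batch = 16, 2, 64
G = tf.keras.Sequential([tf.keras.Input(shape=(noise_dim,)),
                         layers.Dense(32, activation="relu"),
                         layers.Dense(data_dim)])            # generator
D = tf.keras.Sequential([tf.keras.Input(shape=(data_dim,)),
                         layers.Dense(32, activation="relu"),
                         layers.Dense(1)])                   # discriminator logit
bce = tf.keras.losses.BinaryCrossentropy(from_logits=True)
g_opt, d_opt = tf.keras.optimizers.Adam(1e-3), tf.keras.optimizers.Adam(1e-3)

def train_step(real_batch):
    z = tf.random.normal([batch, noise_dim])
    with tf.GradientTape() as g_tape, tf.GradientTape() as d_tape:
        fake = G(z)
        d_real, d_fake = D(real_batch), D(fake)
        # Discriminator: label real samples 1 and generated samples 0.
        d_loss = (bce(tf.ones_like(d_real), d_real) +
                  bce(tf.zeros_like(d_fake), d_fake))
        # Generator: try to make the discriminator call fakes "real".
        g_loss = bce(tf.ones_like(d_fake), d_fake)
    d_opt.apply_gradients(zip(d_tape.gradient(d_loss, D.trainable_variables),
                              D.trainable_variables))
    g_opt.apply_gradients(zip(g_tape.gradient(g_loss, G.trainable_variables),
                              G.trainable_variables))
    return d_loss, g_loss

real = tf.random.normal([batch, data_dim]) + 3.0   # toy "real" distribution
print(train_step(real))
```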

List of Text Books:

1. Ian Goodfellow, Yoshua Bengio, and Aaron Courville, Deep Learning, MIT Press, 2016.

2. Yoshua Bengio, "Learning Deep Architectures for AI," Foundations and Trends in Machine Learning 2(1), 2009.

List of Reference Books:

1. Raúl Rojas, Neural Networks: A Systematic Introduction, Springer, 1996.

2. Christopher Bishop, Pattern Recognition and Machine Learning, Springer, 2007.

3. François Chollet, Deep Learning with Python, Manning, 2017.

URLs:

1. https://github.com/vsemwal/Deep_Learning_MANIT

2. https://pytorch.org/tutorials/beginner/deep_learning_60min_blitz.html

3. https://github.com/fchollet/deep-learning-with-python-notebooks

Lecture Plan (about 40-50 Lectures):

1. Basics & Feedforward Networks, Autoencoders (10 lectures)

2. Deep Neural Networks & Better Training of Neural Networks (6 lectures)

3. Generative Models & Restricted Boltzmann Machines (10 lectures)

4. Convolutional Neural Networks & Recurrent Neural Networks (8 lectures)

5. Recent Trends (VAEs, GANs) (6 lectures)

Deep learning tutorials in Python using TensorFlow and Keras.

----------------------------------------------------------------------------------