Main Textbooks
Dive into Deep Learning (https://d2l.ai/)
Slides & Syllabus
Fundamental building blocks (Basics, MLP, CNN)
Recurrent Neural Networks (RNN, LSTM, GRU)
Modern CNNs (Inception, ResNet, MobileNet)
Optimization (gradient-descent family, Adam, learning-rate schedulers)
Attention mechanism & Transformers (Seq2Seq, Attention, Transformers, BERT, GPT, BART)
[recommended videos 1 & 2 for Transformers]
(the corresponding PPSX file, for playing the GIFs and videos embedded in the slides)
Generative networks (GAN, VAE)
[recommended video for VAE]
Code & Notebooks
Basics:
MLP:
CNN:
RNN:
Optimization:
Illustration of the gradient-descent family of approaches in PyTorch
Analysis of the optimization paths of gradient-descent-based approaches in NumPy
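The two optimization notebooks above compare how gradient-descent variants traverse the loss surface. As a rough reference for what such an analysis looks like, here is a minimal NumPy sketch comparing plain gradient descent with momentum on an ill-conditioned quadratic; the function, learning rate, and momentum coefficient are illustrative choices, not taken from the course notebooks:

```python
import numpy as np

def grad(x):
    # Gradient of f(x) = 0.5 * (x[0]**2 + 10 * x[1]**2),
    # an ill-conditioned quadratic often used to visualize optimizer paths.
    return np.array([x[0], 10.0 * x[1]])

def gd(x0, lr=0.08, steps=200):
    # Plain gradient descent; returns the full path for plotting.
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for _ in range(steps):
        x -= lr * grad(x)
        path.append(x.copy())
    return np.array(path)

def momentum(x0, lr=0.08, beta=0.9, steps=200):
    # Heavy-ball momentum: the velocity accumulates past gradients,
    # damping oscillation along the steep axis.
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)
    path = [x.copy()]
    for _ in range(steps):
        v = beta * v - lr * grad(x)
        x += v
        path.append(x.copy())
    return np.array(path)

p_gd = gd([2.0, 2.0])
p_mom = momentum([2.0, 2.0])
```

Plotting the two `path` arrays over the contour lines of `f` shows the zig-zag of plain gradient descent along the steep direction versus the smoother momentum trajectory.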
Seq2Seq:
Transformers:
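The Seq2Seq and Transformer notebooks center on the attention mechanism. As a quick reference, here is a minimal NumPy sketch of scaled dot-product attention; the shapes and variable names are illustrative and not tied to the course code:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # Q: (n_q, d) queries, K: (n_k, d) keys, V: (n_k, d_v) values.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # (n_q, n_k) similarity scores
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights         # (n_q, d_v) weighted sum of values

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(5, 4))
V = rng.normal(size=(5, 2))
out, w = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value vectors, with weights given by how well the query matches each key; the `sqrt(d)` scaling keeps the dot products from saturating the softmax as the dimension grows.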
Homeworks
HW1 (deadline: 19 April)
HW2 (deadline: 28 April)
HW3 (deadline: 1 June)
HW4 [data] (deadline: 20 June)
Submit a single PDF file containing your report and results for all questions (including the programming questions); do not include code in the report. For each programming question, also include one notebook file with your code and an explanation of it. Combine all the files into a single RAR archive and send it to DL2024.kntu@gmail.com
MIDTERM: 20th April, 9:00 am
Online class link: https://meet.google.com/yht-bnie-hgy