Main Textbooks
Dive into Deep Learning
https://d2l.ai/
Deep Learning with PyTorch
Daniel Voigt Godoy
Slides & Syllabus
Fundamental building blocks (Basics, MLP, CNN)
Recurrent Neural Networks (RNN, LSTM, GRU)
Optimization (gradient descent family, Adam, learning rate schedulers)
Attention mechanism & Transformers (Seq2Seq, Attention, Transformers, BERT, GPT, BART)
[recommended videos 1 & 2 for Transformers]
(the corresponding ppsx file for playing GIFs & videos in the slides)
Generative networks (GAN, VAE)
Code & Notebooks
Basics:
MLP:
CNN:
RNN:
Optimization:
Illustration of the gradient descent family of approaches in PyTorch
Analysis of the optimization paths of gradient-descent-based approaches in NumPy
To be updated
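As a preview of what the optimization notebooks above cover, here is a minimal sketch (not the course notebooks themselves; the quadratic loss and hyperparameters are illustrative assumptions) comparing three members of the gradient descent family in PyTorch, while recording each optimizer's path toward the minimum:

```python
# Compare plain SGD, SGD with momentum, and Adam on a simple quadratic bowl,
# tracking the optimization path of each method.
import torch

def loss_fn(p):
    # Elongated quadratic bowl with its minimum at (0, 0); the different
    # curvatures along x and y make the optimizers' paths differ.
    return 2.0 * p[0] ** 2 + 0.5 * p[1] ** 2

def run(optimizer_factory, steps=200, start=(2.0, 2.0)):
    p = torch.tensor(start, requires_grad=True)
    opt = optimizer_factory([p])
    path = [p.detach().clone()]          # record the starting point
    for _ in range(steps):
        opt.zero_grad()                  # clear gradients from the last step
        loss_fn(p).backward()            # compute d(loss)/dp
        opt.step()                       # apply the optimizer's update rule
        path.append(p.detach().clone())  # record the new position
    return torch.stack(path)             # shape: (steps + 1, 2)

paths = {
    "SGD":      run(lambda ps: torch.optim.SGD(ps, lr=0.1)),
    "Momentum": run(lambda ps: torch.optim.SGD(ps, lr=0.1, momentum=0.9)),
    "Adam":     run(lambda ps: torch.optim.Adam(ps, lr=0.1)),
}
for name, path in paths.items():
    print(f"{name:8s} final point: {path[-1].tolist()}")
```

The returned paths can be plotted over the loss contours (e.g. with matplotlib) to visualize how momentum overshoots along the steep direction while Adam rescales each coordinate individually.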
Homework
HW1 (deadline: 19 April)
HW2 (deadline: 28 April)
HW3 (deadline: 1 June)
Submit a single PDF file containing your report and results for all questions, including the programming questions; do not include code in this report. For each programming question, also include one notebook file with your code and an explanation of it. Combine all files into a single RAR archive and send it to DL2024.kntu@gmail.com
MIDTERM: 20 April, 9:00 am