Weekly Schedule of Topics (approximate)
Chapter and section numbers refer to Mathematics for Machine Learning by Deisenroth, Faisal, and Ong. We will also use Dive into Deep Learning by A. Zhang et al. (DDL); readings from that book are marked DDL. The number at the end of each line is the approximate number of weeks devoted to that topic.
Linear Algebra (2.5 weeks)
2.1 to 2.3 Systems of Linear Equations, Matrices (review at home) 0
2.4 to 2.6 Vector Spaces, Linear Independence, Basis and Rank 1
2.7 Linear Mappings 1.5
Analytic Geometry (2 weeks)
3.1 to 3.4 Norms; Inner Products; Lengths & Distances; Angles & Orthogonality 0.5
3.5 to 3.7 Orthonormal Basis; Orthogonal Complement; Inner Product of Functions 1
3.8 to 3.9 Orthogonal Projections; Rotations* 0.5
* selected (sub)sections
Matrix Decompositions (1.5 weeks)
4.1 to 4.2 Determinant & Trace; Eigenvalues & Eigenvectors 0.5
4.4 to 4.5 Eigendecomposition & Diagonalization; Singular Value Decomposition 1
Probability and Statistics (2 weeks)
Chapter 6 Probability and Distributions 2
Optimization (2 weeks)
Chapter 5* Vector Calculus 1
Chapter 7 Continuous Optimization 1
* selected sections
Applications (4 weeks)
Ch. 3 DDL Linear Neural Networks for Regression 1
Chapter 10 Dimensionality Reduction with Principal Component Analysis 1
Ch. 4 DDL Neural Networks & Deep Learning; Linear Neural Networks for Classification 2
Project 0.5
Review & exams 0.5
Total (weeks) 15