Pattern Recognition and Machine Learning

Jan to May 2020

Course Information

Space: CS 25

Time: F slot (Tue 5 PM, Wed 11 AM, Thu 9 AM, Fri 8 AM)

Teaching Assistants: TBA

Learning Outcomes

At the end of the course, the student should be able:

  1. To understand the use cases and limitations of machine learning.

  2. To recognise the type of learning problem suitable for a practical task at hand.

  3. To identify the data required for solving a given type of learning problem.

  4. To identify the right learning algorithms for solving a given learning problem. [KEY]

  5. To analyse (several) learning algorithms and identify the role of the various critical knobs in the algorithms. [KEY]

  6. To efficiently use various software packages for solving learning problems.

  7. To implement various learning algorithms from first principles. [KEY]

Grading

  • Quiz 1: 20%

  • Final exam: 25%

  • Mini-Quizzes: 10% (5 + 5)

  • Programming Assignments: 30%

  • Data Contest: 15%

Important Dates

Quiz 1: Feb 18

Quiz 2: Mar 24

Final: As announced on the IITM calendar.


Reference Books:


Related Resources on the Web:

Reference material for Pre-requisites:

1. Undergraduate multivariate calculus.

1a. https://ocw.mit.edu/courses/mathematics/18-01sc-single-variable-calculus-fall-2010/unit-2-applications-of-differentiation/ (Part A)

1b. https://ocw.mit.edu/courses/mathematics/18-02sc-multivariable-calculus-fall-2010/2.-partial-derivatives/ (Part A and B)
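As a quick self-check on this material, one can verify partial derivatives numerically. The sketch below is illustrative and not part of the course material; the function f and step size h are made up for the example.

```python
# Check analytic partial derivatives against central-difference
# approximations, for f(x, y) = x^2 * y.
def f(x, y):
    return x**2 * y

def partial_x(f, x, y, h=1e-6):
    # Central-difference approximation of df/dx at (x, y).
    return (f(x + h, y) - f(x - h, y)) / (2 * h)

def partial_y(f, x, y, h=1e-6):
    # Central-difference approximation of df/dy at (x, y).
    return (f(x, y + h) - f(x, y - h)) / (2 * h)

# Analytically: df/dx = 2xy, df/dy = x^2.
print(partial_x(f, 3.0, 2.0))  # close to 12.0
print(partial_y(f, 3.0, 2.0))  # close to 9.0
```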

2. Linear algebra.

2a. https://youtu.be/kjBOesZCoqc (Essence of Linear Algebra, YouTube series)

2b. https://youtu.be/ZK3O402wf1c (Gilbert Strang's course on linear algebra)

Books:

Gilbert Strang. Linear Algebra and its Applications.
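The core operations from these references (solving linear systems, eigendecompositions) are available in NumPy. A minimal sketch, with a made-up matrix A and vector b:

```python
import numpy as np

# Solve A x = b and compute an eigendecomposition; A and b are
# illustrative values, not from the course material.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])

x = np.linalg.solve(A, b)             # solution of the linear system
eigvals, eigvecs = np.linalg.eigh(A)  # eigendecomposition (A is symmetric)

print(x)                      # [0.8 1.4]
print(np.allclose(A @ x, b))  # True: residual is numerically zero
```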

3. Probability.

https://ocw.mit.edu/resources/res-6-012-introduction-to-probability-spring-2018/part-i-the-fundamentals/

Books:

Bertsekas and Tsitsiklis. Introduction to Probability.

Grimmett and Stirzaker. Probability and Random Processes.

Bruce Hajek. Probability course notes. Link
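The kinds of questions these references treat analytically can also be checked by simulation. An illustrative sketch (the event and numbers are made up for the example): P(at least 8 heads in 10 fair tosses) = (C(10,8) + C(10,9) + C(10,10)) / 2^10 = 56/1024 ≈ 0.0547.

```python
import random

random.seed(0)  # fixed seed so the estimate is reproducible

def heads_in_trial(n=10):
    # Count heads in n fair coin tosses.
    return sum(random.random() < 0.5 for _ in range(n))

N = 100_000
estimate = sum(heads_in_trial() >= 8 for _ in range(N)) / N
print(estimate)  # close to the analytic value 0.0547
```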

4. Basic Python/Numpy programming.

https://developers.google.com/edu/python/

http://www.astro.up.pt/~sousasag/Python_For_Astronomers/Python_qr.pdf

http://cs231n.github.io/python-numpy-tutorial/

https://www.tutorialspoint.com/numpy
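The tutorials above cover NumPy idioms such as arrays, broadcasting, and vectorized operations. A small sketch with made-up data:

```python
import numpy as np

# A 3x4 "data matrix" of illustrative values.
X = np.arange(12.0).reshape(3, 4)

col_means = X.mean(axis=0)   # per-column (per-feature) means, shape (4,)
X_centered = X - col_means   # broadcasting subtracts the means row-wise

print(col_means)                # [4. 5. 6. 7.]
print(X_centered.mean(axis=0))  # ~[0. 0. 0. 0.]
```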


Announcements:


Supporting Material:

Class Notes/Slides:

Course instructional handout : pdf

  1. Introduction slides : pdf

  2. Geometry/calculus notes : pdf

  3. Probability/Linalg/Optimisation notes: pdf

  4. Bayes Decision theory notes: pdf

  5. Regression notes : pdf

  6. Logistic regression notes: pdf

  7. Constrained optimisation, KKT, Lagrangian dual notes: pdf

  8. SVM notes : pdf

  9. Trees/Boosting notes : pdf

  10. Ensemble methods slides : pdf

  11. Gradient boosting notes : pdf

  12. Artificial neural nets : pdf

  13. Supervised learning wrap-up : pdf

  14. PCA : pdf

  15. Clustering overview : pdf

  16. GMM & EM : pdf

  17. Learning theory : pdf

  18. Regularisation, Collaborative filtering overview : pdf

Jupyter Notebooks:

  1. SVM examples. Link

  2. PCA demo. Link

  3. GMM EM. Link

  4. LASSO regression. Link



Worksheets:

  1. Probability basics : pdf (V1)

  2. Mega-worksheet - Prerequisite/Regression/classification : pdf (V1)

  3. Logistic regression, Lagrangian, KKT, SVM : pdf (V1)

  4. k-NN, Decision tree, AdaBoost : pdf (V1)

  5. Multi-class, PCA, clustering, collaborative filtering : pdf (V1)

  6. GMM, HMM, EM : pdf (V1)

  7. Model question paper : pdf (V1)

