Pattern Recognition and Machine Learning

Aug to Dec 2022

Course Information

Venue: CS 36

For: BTech and dual-degree students at IITM. (Others should attend Prof. Arun's section.)

Time: F slot. (Tue 5 PM, Wed 11 AM, Thu 9 AM, Fri 8 AM)

Teaching Assistants:

Learning Outcomes

At the end of the course, the student should be able:

  1. To understand the use cases and limitations of machine learning.

  2. To recognise the type of learning problem suitable for a practical task at hand.

  3. To identify the data required for solving a given type of learning problem.

  4. To identify the right learning algorithms for solving a given learning problem. [KEY]

  5. To analyse (several) learning algorithms and identify the role of the various critical knobs in the algorithms. [KEY]

  6. To efficiently use various software packages for solving learning problems.

  7. To implement various learning algorithms from first principles. [KEY]

Grading

  • Final exam: 35% (Nov 21)

  • Midsemester exam: 20% (Sep 13)

  • Programming assignments: 30%

  • Biweekly miniquizzes: 15% (Aug 5, Aug 19, Sep 2, Sep 30, Oct 14, Oct 28)

Reference Books:


Related Resources on the Web:

Reference material for Pre-requisites:

1. Undergraduate multivariate calculus.

1a. https://ocw.mit.edu/courses/mathematics/18-01sc-single-variable-calculus-fall-2010/unit-2-applications-of-differentiation/ (Part A)

1b. https://ocw.mit.edu/courses/mathematics/18-02sc-multivariable-calculus-fall-2010/2.-partial-derivatives/ (Part A and B)

2. Linear algebra.

2a. https://youtu.be/kjBOesZCoqc (Essence of Linear Algebra, YouTube series)

2b. https://youtu.be/ZK3O402wf1c (Gilbert Strang's course on linear algebra)

Books:

Gilbert Strang. Linear Algebra and its Applications.

3. Probability.

https://ocw.mit.edu/resources/res-6-012-introduction-to-probability-spring-2018/part-i-the-fundamentals/

Books:

Bertsekas and Tsitsiklis. Introduction to Probability.

Grimmett and Stirzaker. Probability and Random Processes.

Bruce Hajek. Probability course notes. Link

4. Basic Python/NumPy programming. (A short self-check sketch follows the links below.)

https://developers.google.com/edu/python/

http://www.astro.up.pt/~sousasag/Python_For_Astronomers/Python_qr.pdf

http://cs231n.github.io/python-numpy-tutorial/

https://www.tutorialspoint.com/numpy
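
As a rough self-check of the NumPy fluency assumed above, the following short sketch (illustrative only; the data and variable names are made up) should read easily. The point is broadcasting and vectorised operations in place of explicit loops:

import numpy as np

# A small synthetic data matrix: 5 samples, 3 features.
X = np.arange(15, dtype=float).reshape(5, 3)

# Column-wise standardisation via broadcasting (no explicit loops);
# the 1e-12 guards against zero-variance columns.
X_std = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)

# Pairwise squared Euclidean distances between rows, fully vectorised,
# using ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
sq_norms = (X_std ** 2).sum(axis=1)
dists = sq_norms[:, None] + sq_norms[None, :] - 2 * X_std @ X_std.T

print(dists.shape)  # (5, 5)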


Announcements:


Supporting Material:

Class Notes/Slides:

  1. Week 1: Intro to ML: slides (old slides)

  2. Week 2: Pre-requisites: LinAlg, Probability and Optimisation/Basic calculus : Notes LA : Notes CalcOpt : Notes Prob

  3. Week 3: Bayes classifiers : Notes

  4. Weeks 4-6: Regression : Notes (see the least-squares sketch after this list)

  5. Week 7: Logistic regression : Notes

  6. Week 8: Constrained optimisation : Notes (only the KKT conditions and Lagrangian duality page is required)

  7. Weeks 8-9: SVM : Notes

  8. Trees and boosting : Notes

  9. Ensemble methods : Slides

  10. Supervised learning wrap-up: neural nets, multiclass, etc. : Notes

  11. PCA : Notes

  12. Clustering : Notes (see the k-means sketch after this list)

  13. Expectation Maximisation (GMM) : Notes
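
To give a sense of the "from first principles" expectation for the regression weeks, here is a minimal sketch of ordinary least squares solved as a linear least-squares problem on synthetic data. It is illustrative only and is not taken from the course notes:

import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: y = X w_true + noise.
n, d = 100, 3
X = rng.normal(size=(n, d))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)

# Append a bias column and solve min_w ||Xb w - y||^2.
# lstsq is numerically preferable to forming the explicit
# inverse in the normal-equations solution (X^T X)^{-1} X^T y.
Xb = np.hstack([X, np.ones((n, 1))])
w_hat, *_ = np.linalg.lstsq(Xb, y, rcond=None)

print(w_hat)  # close to [1.0, -2.0, 0.5, 0.0]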

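Similarly, a minimal k-means sketch (Lloyd's algorithm) of the kind the clustering week builds towards; the random initialisation and stopping rule here are simplifying assumptions, not necessarily the versions used in class:

import numpy as np

def kmeans(X, k, n_iters=100, seed=0):
    # Plain Lloyd's algorithm: alternate assignment and mean updates.
    rng = np.random.default_rng(seed)
    # Initialise centroids as k distinct random data points.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iters):
        # Assignment step: nearest centroid for every point.
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its cluster,
        # keeping the old centroid if a cluster ends up empty.
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break
        centroids = new_centroids
    return centroids, labels

# Usage on two well-separated blobs.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(size=(50, 2)) - 5, rng.normal(size=(50, 2)) + 5])
centroids, labels = kmeans(X, k=2)
print(centroids)  # roughly [-5, -5] and [+5, +5], in some order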

External Links:

Decision trees : link

Clustering visualization : link

Neighborhood models in collaborative filtering : link

Jupyter Notebooks:


Worksheets:


  1. Probability basics : pdf (V1)

  2. Mega-worksheet - Prerequisite/Regression/classification : pdf (V1)

  3. SVMs/Classification : pdf

  4. Trees/Boosting : pdf

  5. Multiclass methods/Clustering/Collaborative filtering : pdf

  6. EM/GMM/HMM : pdf

  7. Sample/previous question papers:

    1. One

    2. Two

    3. Three

    4. Four

    5. Five

    6. Six