Pattern Recognition and Machine Learning
Aug to Dec 2022
Course Information
Space: CS 36
For: BTechs and Dual degree students at IITM. (Others go to Prof. Arun's section).
Time: F Slot (Tue 5 PM, Wed 11 AM, Thu 9 AM, Fri 8 AM)
Teaching Assistants:
Learning Outcomes
At the end of the course, the student should be able:
To understand the use cases and limitations of machine learning.
To recognise the type of learning problem suitable for a practical task at hand.
To identify the data required for solving a given type of learning problem.
To identify the right learning algorithms for solving a given learning problem. [KEY]
To analyse several learning algorithms and identify the role of the various critical knobs in each. [KEY]
To efficiently use various software packages for solving learning problems.
To implement various learning algorithms from first principles. [KEY]
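For a concrete (and purely illustrative) picture of the last three outcomes, the sketch below fits a logistic regression model with a package and then recomputes its predicted probabilities from the learned weights by hand. NumPy, scikit-learn, the toy data, and the seed are assumptions made for this example; they are not mandated by the course material.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Toy 2-D data: the label is 1 when the two coordinates sum to a positive number.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)

    # "Use a software package": fit and score with scikit-learn.
    clf = LogisticRegression().fit(X, y)
    print("training accuracy:", clf.score(X, y))

    # "From first principles": recompute the class-1 probabilities as the
    # sigmoid of the learned linear score and compare with predict_proba.
    w, b = clf.coef_.ravel(), clf.intercept_[0]
    p_manual = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    assert np.allclose(p_manual, clf.predict_proba(X)[:, 1])

Fitting with a library and then reproducing its numbers by hand is a useful self-check when analysing what an algorithm's knobs actually do.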
Grading
Final exam: 35% (Nov 21)
Mid-sem: 20% (Sep 13)
Programming Assignments: 30%
Biweekly Miniquizzes: 15% (Aug 5, Aug 19, Sep 2, Sep 30, Oct 14, Oct 28)
Links
Google group link : https://groups.google.com/g/prml-iitm-harish-aug2022
Reference Books:
[DHS] Richard O. Duda, Peter E. Hart and David G. Stork. Pattern Classification. John Wiley, 2001.
[CB] Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer, 2006.
Download PDF from here : https://www.microsoft.com/en-us/research/people/cmbishop/#!prml-book
[SB] Shai Shalev-Shwartz and Shai Ben-David. Understanding Machine Learning: From Theory to Algorithms. Cambridge University Press, 2014.
Download PDF from here: http://www.cs.huji.ac.il/~shais/UnderstandingMachineLearning/copy.html
[MRT] Mehryar Mohri, Afshin Rostamizadeh, and Ameet Talwalkar. Foundations of Machine Learning. MIT Press. 2018.
Download PDF from here: https://cs.nyu.edu/~mohri/mlbook/
[DFO] Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong. Mathematics for Machine Learning. Cambridge University Press, 2020.
Book site here : https://mml-book.github.io/
[JW] Gareth James, Daniela Witten, Trevor Hastie and Robert Tibshirani. An Introduction to Statistical Learning with Applications in R, Springer, 2014.
Download PDF from here: http://www-bcf.usc.edu/~gareth/ISL
Jupyter notebooks for ISLR: https://github.com/JWarmenhoven/ISLR-python
Related Resources on the Web:
Google crash course on machine learning: https://developers.google.com/machine-learning/crash-course/ml-intro
Intel course on Machine learning: https://software.intel.com/en-us/ai-academy/students/kits/machine-learning-501
NPTEL course on Machine Learning: Prof. Ravindran's ML course from 2016
Hal Daume III. A course in machine learning. http://ciml.info/
Reference material for Pre-requisites:
1. Undergraduate multivariate calculus.
1b. https://ocw.mit.edu/courses/mathematics/18-02sc-multivariable-calculus-fall-2010/2.-partial-derivatives/ (Part A and B)
2. Linear algebra.
2a. https://youtu.be/kjBOesZCoqc (Essence of linear algebra, YouTube series)
2b. https://youtu.be/ZK3O402wf1c (Gilbert Strang's course on Linear Algebra)
Books:
Gilbert Strang. Linear Algebra and its Applications.
3. Probability.
Books:
Bertsekas and Tsitsiklis. Introduction to Probability.
Grimmett and Stirzaker. Probability and Random Processes.
Bruce Hajek. Probability course notes. Link
4. Basic Python/Numpy programming.
https://developers.google.com/edu/python/
http://www.astro.up.pt/~sousasag/Python_For_Astronomers/Python_qr.pdf
http://cs231n.github.io/python-numpy-tutorial/
https://www.tutorialspoint.com/numpy
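The snippet below is a rough sketch of the level of NumPy fluency the prerequisites above aim at: solving a small linear system (linear algebra), a quick coin-toss simulation (probability), and a finite-difference gradient (calculus). The specific numbers and functions are illustrative assumptions, not course material.

    import numpy as np

    # Linear algebra: solve A x = b and verify the solution.
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([9.0, 8.0])
    x = np.linalg.solve(A, b)
    assert np.allclose(A @ x, b)

    # Probability: estimate P(at least 60 heads in 100 fair tosses) by simulation.
    rng = np.random.default_rng(0)
    tosses = rng.integers(0, 2, size=(10_000, 100))
    print("P(heads >= 60) ~", (tosses.sum(axis=1) >= 60).mean())

    # Calculus: numerical gradient of f(x, y) = x**2 + 3*y at (1, 2); approx (2, 3).
    f = lambda v: v[0] ** 2 + 3 * v[1]
    v, eps = np.array([1.0, 2.0]), 1e-6
    grad = [(f(v + eps * e) - f(v - eps * e)) / (2 * eps) for e in np.eye(2)]
    print("numerical gradient:", grad)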
Announcements:
Supporting Material:
Class Notes/Slides:
Week 1: Intro to ML: slides (old slides)
Week 2: Pre-requisites: LinAlg, Probability and Optimisation/Basic calculus : Notes LA : Notes CalcOpt : Notes Prob
Week 3: Bayes Classifiers : Notes
Week 4,5,6 : Regression : Notes (a short illustrative sketch follows this list)
Week 7 : Logistic regression : Notes
Week 8: Constrained optimisation: Notes (only the KKT conditions and Lagrangian duality pages are required)
Week 8, 9: SVM : Notes
Trees and Boosting : Notes
Ensemble methods : Slides
Supervised learning wrap-up : Neural nets, multiclass, etc. : Notes
PCA : Notes
Clustering : Notes
Expectation Maximisation (GMM) : Notes
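As a small companion to the regression notes (Weeks 4-6, referenced above), here is a minimal from-first-principles least-squares fit on synthetic data. The data-generating line y = 2x + 1, the noise level, and the NumPy cross-check are assumptions for illustration only, not taken from the notes.

    import numpy as np

    # Synthetic 1-D data from the line y = 2x + 1 plus a little Gaussian noise.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, size=100)
    y = 2.0 * x + 1.0 + 0.1 * rng.normal(size=100)

    # Design matrix with a bias column, then the normal equations
    # w = (X^T X)^{-1} X^T y, solved without forming an explicit inverse.
    X = np.column_stack([x, np.ones_like(x)])
    w = np.linalg.solve(X.T @ X, X.T @ y)
    print("slope, intercept:", w)  # close to (2, 1)

    # Cross-check against NumPy's built-in least-squares solver.
    w_ls, *_ = np.linalg.lstsq(X, y, rcond=None)
    assert np.allclose(w, w_ls)

Solving the normal equations with np.linalg.solve avoids forming an explicit matrix inverse; for ill-conditioned design matrices the lstsq route used in the cross-check is the more robust choice.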
External Links:
Decision trees : link
Clustering visualization : link
Neighborhood models in collaborative filtering : link
Jupyter Notebooks:
Worksheets:
Probability basics : pdf (V1)
Mega-worksheet - Prerequisite/Regression/classification : pdf (V1)
SVMs/Classification : pdf
Trees/Boosting : pdf
Multiclass methods/Clustering/Collaborative filtering : pdf
EM/GMM/HMM : pdf
Sample/Previous question papers :