Course Abstract:
This course covers fundamental concepts and techniques in pattern recognition and machine learning through lectures and exercises.
Goal:
The goal of this course is for students to acquire fundamental knowledge of pattern recognition and machine learning and to apply it to practical problems.
Evaluation:
Learning outcomes are evaluated on the basis of three components:
in-class exercises (40%): Students solve simple computational problems during class.
reports (20%): Students formulate and solve simple machine learning problems and report the results within two weeks.
final examination (40%)
Remarks:
The lecture materials are uploaded to CourseN@vi.
Assignments should be submitted via CourseN@vi.
Instructor:
Tetsuji OGAWA <ogawa.tetsuji__at__waseda.jp> (Please replace __at__ with @)
Tetsunori KOBAYASHI <koba__at__waseda.jp> (Please replace __at__ with @)
Term/Day/Period:
Spring semester / Tuesday / 2
Classroom:
52-202
Course Key:
5101061051
Main Language:
English
Syllabus & Lecture Materials:
The lecture materials for the "Pattern Recognition and Machine Learning" course are attached:
#01. 04/09 Introduction : (Koba) [slides]
The era of big data and machine learning
#02. 04/16 Embedding(1) : (Ogawa) [slides]
Principal component analysis (PCA), Linear discriminant analysis (LDA)
*in-class exercise (1) : PCA
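As an illustration of what in-class exercise (1) covers, here is a minimal NumPy sketch of PCA via eigendecomposition of the sample covariance matrix (the data and dimensions are invented for illustration; this is not the official exercise code):

```python
import numpy as np

def pca(X, k):
    """Project data onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                # center the data
    cov = Xc.T @ Xc / (len(X) - 1)         # sample covariance matrix
    vals, vecs = np.linalg.eigh(cov)       # eigenpairs, ascending order
    W = vecs[:, ::-1][:, :k]               # top-k eigenvectors
    return Xc @ W

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca(X, 2)
print(Z.shape)  # (100, 2)
```

The first projected coordinate carries the largest variance, the second the next largest, which is easy to verify on the output.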
#03. 04/23 Embedding(2) : (Ogawa)
Locality preserving projection (LPP), Local Fisher discriminant analysis (LFDA), t-SNE
#04. 05/07 Linear classification (1) : (Ogawa) [slides]
Discriminant function, Least squares, Perceptron
*in-class exercise (2) : Least squares
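As a preview of in-class exercise (2), a minimal sketch of least squares via the normal equations (the toy data here are an assumption, not the official exercise):

```python
import numpy as np

def least_squares(X, y):
    """Fit w minimizing ||Xw - y||^2 via the normal equations."""
    return np.linalg.solve(X.T @ X, X.T @ y)

# Recover y = 1 + 2x from noiseless samples.
x = np.linspace(0, 1, 20)
X = np.column_stack([np.ones_like(x), x])  # bias column + feature
w = least_squares(X, 1 + 2 * x)
print(w)  # ≈ [1. 2.]
```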
#05. 05/14 Linear classification (2) : (Ogawa)
Logistic regression
*Report (1) : Logistic regression
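For orientation on Report (1), a minimal sketch of logistic regression trained by gradient descent on the negative log-likelihood (the data, learning rate, and step count are illustrative assumptions):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def fit_logistic(X, y, lr=0.5, steps=500):
    """Gradient descent on the negative log-likelihood."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)   # d(NLL)/dw
        w -= lr * grad
    return w

# Separable 1-D toy data: label 1 when x > 0.
X = np.column_stack([np.ones(6), [-3, -2, -1, 1, 2, 3]])
y = np.array([0, 0, 0, 1, 1, 1])
w = fit_logistic(X, y)
pred = (sigmoid(X @ w) > 0.5).astype(int)
print(pred)  # [0 0 0 1 1 1]
```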
#06. 05/21 Kernel method : (Ogawa) [slides] [supplement]
Basics of kernel methods, Multiple kernel learning
*in-class exercise (3) : Kernel perceptron
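As an illustration of in-class exercise (3), a sketch of the kernel (dual) perceptron with an RBF kernel, which keeps per-example mistake counts instead of a weight vector (data and kernel parameter are illustrative assumptions):

```python
import numpy as np

def rbf(a, b, gamma=1.0):
    return np.exp(-gamma * np.sum((a - b) ** 2))

def kernel_perceptron(X, y, epochs=50):
    """Dual perceptron: alpha[i] counts mistakes on example i."""
    n = len(X)
    K = np.array([[rbf(X[i], X[j]) for j in range(n)] for i in range(n)])
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            if y[i] * np.sum(alpha * y * K[:, i]) <= 0:
                alpha[i] += 1           # update only on a mistake
    return alpha, K

# XOR-like data: not linearly separable in input space.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([-1, 1, 1, -1])
alpha, K = kernel_perceptron(X, y)
pred = np.sign(K @ (alpha * y))
print(pred)  # [-1.  1.  1. -1.], matching y
```

The RBF kernel makes the XOR labels separable in feature space, which a linear perceptron on the raw inputs cannot achieve.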
#07. 05/28 Sparse kernel machines : (Koba) [slides]
Support vector machine (SVM)
#08. 06/04 Neural networks (1) : (Koba) [slides]
Feed-forward neural network
*in-class exercise (4) : Backpropagation
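As a preview of in-class exercise (4), a minimal sketch of backpropagation for a one-hidden-layer network under squared-error loss (architecture, data, and learning rate are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(8, 3))
y = rng.normal(size=(8, 1))
W1 = 0.5 * rng.normal(size=(3, 4))       # input -> hidden
W2 = 0.5 * rng.normal(size=(4, 1))       # hidden -> output

losses = []
for step in range(300):
    h = np.tanh(X @ W1)                  # forward pass
    pred = h @ W2
    losses.append(np.mean((pred - y) ** 2))
    g_pred = 2 * (pred - y) / len(X)     # backward pass: chain rule
    g_W2 = h.T @ g_pred
    g_h = g_pred @ W2.T
    g_W1 = X.T @ (g_h * (1 - h ** 2))    # tanh'(z) = 1 - tanh(z)^2
    W1 -= 0.1 * g_W1
    W2 -= 0.1 * g_W2

print(losses[0], losses[-1])  # loss decreases over training
```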
#09. 06/11 Neural networks (2) : (Koba) [slides]
Convolutional neural network (CNN), Recurrent neural network (RNN)
#10. 06/18 Decision trees : (Koba) [slides]
Random forest, Gradient boosting decision tree (GBDT)
#11. 06/25 Mixture models and EM : (Ogawa) [slides]
Gaussian mixture models (GMMs), GMM parameter estimation using the EM algorithm, K-means clustering
*in-class exercise (5) : EM algorithm
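For orientation on in-class exercise (5), a minimal sketch of EM for a two-component 1-D GMM, alternating E-step responsibilities and M-step re-estimation (the synthetic data and initialization are illustrative assumptions):

```python
import numpy as np

def normal_pdf(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])

pi = np.array([0.5, 0.5])                # mixing coefficients
mu = np.array([-1.0, 1.0])
var = np.array([1.0, 1.0])
for _ in range(50):
    # E-step: responsibilities gamma[n, k]
    dens = np.stack([pi[k] * normal_pdf(x, mu[k], var[k]) for k in range(2)], axis=1)
    gamma = dens / dens.sum(axis=1, keepdims=True)
    # M-step: re-estimate parameters from responsibility-weighted data
    Nk = gamma.sum(axis=0)
    mu = (gamma * x[:, None]).sum(axis=0) / Nk
    var = (gamma * (x[:, None] - mu) ** 2).sum(axis=0) / Nk
    pi = Nk / len(x)

print(np.round(mu, 1))  # close to the true means -2 and 3
```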
#12. 07/02 Sequential data (1) : (Ogawa) [slides]
Hidden Markov models (HMMs), Forward algorithm, Viterbi algorithm
*in-class exercise (6) : Viterbi algorithm
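As an illustration of in-class exercise (6), a minimal sketch of the Viterbi algorithm in log space for a discrete HMM (the model parameters and observation sequence are invented for illustration):

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely state path for a discrete HMM, computed in log space."""
    T, N = len(obs), len(pi)
    logd = np.log(pi) + np.log(B[:, obs[0]])   # delta_1
    back = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        scores = logd[:, None] + np.log(A)     # delta_{t-1}(i) + log a_ij
        back[t] = scores.argmax(axis=0)        # best predecessor of each j
        logd = scores.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):              # backtrack
        path.append(int(back[t, path[-1]]))
    return path[::-1]

A = np.array([[0.7, 0.3], [0.4, 0.6]])   # transition probabilities
B = np.array([[0.9, 0.1], [0.2, 0.8]])   # emission probabilities
pi = np.array([0.6, 0.4])                # initial distribution
print(viterbi([0, 0, 1, 1, 1], pi, A, B))  # [0, 0, 1, 1, 1]
```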
#13. 07/09 Sequential data (2) : (Ogawa)
Backward algorithm, Baum-Welch algorithm
*in-class exercise (7) : Baum-Welch algorithm
*Report (2) : Traffic monitoring using HMM
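For orientation on in-class exercise (7), a minimal sketch of the forward-backward pass and one Baum-Welch re-estimation of the transition matrix (the HMM parameters and observations are illustrative assumptions; scaling is omitted, so this only suits short sequences):

```python
import numpy as np

def forward_backward(obs, pi, A, B):
    """Unscaled forward/backward passes for a discrete HMM."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    beta = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    return alpha, beta

def baum_welch_step(obs, pi, A, B):
    """One EM re-estimation of the transition matrix A."""
    alpha, beta = forward_backward(obs, pi, A, B)
    # xi[t, i, j] ∝ alpha_t(i) a_ij b_j(o_{t+1}) beta_{t+1}(j)
    xi = alpha[:-1, :, None] * A * (B[:, obs[1:]].T * beta[1:])[:, None, :]
    xi /= xi.sum(axis=(1, 2), keepdims=True)
    gamma = alpha * beta
    gamma /= gamma.sum(axis=1, keepdims=True)
    return xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]

A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.5, 0.5])
A_new = baum_welch_step([0, 0, 1, 1, 0], pi, A, B)
print(A_new.sum(axis=1))  # [1. 1.] — each row is a valid distribution
```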
#14. 07/16 Bayesian learning : (Ogawa) [slides]
Basics of Bayesian learning, Variational Bayes
*in-class exercise (8) : Derivation of posterior probability
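As a small numerical companion to in-class exercise (8), a sketch of a closed-form posterior via conjugacy, using a Beta prior with a Bernoulli likelihood (prior parameters and data are invented for illustration; the exercise itself concerns the derivation):

```python
import numpy as np

# Beta(a, b) prior on a coin's heads probability; Bernoulli likelihood.
# Conjugacy gives the posterior in closed form: Beta(a + heads, b + tails).
a, b = 2.0, 2.0
flips = np.array([1, 1, 0, 1, 1, 0, 1])          # observed data
heads = int(flips.sum())
tails = len(flips) - heads
a_post, b_post = a + heads, b + tails            # posterior parameters
print(a_post / (a_post + b_post))  # posterior mean = 7/11 ≈ 0.636
```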
#15. 07/23 Final Examination : (Koba & Ogawa)
Prerequisites:
This course assumes prior knowledge of linear algebra and calculus.
References:
Detailed, more advanced references are included in the lecture slides. References for acquiring basic knowledge are listed below:
C. Bishop (2006). Pattern recognition and machine learning, Springer.
C. Bishop (author); Hiroshi Motoda, Takio Kurita, Tomoyuki Higuchi, Yuji Matsumoto, Noboru Murata (translation supervisors) (2007). Pattern Recognition and Machine Learning, Vol. 1: Statistical Prediction Based on Bayesian Theory, Maruzen. (Japanese translation of the above)
C. Bishop (author); Hiroshi Motoda, Takio Kurita, Tomoyuki Higuchi, Yuji Matsumoto, Noboru Murata (translation supervisors) (2008). Pattern Recognition and Machine Learning, Vol. 2: Statistical Prediction Based on Bayesian Theory, Maruzen. (Japanese translation of the above)