COURSE DESCRIPTION
This course covers classical and current theory and practice of supervised and unsupervised pattern recognition, focusing on the most widely used methods in the field. One part of the course deals with the design of the classifier in a pattern recognition system, which includes Bayesian classifiers, linear classifiers (the perceptron algorithm, logistic discrimination), and nonlinear classifiers (multi-layer perceptrons, polynomial classifiers, radial basis function networks, support vector machines). Further goals of the course are (1) to study methods for reducing the number of features to a sufficient minimum, since the number of features at the disposal of the designer of a classification system is usually very large; (2) to study data transformation and dimensionality reduction, mapping a set of measurements to a new set of features (Karhunen-Loève Transformation, Independent Component Analysis, Discrete Fourier Transform, Discrete Wavelet Transform); (3) to examine feature generation techniques specific to image analysis applications (regional features, shape and size characterization); and (4) to study the unsupervised case of pattern recognition, in which the organization of patterns into clusters is revealed (clustering algorithms).
CONTENTS
1. Linear Classifiers
1.1 Linear Discriminant Functions and Decision Hyperplanes
1.2 The Perceptron Algorithm
1.3 Variants of the Perceptron Algorithm
1.4 Least Squares Methods
1.5 Support Vector Machines
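As a flavor of the programming projects, here is a minimal Python sketch of the perceptron algorithm from Section 1.2 (not course-provided code; the toy data and learning rate are illustrative assumptions):

```python
import numpy as np

def perceptron(X, y, lr=1.0, epochs=100):
    """Perceptron algorithm: iteratively correct misclassified points.
    X: (n, d) samples; y: labels in {-1, +1}. Returns weights w (bias appended)."""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # augment with a bias feature
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:         # misclassified (or on the boundary)
                w += lr * yi * xi          # move the hyperplane toward the point
                errors += 1
        if errors == 0:                    # converged: data linearly separated
            break
    return w

# Toy linearly separable data (illustrative)
X = np.array([[2.0, 2.0], [1.5, 3.0], [-1.0, -1.5], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron(X, y)
preds = np.sign(np.hstack([X, np.ones((4, 1))]) @ w)
```

For linearly separable data the loop is guaranteed to terminate; the convergence proof is part of the Section 1.2 material.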
2. Nonlinear Classifiers
2.1 The XOR Problem
2.2 The Two-Layer Perceptron
2.3 Multi-Layer Perceptron
2.4 The Back Propagation Algorithm
2.5 The Cost Function Choice
2.6 Generalized Linear Classifiers
2.7 Polynomial Classifiers
2.8 Radial Basis Function Networks
2.9 Support Vector Machines
2.10 Logistic Classifier
3. Feature Selection & Generation
3.1 Outlier Removal, Normalization & Missing Data
3.2 Peaking Phenomenon & Hypothesis Testing
3.3 Principal Component Analysis (PCA)
3.4 The Singular Value Decomposition
3.5 Independent Component Analysis
3.6 The Discrete Fourier Transform
3.7 Regional Features
3.8 Parametric Models
3.9 Shape and Size
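The dimensionality-reduction ideas of Sections 3.3 and 3.4 connect directly: PCA can be computed from the SVD of the centered data matrix. A minimal Python sketch (illustrative only; the random data is an assumption, not course material):

```python
import numpy as np

def pca(X, k):
    """PCA via the SVD: project centered data onto the k directions
    of maximal variance. X: (n, d); returns the (n, k) score matrix."""
    Xc = X - X.mean(axis=0)                # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                   # scores along the top-k principal axes

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
Z = pca(X, 2)                              # 5-dimensional data reduced to 2
```

Because the principal directions are orthonormal, the resulting score columns are uncorrelated, which is the key property exploited when reducing features to a sufficient minimum.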
4. Clustering
4.1 Definitions
4.2 Proximity Measures
4.3 Sequential Clustering Algorithms
4.4 Agglomerative Algorithms
4.5 Vector Quantization
4.6 Competitive Learning Algorithms
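As a taste of the clustering unit, a minimal Python sketch of k-means, a basic scheme in the spirit of the vector quantization and competitive learning topics of Sections 4.5 and 4.6 (illustrative only; the toy blobs are an assumption):

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Basic k-means: alternate assigning each point to its nearest
    centroid and re-estimating each centroid as its cluster mean."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centroids[None], axis=2)  # (n, k) distances
        labels = d.argmin(axis=1)                                 # nearest centroid
        new = np.array([X[labels == j].mean(axis=0) if (labels == j).any()
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):    # assignments stable: stop
            break
        centroids = new
    return labels, centroids

# Two well-separated toy blobs (illustrative)
X = np.vstack([np.random.default_rng(1).normal(0, 0.3, (20, 2)),
               np.random.default_rng(2).normal(5, 0.3, (20, 2))])
labels, centroids = kmeans(X, 2)
```

This is the unsupervised setting described above: no class labels are given, and the algorithm reveals the organization of the patterns into clusters on its own.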
Term: Spring 2017.
Time and Location: 16:00 – 16:50 on Monday, Wednesday, and Friday in Classroom LA 206.
Grading: There will be four written exams and programming projects. Exam 1 (15%), Exam 2 (15%), Exam 3 (15%), Exam 4 (15%), Projects (40%).
Bibliography: (1) Pattern Recognition by Sergios Theodoridis and Konstantinos Koutroumbas, Academic Press, 4th edition; (2) Neural Networks and Learning Machines by Simon Haykin, Prentice Hall, 3rd edition.