Course Slides:
Yet to come (Recurrent neural networks, unsupervised adaptive filtering)
Course Contents:
Review of Signal Processing: Impulse response, Z-transform, system function, FIR filters and IIR filters
Review of random processes: moments of a random process, wide-sense stationarity, autocorrelation function and power spectral density.
Linear adaptive filters: Design of optimal filters, Wiener filters, minimum mean squared error (MMSE) estimation, linear prediction (LP) and LP coding of speech signals, stochastic gradient descent algorithms, namely the least mean squares (LMS) and recursive least squares (RLS) algorithms (a small LMS sketch follows this list), Kalman filters, extended Kalman filters and unscented Kalman filters.
Statistical approaches: Markov processes, hidden Markov models (HMMs), signal modelling and generation using HMMs.
Nonlinear filters: Introduction to kernel adaptive filtering, kernel LMS (KLMS) and kernel RLS (KRLS) algorithms, neural network architectures for sequence modelling, recurrent neural networks, long short-term memory (LSTM) networks and their applications to signal processing.
(If time permits) Unsupervised adaptive filtering: Blind source separation and blind deconvolution
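As a small illustration of the adaptive filtering material listed above, the following Python sketch implements the basic LMS algorithm for identifying an unknown FIR system. It is illustrative only: the signal length, step size mu, filter order and the "true" system taps are assumptions for the example, not values prescribed by the course.

import numpy as np

# Minimal LMS sketch (illustrative assumptions: signal length N, step size mu,
# filter length M and the unknown system h_true are not course-specified values).
rng = np.random.default_rng(0)
N, M, mu = 2000, 4, 0.01                    # samples, filter taps, step size

x = rng.standard_normal(N)                  # input signal
h_true = np.array([0.5, -0.3, 0.2, 0.1])    # unknown FIR system to identify
d = np.convolve(x, h_true)[:N] + 0.01 * rng.standard_normal(N)  # noisy desired signal

w = np.zeros(M)                             # adaptive filter weights
for n in range(M, N):
    u = x[n - M + 1:n + 1][::-1]            # most recent M input samples, newest first
    e = d[n] - w @ u                        # a priori estimation error
    w = w + mu * e * u                      # LMS weight update

print("estimated taps:", np.round(w, 3))    # should approach h_true

For a sufficiently small step size, the estimated weights converge in the mean toward the true system taps; the RLS and kernel variants covered in the course replace the simple gradient update with recursive least squares and kernel-based updates, respectively.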
Reference Books:
Monson H. Hayes, Statistical Digital Signal Processing and Modeling, John Wiley & Sons, 1996.
S. Haykin, Adaptive Filter Theory, Fourth Edition, Pearson Education LPE, 2007.
W. Liu, J. C. Principe and S. Haykin, Kernel Adaptive Filtering, John Wiley & Sons, 2010.
I. Goodfellow, Y. Bengio and A. Courville, Deep Learning, MIT Press, 2016.
M. A. Little, Machine Learning for Signal Processing, Oxford University Press, 2019.
Evaluation Criteria:
Homework & programming assignments - 30%
Course project - 30%
Periodic class tests - 15%
End exam - 25%