Title: Information Theory and Coding Techniques

Semester: Sixth Semester B.Tech. (ECE)

Credits: 3-0-0 (3 credits)

Prerequisites: Probability theory


Information measures; the Asymptotic Equipartition Property (AEP); characterization of information sources; entropy rate of a stochastic process. Introduction to lossless data compression (source coding) for discrete sources: Shannon's noiseless source coding theorem, the Kraft-McMillan inequality for prefix-free codes, Shannon-Fano-Elias codes, and Huffman codes. Universal source coding: arithmetic coding and Lempel-Ziv coding. Introduction to rate-distortion theory. Discrete channel characterization, channel capacity, Shannon's noisy-channel coding theorem, and reliability exponents. Differential entropy and an introduction to the Gaussian channel. Introduction to error control codes: Hamming codes and linear block codes.
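As an illustrative taste of the source-coding portion of the syllabus (a sketch for orientation, not part of the prescribed course material), the snippet below computes the entropy of a small dyadic source, derives binary Huffman codeword lengths, and checks the Kraft-McMillan inequality; the probabilities are an arbitrary example chosen so the average codeword length meets the entropy exactly.

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p) of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code for the given probabilities."""
    # Heap entries: (probability, unique tie-breaker id, symbol indices in subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    uid = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # merging two subtrees deepens every leaf in both
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, uid, s1 + s2))
        uid += 1
    return lengths

probs = [0.5, 0.25, 0.125, 0.125]           # example dyadic source
H = entropy(probs)                          # = 1.75 bits
L = huffman_lengths(probs)                  # = [1, 2, 3, 3]
avg_len = sum(p * l for p, l in zip(probs, L))  # = 1.75, meets H exactly here
kraft = sum(2 ** -l for l in L)             # Kraft sum; <= 1 for prefix-free codes
print(H, L, avg_len, kraft)
```

For dyadic probabilities the Huffman code is optimal with zero redundancy; for general sources the average length satisfies H <= avg_len < H + 1, as covered under Shannon's noiseless source coding theorem.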


Textbooks/References:

1.     Thomas M. Cover and Joy A. Thomas, “Elements of Information Theory”, 2nd Edition, Wiley Interscience, 2006.

2.     David MacKay, “Information Theory, Inference, and Learning Algorithms”, Cambridge University Press, 2003. (A free, legal online copy is available at http://www.inference.phy.cam.ac.uk/mackay/itila/book.html.)

3.     Robert B. Ash, “Information Theory”, Dover Publications, 1990.

4.     Claude E. Shannon, “A Mathematical Theory of Communication”, Bell System Technical Journal, vol. 27, pp. 379–423 and 623–656, 1948.