Information Theory (ECE 1134)
L T P C
3 0 1 4
Instructor: Anjan Kumar Talukdar
email id: anjantalukdar@gauhati.ac.in
Teaching assistant: Mrs. Ananya Choudhury
email id: a.choudhury50@gmail.com
Class timing:
Monday: 11:30 AM - 12:30 PM (Room no. 3)
Wednesday: 11:30 AM - 12:30 PM (Room no. 3) and
2:30 PM - 4:30 PM (Practical) (SPC Lab)
Thursday: 9:30 AM - 10:30 AM (Room no. 3)
Course objective:
The course is an advanced treatment of different coding methods associated with information systems.
Course outcomes:
At the end of the semester, students will be able to:
1. Measure the information content of a source and implement various source encoder and decoder algorithms.
2. Explain various types of channels and implement various error-correcting encoder and decoder algorithms.
Grading policy:
First class test: 20 marks
Second class test: 20 marks
Third class test/assignment: 10 marks
End-semester examination: 50 marks
Grading (absolute):
Marks obtained   Letter grade   Grade point
91 - 100         O              10
81 - 90          A+             9
71 - 80          A              8
61 - 70          B+             7
51 - 60          B              6
41 - 50          C              5
31 - 40          D              4
< 31             F              0
Programming:
Programming will be done in MATLAB.
Prerequisites:
A good background in linear algebra, calculus, probability theory, and digital communication, along with working knowledge of MATLAB.
Syllabus:
Module 1:
Review of the sampling theorem; practical aspects of sampling; quantization of analog signals; spectra of quantization; waveform coding: PCM, ADPCM, delta modulation, ADM; bit rate and SNR calculation; mean and prediction coding. Baseband shaping; binary data formats: NRZ, RZ, and Manchester; baseband transmission; ISI and its effects; synchronization and applications; correlative coding; eye pattern; adaptive equalization for data transmission; data reception; matched filter; optimum SNR. Introduction to information theory: information and sources; uniquely decodable codes; instantaneous codes; construction of an instantaneous code; Kraft's inequality. Coding information sources: the average length of a code.
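Kraft's inequality, listed above, states that a binary instantaneous (prefix) code with codeword lengths l_1, ..., l_n exists if and only if the lengths satisfy sum(2^-l_i) <= 1. Course programming is in MATLAB, but as an illustration (function names are my own), the check is a one-liner in any language; here is a Python sketch:

```python
# Kraft's inequality: a binary prefix (instantaneous) code with codeword
# lengths l_1, ..., l_n exists iff sum(2 ** -l_i) <= 1.
def kraft_sum(lengths):
    """Return the Kraft sum for a list of binary codeword lengths."""
    return sum(2.0 ** -l for l in lengths)

def prefix_code_exists(lengths):
    """True iff some binary prefix code has exactly these lengths."""
    return kraft_sum(lengths) <= 1.0

# Lengths {1, 2, 3, 3} satisfy the inequality (sum = 1.0),
# realized e.g. by the prefix code {0, 10, 110, 111}.
print(prefix_code_exists([1, 2, 3, 3]))  # True
# Lengths {1, 1, 2} give sum = 1.25 > 1: no binary prefix code exists.
print(prefix_code_exists([1, 1, 2]))     # False
```

Equality (Kraft sum exactly 1) corresponds to a complete code, where every leaf of the code tree is used.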
Module 2:
Encoding for special sources; Shannon's theorems; Shannon's theorem for the binary symmetric channel; entropy and source coding; lossless coding techniques, including Huffman codes, arithmetic codes, and Lempel-Ziv coding; lossy coding techniques; Shannon's coding theorem; channel codes, including linear block codes, cyclic codes, BCH codes, and convolutional codes; finding binary compact codes; Huffman's code; r-ary compact codes; code efficiency and redundancy.
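Huffman coding, one of the lossless techniques in this module, repeatedly merges the two least probable symbols and prefixes a bit to each side of the merge. Course work is in MATLAB; purely as an illustration (names are my own), a minimal Python sketch is:

```python
import heapq
from math import log2

def huffman_code(probs):
    """Build a binary Huffman code for probs: symbol -> probability.
    Returns a dict mapping each symbol to its codeword string."""
    # Heap entries: (probability, tie-break counter, symbols in subtree).
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    codes = {s: "" for s in probs}
    counter = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees, prefixing 0 / 1.
        p1, _, syms1 = heapq.heappop(heap)
        p2, _, syms2 = heapq.heappop(heap)
        for s in syms1:
            codes[s] = "0" + codes[s]
        for s in syms2:
            codes[s] = "1" + codes[s]
        counter += 1
        heapq.heappush(heap, (p1 + p2, counter, syms1 + syms2))
    return codes

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
codes = huffman_code(probs)
avg_len = sum(probs[s] * len(codes[s]) for s in probs)
entropy = -sum(p * log2(p) for p in probs.values())
print(codes, avg_len, entropy)
```

For this dyadic source the average codeword length equals the entropy (1.75 bits/symbol), the best case allowed by Shannon's source coding theorem.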
Module 3:
Channels and mutual information: information channels; trellis-coded modulation; probability relations in a channel; a priori and a posteriori entropies; generalization of Shannon's first theorem; mutual information and its properties; noiseless and deterministic channels.
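For the binary symmetric channel of Module 2, the mutual information studied here has the closed form I(X;Y) = H(Y) - h2(eps), where h2 is the binary entropy function and eps the crossover probability. Though the course uses MATLAB, an illustrative Python sketch (function names are my own) is:

```python
from math import log2

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_mutual_information(p_x1, eps):
    """I(X;Y) for a BSC with crossover probability eps and input P(X=1) = p_x1.
    I(X;Y) = H(Y) - H(Y|X), and for the BSC H(Y|X) = h2(eps)."""
    p_y1 = p_x1 * (1 - eps) + (1 - p_x1) * eps  # output distribution
    return h2(p_y1) - h2(eps)

# The uniform input maximizes I(X;Y), giving capacity C = 1 - h2(eps).
print(bsc_mutual_information(0.5, 0.1))  # about 0.531 bits per channel use
```

Sweeping p_x1 from 0 to 1 and plotting I(X;Y) is a quick way to see that the maximum (the channel capacity) sits at the uniform input.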
Module 4:
Cascaded channels; channel capacity; conditional mutual information. Reliable messages through unreliable channels: error probability and decision rules; the Fano bound; Hamming distance; random coding; ensemble performance analysis of block and convolutional codes. Introduction to linear block codes; cyclic codes; burst-error detecting and correcting codes; decoding algorithms for convolutional codes; ARQ codes; performance of codes.
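The Hamming distance in this module governs what a block code can do: a code with minimum distance d_min detects up to d_min - 1 errors and corrects up to floor((d_min - 1)/2). As an illustrative sketch in Python (the course itself uses MATLAB; names are my own), checked on the (3,1) repetition code:

```python
from itertools import combinations

def hamming_distance(a, b):
    """Number of positions in which two equal-length codewords differ."""
    return sum(x != y for x, y in zip(a, b))

def minimum_distance(codewords):
    """Minimum pairwise Hamming distance over a block code."""
    return min(hamming_distance(a, b) for a, b in combinations(codewords, 2))

# (3,1) binary repetition code: d_min = 3,
# so it detects up to 2 errors and corrects 1.
rep_code = ["000", "111"]
d_min = minimum_distance(rep_code)
print(d_min, "detects", d_min - 1, "corrects", (d_min - 1) // 2)
```

The same two functions work on any list of equal-length codewords, so they can be reused to verify d_min for the linear block and cyclic codes covered later in the module.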
Suggested readings:
Simon Haykin, “Digital Communication”
Richard B. Wells, “Applied Coding and Information Theory for Engineers”
Ranjan Bose, “Information Theory, Coding and Cryptography”