Information and Coding Theory
Lecturer: Jia-Chyi Wu (吳家琪)
Email: jcwu@ntou.edu.tw
Phone: (02)24622192 #7211
Course ID: M1801D8E
Credits: 3
Objective: How to measure information capacity; how to compress information; and how to transmit information efficiently over noisy channels.
Course Prerequisites: Probability and Random Processes; Signals and Systems
Outline:
1. Perfect Communication through a Noisy Channel: reliable transmission of data; the binary symmetric channel; error-correcting strategies: repetition; Shannon's Noisy Channel Coding Theorem; definitions of entropy and mutual information; measuring information content (a small illustrative sketch follows this outline)
2. Source Coding (Data Compression): why entropy; Shannon's Source Coding Theorem; arithmetic coding
3. Noisy Channel Coding Theorem: review of inference; mutual information; the data processing theorem
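
Below is a minimal illustrative sketch in Python (not part of the course materials) tying together several topics from outline item 1: the binary entropy function H2(p), the capacity C = 1 - H2(f) of a binary symmetric channel with flip probability f, and majority-vote decoding of the R3 repetition code. The value f = 0.1 and the message length are arbitrary choices for demonstration.

import math
import random

def binary_entropy(p):
    # H2(p) = -p*log2(p) - (1-p)*log2(1-p), in bits
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def send_through_bsc(bits, f, rng):
    # A binary symmetric channel flips each bit independently with probability f.
    return [b ^ (rng.random() < f) for b in bits]

def repetition_decode(bits, n):
    # Majority vote over each block of n repeated bits.
    return [int(sum(bits[i:i + n]) > n // 2) for i in range(0, len(bits), n)]

rng = random.Random(0)
f = 0.1                                            # illustrative flip probability
source = [rng.randint(0, 1) for _ in range(10000)]
encoded = [b for b in source for _ in range(3)]    # R3: repeat each bit 3 times
decoded = repetition_decode(send_through_bsc(encoded, f, rng), 3)
errors = sum(s != d for s, d in zip(source, decoded))
print(f"BSC capacity: C = 1 - H2(f) = {1 - binary_entropy(f):.3f} bits per use")
print(f"R3 decoded bit error rate: {errors / len(source):.4f} (raw f = {f})")

For f = 0.1, R3 lowers the bit error rate from 0.1 to roughly 3f^2(1-f) + f^3 ≈ 0.028, but only at rate 1/3; Shannon's theorem says reliable communication is possible at any rate below C ≈ 0.53 bits per use, which motivates the coding theorems covered in this course.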
Teaching Method: Lectures and in-class discussion
Reference:
1. T. M. Cover and J. A. Thomas, "Elements of Information Theory", Wiley, 1991.
2. D. J. C. MacKay, "Information Theory, Inference, and Learning Algorithms", Cambridge University Press, 2003.
3. R. M. Gray, "Entropy and Information Theory", Springer-Verlag, 2013.
Course Schedule (subject to change):
Lecture 1: Entropy, Mutual Information
Lecture 2: AEP and Entropy Rates
Lecture 3: Entropy Rates
Lecture 4: Data Compression I
Lecture 5: Data Compression II
Lecture 6: Dictionary-based Compression and Channel Capacity
Lecture 7: Channel Coding Theorem
Lecture 8: Channel Coding Theorem II
Lecture 9: Rate Distortion Theory
Lecture 10: Maximum Entropy Estimation
Lecture 11: Spectrum Estimation / R(D) Functions
Lecture 12: Network Information Theory
Lecture 13: Statistics, Information Theory, and the Stock Market
Evaluation:
1. Homework 20%
2. Midterm Exam 40%
3. Final Report 40%