EC-604B: Information Theory and Coding

Lecture 1: Introduction

Lecture 2: Entropy and mutual information
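
As a concrete companion to this lecture, a minimal sketch of the two central quantities (function names are illustrative, not from the course; mutual information is computed via the identity I(X;Y) = H(X) + H(Y) - H(X,Y)):

```python
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p (zero-mass outcomes skipped)."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint pmf given as a 2-D list."""
    px = [sum(row) for row in joint]            # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (column sums)
    hxy = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - hxy

# A fair coin has exactly 1 bit of entropy.
print(entropy([0.5, 0.5]))  # 1.0
```

For independent X and Y the joint factors and I(X;Y) = 0; for perfectly correlated bits it equals 1 bit.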

Lecture 3: Chain rules and inequalities

Lecture 4: Data processing, Fano's inequality

Lecture 5: Asymptotic equipartition property

Lecture 6: Entropy rate

Lecture 7: Source coding and Kraft inequality
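
A quick numerical check of the Kraft inequality (a sketch; the helper name is my own): a binary prefix code with codeword lengths l_1, ..., l_m exists if and only if the Kraft sum is at most 1.

```python
def kraft_sum(lengths):
    """Kraft sum  sum_i 2^{-l_i}  for binary codeword lengths; a prefix
    code with these lengths exists iff the sum is at most 1."""
    return sum(2.0 ** -l for l in lengths)

# Lengths {1, 2, 3, 3} are feasible (sum = 1); lengths {1, 1, 2} are not.
print(kraft_sum([1, 2, 3, 3]))  # 1.0
print(kraft_sum([1, 1, 2]))     # 1.25
```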

Lecture 8: Optimal code length and Shannon code

Lecture 9: Huffman codes
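
The greedy merge at the heart of Huffman coding can be sketched in a few lines (an illustrative implementation, not the course's code; it returns codeword lengths rather than the codewords themselves):

```python
import heapq

def huffman_lengths(probs):
    """Optimal binary prefix-code lengths via Huffman's algorithm.
    Heap entries are (probability, tiebreak, symbol indices); the tiebreak
    keeps comparisons well-defined when probabilities are equal."""
    lengths = [0] * len(probs)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    tiebreak = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least-probable nodes
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:                 # each merge adds one bit to every merged symbol
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, tiebreak, s1 + s2))
        tiebreak += 1
    return lengths

# Dyadic probabilities: lengths match -log2(p) and the code meets entropy exactly.
print(huffman_lengths([0.5, 0.25, 0.125, 0.125]))  # [1, 2, 3, 3]
```

The resulting lengths always satisfy the Kraft inequality with equality, since Huffman trees are full binary trees.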

Lecture 10: Shannon-Fano-Elias and arithmetic codes

Lecture 11: Maximum entropy

Lecture 12: Channel capacity
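
The standard worked example for this lecture is the binary symmetric channel, whose capacity C = 1 - H(p) follows from maximizing I(X;Y) over input distributions. A small sketch (helper names are my own):

```python
import math

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)

print(bsc_capacity(0.5))  # 0.0: a pure-noise channel carries no information
```

Note the symmetry C(p) = C(1 - p): a channel that flips every bit is as useful as one that flips none, since the receiver can invert its output.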

Lecture 13: Channel coding theorem, joint typicality

Lecture 14: Proof of channel coding theorem (Notes)

Lecture 15: Hamming codes and Viterbi algorithm

Lecture 16: Feedback channel, source-channel separation theorem (Notes)

Lecture 17: Differential entropy

Lecture 18: Gaussian channel

Lecture 19: Parallel Gaussian channel and water-filling
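
The water-filling allocation P_i = max(0, nu - N_i) can be computed numerically by solving for the water level nu with bisection. A minimal sketch under that formulation (function name and bisection bounds are my own choices):

```python
def water_filling(noise, total_power, tol=1e-9):
    """Allocate total_power across parallel Gaussian channels with the given
    noise levels: P_i = max(0, nu - N_i), with the water level nu found by
    bisection so the allocations sum to total_power."""
    lo, hi = min(noise), max(noise) + total_power  # used power is 0 at lo, >= total_power at hi
    while hi - lo > tol:
        nu = (lo + hi) / 2
        used = sum(max(0.0, nu - n) for n in noise)
        if used > total_power:
            hi = nu
        else:
            lo = nu
    nu = (lo + hi) / 2
    return [max(0.0, nu - n) for n in noise]

# Noise levels 1, 2, 3 with total power 3: water level 3, so the noisiest
# channel gets no power.
print(water_filling([1.0, 2.0, 3.0], 3.0))  # approximately [2.0, 1.0, 0.0]
```

Bisection works because the used power is nondecreasing in nu; quieter channels fill first, and channels with noise above the water level are switched off entirely.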

Lecture 20: Quantization and rate-distortion

Lecture 21: Rate-distortion theorem (Notes on calculating R(D))

Lecture 22: Final review and future topics
