INF242: Information Theory

Overview

What is information? The course deals with measures of the information content of digital symbols and sequences of symbols. Building on Shannon's information-theoretic framework, the course introduces the fundamental principles behind data compression, error-correcting codes, and cryptography.

Instructor / Email: Sachin J. Valera; sachin [dot] valera [at] uib [dot] no

UiB course page / Scheduling: Find the UiB course page here, and the timetable here.

Reading material: Elements of Information Theory (Cover & Thomas)
                  Information Theory, Inference, and Learning Algorithms (David MacKay)

Outline of Lectures

Lecture 1 (20/08/19): Crash course in probability

Lecture 2 (22/08/19): Crash course in probability

Lecture 3 (27/08/19): Overview, information sources, surprise function, introduction to entropy, Shannon's axiomatisation of entropy
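
As a taster for this lecture, a minimal Python sketch (not from the course materials; the example distribution is arbitrary) of the surprise function and Shannon entropy:

    import math

    def surprise(p):
        # Surprise (self-information) of an outcome of probability p, in bits.
        return -math.log2(p)

    def entropy(dist):
        # Shannon entropy H(X) = sum_x p(x) * (-log2 p(x)); zero-probability
        # outcomes contribute nothing by convention.
        return sum(p * surprise(p) for p in dist if p > 0)

    print(entropy([0.5, 0.25, 0.25]))  # 1.5 bits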

Lecture 4 (29/08/19) : Divergence & entropy (and some of their properties)

Lecture 5 (03/09/19): Further properties of entropy, (conditional) mutual information and its properties, Fano's inequality   
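
A small illustrative sketch of mutual information computed from a joint distribution (the joint table below is a made-up example, not taken from the lecture):

    import math

    def mutual_information(joint):
        # joint: dict (x, y) -> p(x, y). Returns I(X;Y) in bits,
        # I(X;Y) = sum_{x,y} p(x,y) log2[ p(x,y) / (p(x) p(y)) ].
        px, py = {}, {}
        for (x, y), p in joint.items():
            px[x] = px.get(x, 0) + p
            py[y] = py.get(y, 0) + p
        return sum(p * math.log2(p / (px[x] * py[y]))
                   for (x, y), p in joint.items() if p > 0)

    # X a uniform bit, Y = X with probability 0.9.
    joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
    print(mutual_information(joint))  # 1 - H(0.1) ≈ 0.531 bits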

Lecture 6 (05/09/19): Overview of source coding, introduction to symbol codes, unique decodability, prefix codes, Kraft's theorem and McMillan's theorem
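
Kraft's theorem says a d-ary prefix code with codeword lengths l_1, ..., l_n exists if and only if sum_i d^(-l_i) <= 1 (McMillan extends this to all uniquely decodable codes). A quick sketch of the check (the function name is mine):

    def kraft_sum(lengths, d=2):
        # Kraft sum over a d-ary code alphabet; a prefix code with these
        # codeword lengths exists iff the sum is at most 1.
        return sum(d ** -l for l in lengths)

    print(kraft_sum([1, 2, 3, 3]))  # 1.0  (e.g. the binary code {0, 10, 110, 111})
    print(kraft_sum([1, 1, 2]))     # 1.25 -> no uniquely decodable code exists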

Lecture 7 (10/09/19): Optimality, theoretical minimum for compression (source coding theorem for symbol codes), main idea behind Shannon's code, binary Huffman code and proof of its optimality 
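
A minimal sketch of the binary Huffman construction using Python's heapq (illustrative only, not the lecture's own presentation; names are mine):

    import heapq, itertools

    def huffman(freqs):
        # freqs: dict mapping symbol -> probability (or count).
        # Returns a dict mapping symbol -> binary codeword.
        count = itertools.count()  # tie-breaker so the heap never compares dicts
        heap = [(w, next(count), {s: ''}) for s, w in freqs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            w0, _, c0 = heapq.heappop(heap)  # merge the two least probable subtrees
            w1, _, c1 = heapq.heappop(heap)
            merged = {s: '0' + code for s, code in c0.items()}
            merged.update({s: '1' + code for s, code in c1.items()})
            heapq.heappush(heap, (w0 + w1, next(count), merged))
        return heap[0][2]

    print(huffman({'a': 0.5, 'b': 0.25, 'c': 0.125, 'd': 0.125}))
    # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'} (up to relabelling)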

Lecture 8 (12/09/19): Block coding, d-ary Shannon code, binary Shannon-Fano code

Lecture 9 (17/09/19): Shannon-Fano-Elias code, d-ary Huffman code and its optimality

Lecture 10 (19/09/19): Weakly/strongly typical sequences, weak Asymptotic Equipartition Property (AEP), lossy fixed-length block codes

Lecture 11 (24/09/19): Shannon's lossy fixed-length source coding theorem, Tunstall's algorithm

Lecture 12 (26/09/19): Analysis of Tunstall codes, Markov chains revisited, entropy rate of stochastic process
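
A small sketch of the entropy rate of a stationary Markov chain, H = sum_i pi_i H(P_i), where P is the transition matrix and pi its stationary distribution (the two-state chain below is an arbitrary example of mine):

    import math

    def entropy_rate(P, pi):
        # Entropy rate of a stationary Markov chain: weight the entropy of
        # each row of the transition matrix by the stationary probability.
        return sum(pi[i] * sum(-p * math.log2(p) for p in row if p > 0)
                   for i, row in enumerate(P))

    P  = [[0.9, 0.1],
          [0.2, 0.8]]
    pi = [2/3, 1/3]              # solves pi P = pi for this chain
    print(entropy_rate(P, pi))   # ≈ 0.553 bits per symbol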

Lecture 12 (continued): see Tutorial 5

Lecture 13 (01/10/19): Discrete memoryless channels (DMCs), channel capacity, weakly symmetric DMCs
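
For a concrete instance of channel capacity, the binary symmetric channel with crossover probability p (a standard example, chosen here on my own initiative) has C = 1 - H(p):

    import math

    def h2(p):
        # Binary entropy function H(p) in bits.
        return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(p):
        # Capacity of the binary symmetric channel with crossover probability p.
        return 1.0 - h2(p)

    print(bsc_capacity(0.11))  # ≈ 0.5 bits per channel use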

Lecture 14 (03/10/19): Channel capacity II (Convexity, Z-channel, product channels, sum channels)

Lecture 15 (08/10/19): Channel codes, Shannon's noisy-channel coding theorem, joint AEP

Lecture 16 (10/10/19): Typical set decoder, proof of Shannon's noisy-channel coding theorem

Lecture 17 (17/10/19): Feedback channel, lower bound for compression of a stationary source, source-channel separation

Lecture 18 (18/10/19): Differential Entropy I (Continuous AEP, continuous information measures and their properties)

Lecture 19 (22/10/19): Differential Entropy II (Relation of measures to discrete analogues, n-bit quantisation, max differential entropy, entropic central limit theorem)

Lecture 20 (25/10/19): Gaussian Channel I (Infinite capacity Gaussian channels, power-constrained Gaussian channels)
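
The power-constrained Gaussian channel has capacity C = (1/2) log2(1 + P/N) bits per use; a one-line check (the parameter values are arbitrary):

    import math

    def gaussian_capacity(P, N):
        # Capacity of the power-constrained Gaussian channel:
        # C = (1/2) * log2(1 + P/N) bits per channel use.
        return 0.5 * math.log2(1 + P / N)

    print(gaussian_capacity(P=15.0, N=1.0))  # 2.0 bits per channel use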

Lecture 21 (29/10/19): Gaussian Channel II (Channel coding over the Gaussian channel (sphere-packing sketch), parallel Gaussian channels and water-filling)

Lecture 22 (05/11/19): Gaussian Channel III (Bandlimited Gaussian channel, ultimate Shannon limit, assorted facts)
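
The "ultimate Shannon limit" here refers to the bandwidth-to-infinity limit, where reliable communication requires Eb/N0 >= ln 2; in decibels:

    import math

    # Ultimate Shannon limit: Eb/N0 >= ln 2 as bandwidth grows without bound.
    print(10 * math.log10(math.log(2)))  # ≈ -1.59 dB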

Lecture 23 (07/11/19): Perfect Secrecy I (Overview of cryptosystems, basic ciphers, confusion & diffusion, spurious keys, unicity distance)

Lecture 24 (12/11/19): Perfect Secrecy II (Perfect secrecy and its consequences, order of coding, product cryptosystems)

Lecture 25 (14/11/19): Perfect Secrecy III (Stream ciphers/pseudorandom keystreams, known-plaintext attack on linear-feedback shift register, connection polynomials, epsilon-secrecy)
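
A minimal sketch of a linear-feedback shift register keystream generator (the seed and taps below are illustrative choices of mine; this tap set corresponds to the primitive connection polynomial x^4 + x^3 + 1, giving the maximal period 2^4 - 1 = 15):

    def lfsr(seed, taps, n):
        # Generate n keystream bits from a Fibonacci LFSR.
        # seed: initial register contents (list of bits);
        # taps: positions XORed together to form the feedback bit.
        state = list(seed)
        out = []
        for _ in range(n):
            out.append(state[-1])            # output the last register bit
            fb = 0
            for t in taps:
                fb ^= state[t]
            state = [fb] + state[:-1]        # shift, feeding back fb
        return out

    print(lfsr([1, 0, 0, 1], taps=[0, 3], n=15))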

Lecture 26 (19/11/19): Simmons' theory of authentication, overview of coding theory, basics of linear codes 

Lecture 27 (21/11/19): Hamming code, (asymptotic) Gilbert-Varshamov & Hamming bounds (and their connection to Shannon's noisy-channel coding theorem), perfect codes, dual codes
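
A sketch of the [7,4] Hamming code in systematic form (one standard choice of G and H; conventions differ between texts):

    # [7,4] Hamming code in systematic form: G = [I | A], H = [A^T | I].
    G = [[1, 0, 0, 0, 1, 1, 0],
         [0, 1, 0, 0, 1, 0, 1],
         [0, 0, 1, 0, 0, 1, 1],
         [0, 0, 0, 1, 1, 1, 1]]
    H = [[1, 1, 0, 1, 1, 0, 0],
         [1, 0, 1, 1, 0, 1, 0],
         [0, 1, 1, 1, 0, 0, 1]]

    def encode(msg):
        # Codeword = msg * G over GF(2); msg is a list of 4 bits.
        return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

    def syndrome(word):
        # H * word^T over GF(2); the zero vector iff word is a codeword.
        return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

    def correct(word):
        # Single-error correction: flip the bit whose column of H
        # matches the syndrome.
        s = tuple(syndrome(word))
        if any(s):
            word = word[:]
            word[list(zip(*H)).index(s)] ^= 1
        return word

    c = encode([1, 0, 1, 1])
    r = c[:]; r[2] ^= 1        # introduce a single bit error
    print(correct(r) == c)     # True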

Problem Classes

Tutorial 1 (30/08/19): Exercises on probability theory

Tutorial 2 (06/09/19): Exercises on entropy 

Tutorial 3 (13/09/19): Exercises on information measures 

Tutorial 4 (20/09/19): Exercises on source coding

Tutorial 5 (27/09/19): Continuation of Lecture 12: entropy rate of stationary Markov chain, comments on general AEP, LZ78 compression and sketch of its asymptotic optimality 

Tutorial 6 (04/10/19): Exercises on parsing and entropy rates

Tutorial 7 (11/10/19): Exercises on channel capacity

Tutorial 8 (15/11/19): Exercises on differential entropy

Tutorial 9 (22/11/19): Exercises on Gaussian channel

Revision Sessions

Session 1 (07/11/19, 18.00): Entropy and information measures (with pizza!) (Chapter 1)

Session 2 (26/11/19): Generalities, pointers, exam preparation, Q&A (Chapters 2-3)

Session 3 (28/11/19): Generalities, pointers, exam preparation, selected problems, Q&A (Chapters 2-3)

Session 4 (29/11/19): Generalities, Q&A, selected problems from chapter 4, final tips and pointers 

You can find the course files here, and some additional interesting content here.