Instructor: Sumegha Garg (email: sumegha.garg@rutgers.edu)
Schedule: Wed 12:10 - 1:30 pm and Fri 2:00 - 3:20 pm at HLL-116 (Busch Campus)
Office Hours: Thursday 4-5 pm at CoRE 318A
Syllabus: The full course syllabus is here. This page will be updated with the current syllabus as the course progresses.
Prerequisites: Undergraduate courses on algorithms, complexity theory, discrete mathematics, and probability, as well as mathematical maturity. Familiarity with the topics of graduate courses on computability theory and algorithms (such as CS 509 and CS 513) is strongly encouraged.
Course Overview
The goal of the course is to demonstrate the usefulness of information-theoretic concepts and techniques in various sub-fields of complexity theory. The course begins with fundamental concepts of information theory, such as Shannon entropy, mutual information, and KL divergence. We will then explore lower-bound techniques built on information theory, with applications spanning circuit lower bounds, communication complexity, streaming and data-structure lower bounds, learning theory, and more.
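For a quick feel for these quantities, here is a minimal Python sketch (illustrative only, not official course material) computing entropy, KL divergence, and mutual information for a toy joint distribution:

    import math

    def entropy(p):
        # Shannon entropy H(p) in bits; terms with p(x) = 0 contribute 0.
        return -sum(x * math.log2(x) for x in p if x > 0)

    def kl(p, q):
        # KL divergence D(p || q) in bits; assumes q(x) > 0 wherever p(x) > 0.
        return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

    # Toy joint distribution of (X, Y) on {0,1} x {0,1}; rows indexed by X.
    joint = [[0.4, 0.1],
             [0.1, 0.4]]
    px = [sum(row) for row in joint]                # marginal of X
    py = [sum(col) for col in zip(*joint)]          # marginal of Y
    flat = [v for row in joint for v in row]
    prod = [a * b for a in px for b in py]          # product of the marginals

    print("H(X)   =", entropy(px))       # 1.0 bit
    # Mutual information as a KL divergence: I(X;Y) = D(joint || px * py).
    print("I(X;Y) =", kl(flat, prod))    # about 0.278 bits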
Grading
Problem sets (30%): 2-3 assignments over the semester
Scribe notes (30%)
Final Project (40%): report + presentation
Lecture Notes
Lecture 1 -- Plans for the course; motivation for the entropy function
Scribe [pdf]
References:
Lecture 2 -- Conditional entropy, mutual information, and KL divergence
Scribe [pdf]
References:
Lecture 3 -- Applications of information theory in counting and statistics
Scribe [pdf]
References:
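A classical example of entropy-based counting (one standard instance of the technique; the lecture may use different examples) is the entropy bound on binomial coefficients, C(n, pn) <= 2^{n H(p)}, which is easy to sanity-check numerically:

    import math

    def h2(p):
        # Binary entropy function H(p) in bits, for 0 < p < 1.
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    n, k = 100, 30
    print(math.log2(math.comb(n, k)))    # about 84.6
    print(n * h2(k / n))                 # about 88.1, an upper bound as claimed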
Lecture 4 -- Chernoff bounds, data compression
Scribe [pdf]
References:
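The Chernoff bound has a clean information-theoretic form: for n fair coin flips, Pr[at least qn heads] <= 2^{-n D(q || 1/2)}, where D is the binary KL divergence. A small numerical comparison (an illustrative sketch, not taken from the lecture):

    import math

    def d2(a, b):
        # Binary KL divergence D(a || b) in bits, for 0 < a, b < 1.
        return a * math.log2(a / b) + (1 - a) * math.log2((1 - a) / (1 - b))

    n, q = 200, 0.65
    k = math.ceil(q * n)
    exact = sum(math.comb(n, i) for i in range(k, n + 1)) / 2**n
    bound = 2 ** (-n * d2(q, 0.5))       # Chernoff bound in KL form
    print(f"exact tail = {exact:.2e} <= bound = {bound:.2e}")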
Lecture 5 -- Huffman encoding, Shannon's source coding theorem
Scribe [pdf]
References:
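A minimal sketch of the greedy Huffman construction (repeatedly merge the two least-likely nodes; the symbol names and representation here are ad hoc):

    import heapq

    def huffman(freqs):
        # Heap entries: (weight, unique tie-breaker, {symbol: codeword so far}).
        heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freqs.items())]
        heapq.heapify(heap)
        while len(heap) > 1:
            w1, _, c1 = heapq.heappop(heap)
            w2, i, c2 = heapq.heappop(heap)
            merged = {s: "0" + c for s, c in c1.items()}
            merged.update({s: "1" + c for s, c in c2.items()})
            heapq.heappush(heap, (w1 + w2, i, merged))
        return heap[0][2]

    # A dyadic source: optimal code lengths equal -log2 of the probabilities.
    print(huffman({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))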
Lecture 6 -- Kolmogorov complexity
Scribe [pdf]
References:
Lecture 7 -- Error correcting codes, Hamming bound
Scribe [pdf]
References:
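The Hamming (sphere-packing) bound itself is a one-liner to compute: disjoint Hamming balls of radius t = floor((d-1)/2) around the codewords must fit inside the whole space. A quick sketch:

    import math

    def hamming_bound(n, d, q=2):
        # Max codewords in a q-ary code of length n and minimum distance d.
        t = (d - 1) // 2
        ball = sum(math.comb(n, i) * (q - 1)**i for i in range(t + 1))
        return q**n // ball

    # The [7,4] Hamming code is perfect: it meets the bound exactly.
    print(hamming_bound(7, 3))    # 16 = 2^4 codewords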
Lecture 8 -- Joint AEP (asymptotic equipartition property), noisy channel coding theorem
Scribe [pdf]
References:
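For the binary symmetric channel, the noisy channel coding theorem gives capacity C = 1 - H(p) at crossover probability p; a quick table (illustrative sketch):

    import math

    def h2(p):
        # Binary entropy in bits, with H(0) = H(1) = 0 by convention.
        return -p*math.log2(p) - (1-p)*math.log2(1-p) if 0 < p < 1 else 0.0

    for p in (0.0, 0.05, 0.11, 0.5):
        print(f"BSC({p}): capacity = {1 - h2(p):.3f} bits per channel use")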
Lecture 9 -- Proof of noisy channel coding theorem
Scribe [pdf]
References:
Lecture 10 -- Hamming codes, linear codes
Scribe [pdf]
References:
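A concrete [7,4] Hamming code in one standard systematic form (conventions for G and H vary across texts), including single-error correction via the syndrome:

    import numpy as np

    G = np.array([[1,0,0,0,0,1,1],      # generator matrix [I_4 | P]
                  [0,1,0,0,1,0,1],
                  [0,0,1,0,1,1,0],
                  [0,0,0,1,1,1,1]])
    H = np.array([[0,1,1,1,1,0,0],      # parity-check matrix [P^T | I_3]
                  [1,0,1,1,0,1,0],
                  [1,1,0,1,0,0,1]])

    msg = np.array([1, 0, 1, 1])
    codeword = msg @ G % 2

    received = codeword.copy()
    received[2] ^= 1                    # simulate a single bit flip
    syndrome = H @ received % 2         # equals the column of H at the error
    err = next(i for i in range(7) if np.array_equal(H[:, i], syndrome))
    received[err] ^= 1
    assert np.array_equal(received, codeword)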
Lecture 11 -- Reed-Solomon codes, graph entropy
Scribe [pdf]
References:
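Reed-Solomon encoding in its evaluation form (a toy sketch over a small prime field; practical implementations typically work over GF(2^m)): a k-symbol message defines a degree-(k-1) polynomial evaluated at n distinct points, giving minimum distance n - k + 1:

    p = 257                             # prime field size; needs p >= n

    def rs_encode(msg, n):
        # Evaluate the message polynomial at points 0, 1, ..., n-1 mod p.
        return [sum(c * pow(x, i, p) for i, c in enumerate(msg)) % p
                for x in range(n)]

    print(rs_encode([5, 17, 42], 7))    # k=3, n=7: tolerates 2 symbol errors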
Lecture 12 -- Graph entropy and properties
Scribe [pdf]
References:
Lecture 13 -- Formula lower bounds using graph entropy
Scribe [pdf]
References:
Lecture 14 -- Deterministic communication complexity
Scribe [pdf]
References:
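One classical deterministic lower bound is the log-rank bound, D(f) >= log2 rank(M_f); for EQUALITY the communication matrix is the identity, immediately giving D(EQ) >= n. A tiny check (illustrative):

    import numpy as np

    n = 4
    M = np.eye(2**n)                    # M[x][y] = 1 iff x == y
    print(np.linalg.matrix_rank(M))     # 16 = 2^n, so D(EQ) >= n bits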
Lecture 15 -- Randomized communication complexity
Scribe [pdf]
References:
Lecture 16 -- Distributional communication complexity
Scribe [pdf]
References:
Lecture 17 -- Communication complexity lower bounds using the discrepancy method and information theory
Scribe [pdf]
References:
Lecture 18 -- Information complexity of communication protocols
Scribe [pdf]
References:
Lecture 19 -- Direct sum
Scribe [pdf]
Lecture 20 -- Communication complexity of set disjointness
References: https://www.cs.princeton.edu/courses/archive/fall11/cos597D/L17.pdf
Lectures 21 and 22 -- Streaming lower bounds for frequency estimation
Lecture 23 -- Adaptive data analysis
References: http://people.seas.harvard.edu/~madhusudan/courses/Spring2016/notes/thomas-notes-ada.pdf
Additional Papers and Other Material
"Elements of Information Theory" book by T. M. Cover and J. A. Thomas
References here (minicourse by Mike Saks)
Information theory boot camp (Simons Institute, Berkeley)
Information theory in complexity theory and combinatorics (Workshop at Simons Institute, Berkeley)
Other related courses
Information theory in computer science (Mark Braverman at Princeton)
Information theory and its applications in theory of computation (Venkatesan Guruswami and Mahdi Cheraghchi at CMU)
Information theory in computer science (Madhu Sudan at Harvard)
Communication complexity (Prahladh Harsha, Jaikumar Radhakrishnan and Meena Mahajan at TIFR/IMSc)
Information and coding theory (Madhur Tulsiani and Shi Li at TTIC)
Information theory and applications (Ashwin Nayak at UWaterloo)