CS540: Information-theoretic Methods in Complexity Theory (Fall 2023)
Instructor: Sumegha Garg (email: sumegha.garg@rutgers.edu)
Schedule: Wed 12:10 - 1:30 pm and Fri 2:00 - 3:20 pm at HLL-116 (Busch Campus)
Office Hours: Thursday 4-5 pm at CoRE 318A
Syllabus: The full course syllabus is here; this page will be updated with the current syllabus as the course progresses.
Prerequisites: Undergraduate courses on algorithms, complexity theory, discrete mathematics and probability; mathematical maturity. Familiarity with topics in graduate courses on computability theory and algorithms (like CS 509 and 513) is strongly encouraged.
Course Overview
The goal of the course is to demonstrate the usefulness of information-theoretic concepts and techniques in various sub-fields of complexity theory. We will begin with fundamental concepts in information theory, such as Shannon entropy, mutual information, and KL divergence, and then explore lower-bounding techniques that use information theory. Applications of these techniques will span several areas, including circuit lower bounds, communication complexity, streaming and data-structure lower bounds, and learning theory.
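As a quick numerical sketch of the quantities named above (illustrative only, not course material; the joint distribution below is made up), one can check the identity I(X;Y) = H(X) + H(Y) - H(X,Y) and its equivalent form I(X;Y) = D(p_XY || p_X x p_Y):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a distribution given as a list."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl(p, q):
    """KL divergence D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# A made-up joint distribution of (X, Y) on {0,1} x {0,1}.
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = [0.5, 0.5]  # marginal of X
py = [0.5, 0.5]  # marginal of Y

# I(X;Y) = H(X) + H(Y) - H(X,Y)
mi = entropy(px) + entropy(py) - entropy(list(joint.values()))

# Equivalently, I(X;Y) = D(p_XY || p_X x p_Y)
prod = [px[x] * py[y] for (x, y) in joint]
mi_kl = kl(list(joint.values()), prod)
```

Both computations give the same value (about 0.278 bits here), which is one of the basic identities covered early in the course.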
Grading
Problem sets (30%): 2-3 sets over the semester
Scribe notes (30%)
Final Project (40%): report + presentation
Lecture Notes
Lecture 1 -- Plans for the course; motivations for the entropy function.
Scribe [pdf]
References:
Lecture 2 -- Conditional entropy, mutual information and KL divergence.
Scribe [pdf]
References:
Lecture 3 -- Uses of information theory in counting and statistics.
Scribe [pdf]
References:
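As an illustration of the counting theme in Lecture 3 (a sketch, not part of the course notes), the standard entropy bound on binomial coefficients, C(n,k) <= 2^{n H(k/n)} where H is the binary entropy function, can be checked numerically:

```python
import math

def binary_entropy(p):
    """Binary entropy H(p) in bits, with the convention H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

n = 100
for k in range(n + 1):
    # Entropy bound on binomial coefficients: C(n, k) <= 2^{n H(k/n)}
    assert math.comb(n, k) <= 2 ** (n * binary_entropy(k / n))
```

The bound is tight at k = 0 and k = n, and within a polynomial factor of C(n,k) elsewhere; it is a typical example of entropy arguments replacing direct counting.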
Lecture 4 -- Chernoff bounds, data compression.
Scribe [pdf]
References:
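A numerical sketch related to the Lecture 4 topics (illustrative, not course material): the Chernoff bound for the binomial tail can be stated in KL form as Pr[Bin(n,p) >= an] <= 2^{-n D(a || p)} for a > p, where D is the binary KL divergence in bits. A brute-force check against the exact tail:

```python
import math

def d_kl(a, p):
    """Binary KL divergence D(a || p) in bits; assumes 0 < p < 1."""
    total = 0.0
    if a > 0:
        total += a * math.log2(a / p)
    if a < 1:
        total += (1 - a) * math.log2((1 - a) / (1 - p))
    return total

def binom_tail(n, p, k):
    """Exact Pr[Bin(n, p) >= k]."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n, p = 200, 0.5
for k in range(101, 201):  # thresholds strictly above the mean
    a = k / n
    # Chernoff bound in KL form: Pr[Bin(n,p) >= an] <= 2^{-n D(a || p)}
    assert binom_tail(n, p, k) <= 2 ** (-n * d_kl(a, p))
```

The exponent n*D(a || p) is the information-theoretically correct large-deviation rate, and the bound holds with equality at a = 1.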
Additional Papers and Other Material
"Elements of Information Theory" book by T. M. Cover and J. A. Thomas
References here (minicourse by Mike Saks)
Information theory boot camp (Simons Institute, Berkeley)
Information theory in complexity theory and combinatorics (Workshop at Simons Institute, Berkeley)
Other related courses
Information theory in computer science (Mark Braverman at Princeton)
Information theory and its applications in theory of computation (Venkatesan Guruswami and Mahdi Cheraghchi at CMU)
Information theory in computer science (Madhu Sudan at Harvard)
Communication complexity (Prahladh Harsha, Jaikumar Radhakrishnan and Meena Mahajan at TIFR/IMSc)
Information and coding theory (Madhur Tulsiani and Shi Li at TTIC)
Information theory and applications (Ashwin Nayak at UWaterloo)