CS540: Information-theoretic Methods in Complexity Theory (Fall 2023)

Instructor: Sumegha Garg (email: sumegha.garg@rutgers.edu)

Schedule: Wed 12:10 - 1:30 pm and Fri 2:00 - 3:20 pm at HLL-116 (Busch Campus)

Office Hours: Thursday 4-5 pm at CoRE 318A

Syllabus: The full course syllabus is here. This page will be updated with the syllabus as the course progresses.

Prerequisites: Undergraduate courses on algorithms, complexity theory, discrete mathematics, and probability; mathematical maturity. Familiarity with topics from graduate courses on computability theory and algorithms (such as CS 509 and CS 513) is strongly encouraged.

Course Overview 

The goal of the course is to emphasize the usefulness of information-theoretic concepts and techniques in various subfields of complexity theory. The course will begin by covering fundamental concepts in information theory, such as Shannon entropy, mutual information, and KL divergence. We will then explore lower-bound techniques that build on information theory, with applications spanning several areas, including circuit lower bounds, communication complexity, streaming and data structure lower bounds, and learning theory.
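
As a quick, informal illustration of the basic quantities mentioned above (not part of the official course material), here is a minimal Python sketch that computes Shannon entropy, KL divergence, and mutual information for small discrete distributions. The function names and the example distributions are purely illustrative.

```python
# Illustrative sketch: Shannon entropy, KL divergence, and mutual information
# for small discrete distributions (probabilities given as plain lists).
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """KL divergence D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

def mutual_information(joint):
    """Mutual information I(X;Y) in bits, given a joint distribution joint[x][y]."""
    px = [sum(row) for row in joint]          # marginal of X
    py = [sum(col) for col in zip(*joint)]    # marginal of Y
    # I(X;Y) = D( p(x,y) || p(x)p(y) )
    return sum(
        joint[i][j] * math.log2(joint[i][j] / (px[i] * py[j]))
        for i in range(len(joint))
        for j in range(len(joint[i]))
        if joint[i][j] > 0
    )

if __name__ == "__main__":
    print(entropy([0.5, 0.5]))                            # 1.0 bit: a fair coin
    print(kl_divergence([0.5, 0.5], [0.9, 0.1]))          # > 0: distributions differ
    print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0 bit: X determines Y
```

Here mutual information is computed as the KL divergence between the joint distribution and the product of its marginals, which is one of the standard equivalent definitions.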


Grading

Lecture Notes


Additional Papers and Other Material

Other related courses

Information theory in computer science (Mark Braverman at Princeton)

Information theory and its applications in theory of computation (Venkatesan Guruswami and Mahdi Cheraghchi at CMU)

Information theory in computer science (Madhu Sudan at Harvard)

Communication complexity (Prahladh Harsha, Jaikumar Radhakrishnan and Meena Mahajan at TIFR/IMSc)

Information and coding theory (Madhur Tulsiani and Shi Li at TTIC)

Information theory and applications (Ashwin Nayak at UWaterloo)