Instructor: Sumegha Garg (email: sumegha.garg@rutgers.edu)
Schedule: Tuesdays 10:20 am-1:20 pm at Beck Hall 252 (Livingston Campus)
TA: Naina Chaturvedi (email: nc832@cs.rutgers.edu)
Office hours: TBD
Prerequisites: Undergraduate courses on algorithms, complexity theory, discrete mathematics, and probability; mathematical maturity. Graduate courses on computability theory and algorithms (such as CS 509 and CS 513) are strongly encouraged.
Course Overview
This is a research-oriented graduate course focused on learning theory under memory, communication, and information constraints. The course will begin with fundamental concepts, including PAC learning, online learning, and combinatorial dimensions. After the initial lectures, the course will transition to a seminar format: each week, students will read research papers, with one student presenting the key results to the class. A key learning outcome of the course is the ability to read, understand, and present recent technical advances on the limits of learning under computational constraints.
Grading components
Attendance
Paper presentation and participation
Homeworks and midterm
Final project
Weekly lectures and reading materials
Lecture 1 -- Overview, PAC learning, and generalization for finite hypothesis classes
Lecture 2 -- Uniform convergence, agnostic learning, and the VC bound (the flagship bounds from these two lectures are sketched below)
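
As a taste of the first two lectures, here is a sketch of the two flagship sample-complexity bounds, in standard notation (not necessarily the notation that will be used in class). For a finite hypothesis class H in the realizable PAC setting, with probability at least 1 - \delta every hypothesis consistent with an i.i.d. sample has true error at most \epsilon once the sample size satisfies

\[
  m \;\ge\; \frac{1}{\epsilon}\left(\ln|H| + \ln\frac{1}{\delta}\right).
\]

For infinite classes, uniform convergence via the VC dimension d = \mathrm{VCdim}(H) gives, with probability at least 1 - \delta, simultaneously for all h \in H,

\[
  \bigl|\mathrm{err}_D(h) - \mathrm{err}_S(h)\bigr| \;\le\; O\!\left(\sqrt{\frac{d\,\ln(m/d) + \ln(1/\delta)}{m}}\right),
\]

where \mathrm{err}_D and \mathrm{err}_S denote the true and empirical errors. The second bound is what makes agnostic learning via empirical risk minimization possible.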
Additional resources
Understanding Machine Learning: From Theory to Algorithms by Shai Shalev-Shwartz and Shai Ben-David (a free PDF is available online).
Other related courses
Foundations of Machine Learning and Data Science (by Nina Balcan and Avrim Blum at CMU)
Theory of Machine Learning (by Vatsal Sharan at USC)
Foundations of Modern Machine Learning (by Nika Haghtalab at Cornell)