ICML 2024 Workshop on Theoretical Foundations of Foundation Models

Schedule (Vienna Time)

An alternative version of this schedule and a livestream are available on the conference website (registration required).

09:00-09:05 Opening Remarks by Organizers

09:05-09:35 Invited Talk (Yuandong Tian, Meta AI): Understanding Foundation Models via the Lens of Training Dynamics

09:35-10:05 Invited Talk (Jason Lee, Princeton): Learning Representations and Associations with Gradient Descent

10:05-10:15 Contributed Talk: Unlocking Tokens as Data Points for Generalization Bounds on Larger Language Models

10:15-10:25 Contributed Talk: Fundamental Limits of Prompt Compression: A Rate-Distortion Framework for Black-Box Language Models

10:25-10:35 Short Break

10:35-11:05 Invited Talk (Dan Alistarh, IST Austria): Model Compression at GPT Scale by Estimating Second-Order Information

11:05-11:35 Invited Talk (Ananda Theertha Suresh, Google Research): Accelerating language model inference using optimal transport: Theory and algorithms

11:35-12:35 Poster Session 1

12:35-14:00 Lunch Break

14:00-15:00 Poster Session 2

15:00-15:30 Invited Talk (Kamalika Chaudhuri, UCSD): Theoretical Foundations of Memorization in Foundation Models

15:30-16:00 Coffee Break

16:00-16:10 Contributed Talk: Transformers are Minimax Optimal Nonparametric In-Context Learners

16:10-16:20 Contributed Talk: Models That Prove Their Own Correctness

16:20-17:00 Panel Discussion (Ananda Theertha Suresh, Furong Huang, Reza Shokri, Subbarao Kambhampati, Tom Goldstein, and Yuandong Tian)

17:00-17:05 Concluding Remarks and Awards


Accepted Papers

Please see the list of accepted papers, which features 58 poster presentations and 4 oral presentations.
