The Schedule
Opening Remarks [10:00]
Tutorial: Sparsity in Neural Networks by Torsten Hoefler [slides] [10:10-11:55]
Session I: Algorithms for Sparsity (Chair: Jonathan Frankle)
[12:00] Invited Talk 1: Friedemann Zenke [slides]
[12:30] Invited Talk 2: Sara Hooker [slides]
[13:00] Spotlight: Doping: A Technique for Extreme Compression of LSTM Models using Sparse Structured Additive Matrices
[13:10] Spotlight: Powerpropagation: A Sparsity Inducing Weight Reparameterisation
[13:20] Spotlight: Chasing Sparsity in Vision Transformers: An End-to-End Exploration
[13:30] Panel
Moderator: Jonathan Frankle
Panelists: Torsten Hoefler, Friedemann Zenke, Sara Hooker, Rosanne Liu, Jose Javier Gonzalez Ortiz
Poster Session I [14:10-15:30]
Session II: Software and Hardware for Accelerating Sparsity (Chair: Trevor Gale)
[10:00] Invited Talk 1: Diana Marculescu [slides]
[10:30] Invited Talk 2: Paulius Micikevicius [slides]
[11:00] Spotlight: Multiplying Matrices Without Multiplying
[11:10] Spotlight: Channel Permutations for N:M Sparsity
[11:20] Panel
Moderator: Trevor Gale
Panelists: Natalia Vassilieva, Paulius Micikevicius, Selima Curci, Cliff Young
Poster Session II [12:00-13:00]
Session III: Sparsity as a Tool for Understanding Deep Learning (Chair: Ari Morcos)
[13:00] Invited Talk 1: Anna Golubeva [slides]
[13:30] Invited Talk 2: Gintare Karolina Dziugaite [slides]
[14:00] Spotlight: Model-Invariant State Abstractions for Model-Based Reinforcement Learning
[14:10] Spotlight: Neural Network Pruning via Rate-Distortion Theory
[14:20] Panel
Moderator: Ari Morcos
Panelists: Anna Golubeva, Gintare Karolina Dziugaite, Shashank Rajput, Mitchell Wortsman
Closing Remarks [15:00-15:10]