Most of my work relates to neural parameter symmetries: transformations that can be applied to a neural network's weights without changing the function the network represents.
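As a minimal illustration (my own sketch, not code from any paper below), the classic example is permutation symmetry in an MLP: reordering the hidden neurons, by permuting the rows of the first layer's weights and biases and the columns of the second layer's weights, yields different parameters that compute exactly the same function.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small two-layer ReLU MLP: 8 inputs -> 16 hidden -> 4 outputs.
W1, b1 = rng.normal(size=(16, 8)), rng.normal(size=16)
W2, b2 = rng.normal(size=(4, 16)), rng.normal(size=4)

def mlp(x, W1, b1, W2, b2):
    return W2 @ np.maximum(W1 @ x + b1, 0) + b2

# Apply a random permutation to the hidden units: permute rows of
# (W1, b1) and the matching columns of W2.
P = rng.permutation(16)
x = rng.normal(size=8)
out = mlp(x, W1, b1, W2, b2)
out_perm = mlp(x, W1[P], b1[P], W2[:, P], b2)

# The permuted network computes the same function.
assert np.allclose(out, out_perm)
```

The same idea extends to other symmetries, such as scaling symmetries of ReLU units or, for low-rank weight spaces, the GL actions studied in the papers below.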
* denotes equal contribution
Papers
Derek Lim*, Theo (Moe) Putterman*, Robin Walters, Haggai Maron, Stefanie Jegelka
Accepted to NeurIPS 2024; won the Best Paper Award at the HiLD Workshop at ICML 2024
GL Equivariant Metanetworks for Learning on Low Rank Weight Spaces
Theo (Moe) Putterman*, Derek Lim*, Yoav Gelberg, Michael Bronstein, Stefanie Jegelka, Haggai Maron
Oral Presentation at Learning on Graphs (LoG) 2025
GradMetaNet: An Equivariant Architecture for Learning on Gradients
Yoav Gelberg*, Yam Eitan*, Aviv Navon, Aviv Shamsian, Theo Putterman, Haggai Maron
Accepted to NeurIPS 2025