3rd Workshop on High-dimensional Learning Dynamics (HiLD)

TBD, ICML 2025

Vancouver, BC, Canada

Description

The unprecedented scale and complexity of modern neural networks have revealed emergent patterns in learning dynamics and scaling behaviors. Recent advances in analyzing high-dimensional systems have uncovered fundamental relationships between model size, data requirements, and computational resources, while highlighting the intricate nature of optimization landscapes. This understanding has led to deeper insights into architecture design, regularization, and the principles governing neural learning at scale. The HiLD workshop seeks to spur research and collaboration around these themes.

This year, the theme of the 3rd Workshop on High-dimensional Learning Dynamics is Navigating Complexity: Feature Learning Dynamics at Scale. We aim to foster discussion, discovery, and dissemination of state-of-the-art research in high-dimensional learning dynamics relevant to ML, bringing together experts from all parts of the field, from theorists to empirical scientists. The workshop seeks to create synergies between these two groups, which often do not interact. Through a series of talks, it will tackle questions on high-dimensionality in ML.

Topics

Including, but not limited to:

Stochastic Differential Equations (SDEs) & ML (image credit: Martin Barlow)
Loss Landscapes of Neural Networks (image credit: https://losslandscape.com/gallery/)
Random Matrix Theory (RMT) & ML (image credit: Yuichiro Chino)
Dynamical Systems related to ML

Notification Timeline & Deadline

(*) If you face severe difficulties meeting this deadline, please contact us in advance.

Papers will be submitted through OpenReview and are limited to 5 pages, plus supplementary materials.

All submissions must be anonymized and must not contain any identifying information that would violate the double-blind reviewing policy.


For accepted workshop posters, please adhere to the following:
