2nd Workshop on High-dimensional Learning Dynamics (HiLD):
The Emergence of Structure and Reasoning
ICML 2024
Friday, July 26, 2024
Vienna, Austria
Description
Modeling learning dynamics has long been a goal of the empirical science and theory communities in deep learning. These communities have grown rapidly in recent years, as our newly expanded understanding of the latent structures and capabilities of large models permits researchers to study these phenomena through the lens of the training process. Recent progress in understanding fully trained models can therefore shed light on how those models develop during training, and can lead to insights that improve optimizer and architecture design, provide model interpretations, inform evaluation, and generally enhance the science of neural networks and their priors. The HiLD workshop seeks to spur research and collaboration around:
Developing analyzable models and dynamics to explain observed deep neural network phenomena;
Competition and dependencies among structures and heuristics, e.g., simplicity bias or learning staircase functions;
Creating mathematical frameworks for scaling limits of neural network dynamics as width and depth grow;
Provably explaining the roles of the optimization algorithm, hyperparameter choices, and neural network architecture in training/test dynamics;
Relating optimizer design and loss landscape geometry to implicit regularization, inductive bias, and generalization;
High-dimensionality, where intuitions from low-dimensional geometry tend to lead to inaccurate (and often misleading) conclusions about how machine learning models behave on large real-world datasets (a short worked illustration follows this list);
Connecting model architectures and data distributions to generalization, memorization, and forgetting.
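To make the high-dimensionality point concrete, here is a minimal worked illustration (standard concentration of measure, assuming only that x and y are drawn independently and uniformly from the unit sphere S^{d-1}; it is not tied to any specific workshop topic): in high dimension, two random directions are almost always nearly orthogonal, which already contradicts the low-dimensional picture of typical angles.

% Minimal sketch; assumption: x, y ~ Unif(S^{d-1}), independent.
\[
  \mathbb{E}\big[\langle x, y \rangle\big] = 0,
  \qquad
  \mathbb{E}\big[\langle x, y \rangle^{2}\big] = \frac{1}{d},
  \qquad
  \Pr\big[\,|\langle x, y \rangle| > t\,\big] \le 2\, e^{-c\, d\, t^{2}}
\]
% for an absolute constant c > 0. Hence |<x, y>| = O(1/sqrt(d)) with high
% probability: almost every pair of random directions in R^d is nearly
% orthogonal once d is large, unlike in two or three dimensions.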
This year, the theme of the 2nd Workshop on High-dimensional Learning Dynamics is The Emergence of Structure and Reasoning. We aim to foster discussion, discovery, and dissemination of state-of-the-art research in high-dimensional learning dynamics relevant to ML, and to bring together experts from all parts of ML, from theorists to empirical scientists. The workshop seeks to create synergies between these two groups, which often do not interact. Through a series of talks, the workshop will tackle questions on high-dimensionality in ML.
Topics
These include, but are not limited to:
The emergence of interpretable behaviors (e.g., circuit mechanisms) and capabilities (e.g., compositionality and reasoning)
Work that adapts tools from stochastic differential equations, high-dimensional probability, random matrix theory, and other theoretical frameworks to understand learning dynamics and phase transitions (a standard example of the SDE lens follows this list)
Scaling laws related to internal structures and functional differences
Competition and dependencies among structures and heuristics, e.g., simplicity bias or learning staircase functions
Relating optimizer design and loss landscape geometry to implicit regularization, inductive bias, and generalization
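As a concrete instance of the SDE lens named above (a standard modeling device from the literature, sketched here under the usual small-learning-rate assumption, not a framework submissions must use), minibatch SGD is often approximated by an Itô diffusion whose drift is the negative loss gradient and whose noise covariance comes from minibatch sampling:

% Sketch; assumptions: loss L, learning rate eta, minibatch gradient g,
% gradient-noise covariance Sigma, standard Brownian motion B_t.
\[
  \theta_{k+1} = \theta_k - \eta\, g(\theta_k)
  \quad\longrightarrow\quad
  d\theta_t = -\nabla L(\theta_t)\, dt
            + \sqrt{\eta}\; \Sigma(\theta_t)^{1/2}\, dB_t
\]
% As eta -> 0 the noise term vanishes and the diffusion reduces to
% gradient flow; at finite eta the sqrt(eta)-scaled noise captures how
% learning-rate and batch-size choices shape the training dynamics.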
Timeline & Deadlines
Deadline for submission of papers: May 27, 2024, Anywhere on Earth (AoE) (*)
Notification of acceptance: June 17, 2024
Camera-ready papers: July 19, 2024 (High-dimensional Learning Dynamics (HiLD) style file required) (**)
Workshop date: July 26, 2024
(*) If you face severe difficulties meeting this deadline, please contact us before the deadline.
Papers must be submitted through OpenReview and are limited to 5 pages, plus supplementary materials.
All submissions must be anonymized and must not contain identifying information that would violate the double-blind reviewing policy.
For accepted workshop posters, please adhere to the following:
Dimensions: 36 in (H) × 24 in (W) or 91 cm (H) × 61 cm (W); note that this differs from the Main Conference
Portrait format
Information, as well as printing services provided by ICML, can be found here: https://icml.cc/Conferences/2024/PosterInstructions
[Page images: Stochastic Differential Equations (SDEs) & ML (credit: Martin Barlow); Loss Landscapes of Neural Networks; Random Matrix Theory (RMT) & ML (credit: Yuichiro Chino)]