NeurIPS 2024 Workshop
Fine-Tuning in Modern Machine Learning: Principles and Scalability (FITML)
Dec. 14th (Saturday), East Exhibit Hall A
The FITML workshop aims to contribute to the recent radical paradigm shift in fine-tuning for modern machine learning from theoretical, computational, and systems perspectives.
It encourages researchers to push forward the frontiers of theoretical understanding of fine-tuning and to devise fast, resource-efficient inference and fine-tuning methods for machine learning systems, enabling their deployment under constrained computational resources.
The workshop welcomes theoretical and empirical results that deepen the understanding of, and advance, modern practices for efficiency in machine learning.
large-scale kernel approximation
deep learning theory
neural architecture design
deep learning
large-scale machine learning
algorithm-hardware design
scalable machine learning
natural language processing
LLMs
in-context learning
transfer learning
trustworthy ML
non-convex optimization