NeurIPS 2024 Workshop

Fine-Tuning in Modern Machine Learning: Principles and Scalability (FITML)

Dec. 14th (Saturday), East Exhibit Hall A

This FITML workshop aims to contribute to the recent, radical paradigm shift toward fine-tuning in modern machine learning, from theoretical, computational, and systems perspectives.

It encourages researchers to push forward the frontiers of theoretical understanding of fine-tuning, and to devise fast, resource-efficient fine-tuning and inference methods for machine learning systems, enabling their deployment within constrained computational resources.

This FITML workshop explores theoretical and/or empirical results for understanding and advancing modern practices for efficiency in machine learning.

Invited Speakers

Panelist

Important Dates

Submission Deadline: October 1, 2024, GMT

Author notification: October 9, 2024, GMT

Workshop date: December 14 (Saturday), 2024

Workshop location: East Exhibit Hall A 

Organizers

Fanghui Liu (Warwick)

large-scale kernel approximation, deep learning theory

Grigorios Chrysos (UW-Madison)

neural architecture design, deep learning

Beidi Chen (CMU)

large-scale machine learning, algorithm-hardware design

Rebekka Burkholz (CISPA)

scalable machine learning, deep learning theory

Saleh Soltan (Amazon)

natural language processing, LLMs

Angeliki Giannou (UW-Madison)

deep learning theory, in-context learning

Masashi Sugiyama (RIKEN, UTokyo)

transfer learning, trustworthy ML

Volkan Cevher (EPFL)

non-convex optimization, scalable machine learning

Volunteers

Yongtao Wu (EPFL)

trustworthy ML

Yuanhe Zhang (Warwick)

deep learning theory


Let us know if you'll be attending!