Efficient Deep Learning for Foundation Models
ECCV 2024 Workshop
Half-Day Workshop (Afternoon), Sep 29th, 2024
MiCo Milano, Italy
Workshop Overview
This year has seen a sharp rise in foundation models, which are delivering enormous practical impact across society with remarkable capabilities over a wide range of input modalities (e.g., the GPT-4, Gemini, and LLaMA(-2) families, Megatron, Picasso, DALL·E, Imagen, SAM, Bard, PaLM & PaLM-E, Flamingo, CLIP, etc.). In a new regime where training even a single foundation model demands enormous amounts of data and hardware (e.g., thousands of GPUs), efficient deep learning has become unprecedentedly critical to the successful development and deployment of foundation models in both academia and industry.
With tens of billions of parameters, foundation models inherently demand (i) enormous amounts of clean, properly labeled training data, (ii) a highly efficient and scalable training pipeline, (iii) an efficient yet powerful neural network architecture, and (iv) fast inference that meets stringent practical requirements for ubiquitous deployment. For cloud services such as GPT-backed applications, speeding up inference directly translates into far shorter waiting times for billions of users worldwide.
In this EFM workshop, we aim to offer our audience the most recent insights across the entire development stack, lowering the barrier to building and deploying large foundation models.
Our topics include but are not limited to the following:
Novel neural architecture designs for foundation models across modalities and tasks
Resource-efficient training/fine-tuning from both data and algorithm perspectives
New methods to improve efficiency in distributed training and inference of large models
Large model compression techniques such as quantization, architecture search, pruning, adaptation, distillation
Hardware-software co-optimization of large foundation models
Efficient and effective benchmarking efforts for large foundation models
Program Summary
This workshop will feature invited talks, presentations of selected papers, and a panel discussion. See the program section for details.
Call for Papers
Important dates
Submission deadlines:
Abstract Deadline: July 26, 11:59PM, Pacific Time (extended to July 30, 11:59PM, Pacific Time)
Paper Deadline: August 2, 11:59PM, Pacific Time
Notification to authors: August 16, 11:59PM, Pacific Time
Camera ready deadline (hard deadline): August 23, 11:59PM, Pacific Time
Submission instructions
We are following the ECCV paper format: https://eccv2024.ecva.net/Conferences/2024/AuthorGuidelines
LaTeX/Word Templates: ECCV 2024 Paper Template
We only accept short papers:
Paper Length: Papers must not exceed 8 pages, excluding references. All other requirements follow the ECCV paper style guide. Papers should present mature work only: a submission should not only describe novel ideas but also include complete experiments and analyses supporting them.
Note that the page limit includes figures and tables, per the ECCV style. Additional pages containing only cited references are allowed. Papers exceeding 8 pages (excluding references) will be rejected without review.
Papers should be submitted using the templates above, at the length intended for final publication.
Blind review: This workshop adopts double-blind review. Submitted papers and supplementary materials must not reveal the authors' identities.
Dual submission: We do not accept submissions that have already been published (including at the ECCV main conference) or are under review at other conferences or workshops. Accepted papers are expected to appear in the ECCV workshop proceedings.
Submission website
Submission site: https://cmt3.research.microsoft.com/EFM2024