Architecture and System Support for Transformer Models
Call for Papers
Transformer models have become the foundation of a new wave of machine learning models. Applications of these models span natural language understanding, image processing, protein folding, and many more domains. The main objective of this workshop is to draw our community's attention to the upcoming architecture and system challenges posed by these foundation models, and to drive innovation in supporting the efficient execution of these ever-scaling models. To this end, the workshop will consist of keynote speeches and short talks, followed by a panel discussion. Subject areas of the workshop include (but are not limited to):
System and architecture support of transformer models at scale
Distributed training and infrastructure support
Efficient model compression techniques (e.g., quantization, sparsity)
Chiplet architecture and system support for transformer models
Efficient and sustainable training and serving
Real system evaluation of hardware and system
Benchmarking and evaluation of transformer models
System and architecture support for Mixture-of-Experts (MoE) models
Ethical accelerator and system design for AGI
Submission Instructions
We welcome submissions of up to 4 pages (not including references). This is not a strict limit, but authors are encouraged to adhere to it where possible.
All submissions must be in PDF format and should follow the ISCA'23 LaTeX template.
Please follow the guidelines provided at ISCA 2023 Paper Submission Guidelines.
Please submit your paper via OpenReview. While the review process is not public, we will make accepted papers and their reviews public after the notification deadline.
Reviewing will be double-blind: please do not include any author names in any submitted documents except in the space provided on the submission form.
Organizing Committee
Bahar Asgari (UMD)
Tushar Krishna (GaTech)
Yingyan (Celine) Lin (GaTech)
Mohammad Shoeybi (Nvidia)
Suvinay Subramanian (Google)
Amir Yazdanbakhsh (Google Research, Brain Team)
Full Paper Submission Deadline: April 21st, 2023, 11:59 AoE (OpenReview)
Please select ASSYST for the Workshop Track.
Paper Notification (Tentative): May 7th, 2023.
Workshop: June 17th, 2023
Contact us at archsystm@gmail.com