Current deep neural networks have a dirty secret: they usually require heavy-duty, power-hungry processors and large datasets to train and run. Training OpenAI’s GPT-3 was estimated to emit more than 500 metric tons of carbon dioxide, about as much as 110 US cars emit in a year, according to a research paper from Hugging Face.
While models keep getting larger in many cases, the rise of the Internet of Things (IoT) is shifting the future of deep learning from the cloud toward the edge, onto small embedded devices. Now is the right time to make models more efficient, so that they can even be deployed on embedded devices with limited computing power and memory. Running deep learning models on resource-constrained devices such as microcontrollers, for a year or more on a single coin-cell battery, is very demanding, especially for real-time image and speech applications such as image recognition, segmentation, object localisation, multi-channel speech enhancement, and speech recognition. At the same time, a great deal of attention has been paid to efficient hardware implementation and system design that incorporates artificial intelligence.
In this track, we aim to conduct projects focussing on:
measuring & assessing the environmental impact of developing and running AI models
making AI systems more sustainable: smaller models that use less data to train and are easy to develop and deploy
addressing AI for sustainable applications
enabling embedded systems for sustainable AI
At the same time, sustainable AI models should remain robust to changes, highly accurate, and understandable.
Topics of particular interest include, but are not limited to:
Sustainable Artificial Intelligence
Compression of neural networks for inference deployment, including methods for quantization (including binarization), pruning, knowledge distillation, structural efficiency and neural architecture search
Learning on edge devices, including federated and continuous learning
Learning from limited labelled data, including methods for self-supervised learning, contrastive learning, transfer learning, unsupervised learning, generative models, and latent representation learning
Exploring new ML models designed to run on dedicated device hardware
New benchmarks suited to the environmental impact of developing and running AI models
New and emerging AI applications for sustainability
Interpretable and Explainable AI Applications toward the sustainable development goals
Embedded Systems for Sustainable Artificial Intelligence
Future emerging processors and technologies for use in resource-constrained environments
Hardware security/privacy
Low-power wireless systems
Energy management and smart grids
Network on a chip
Energy harvesting
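To make one of the compression topics above concrete, the following minimal sketch shows symmetric per-tensor 8-bit post-training quantization, one of the simplest compression methods listed. The helper names (`quantize_int8`, `dequantize`) are hypothetical, and real toolchains (e.g. TensorFlow Lite) use more elaborate schemes; this is only an illustration of the core idea under those assumptions.

```python
import numpy as np

def quantize_int8(w):
    """Symmetric per-tensor 8-bit quantization (hypothetical helper).

    Maps float weights to int8 and returns the scale needed to
    approximately recover them.
    """
    scale = np.max(np.abs(w)) / 127.0  # map the largest magnitude to 127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from int8 values."""
    return q.astype(np.float32) * scale

# Example: quantizing a small random weight matrix
rng = np.random.default_rng(0)
w = rng.standard_normal((4, 4)).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)
max_err = np.max(np.abs(w - w_hat))  # bounded by scale / 2
```

Storing `q` instead of `w` cuts the memory footprint by 4x (int8 vs float32), which is exactly the kind of trade-off between model size and accuracy that projects in this track would measure.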
Sustainable AI: AI for Sustainability and the Sustainability of AI
Some possibly interesting assignments can be found on the Pervasive Systems Education page. Examples include:
TBA
For further information on the content of this track, you may contact the track chair: Le Viet Duc.