Hands-On:
Neural Architecture Search with NASLib

Abstract

NASLib is a modular and flexible framework developed by the AutoML Freiburg group. It was created to provide the community with a common codebase that facilitates research on Neural Architecture Search (NAS). It offers an interface to common NAS benchmarks, supports the evaluation of various optimizers and performance predictors, and makes it easy to design new components and search spaces.
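
To illustrate this workflow, below is a minimal sketch of a NASLib run in the spirit of the project's README: pick a search space, attach an optimizer, and let a trainer drive search and evaluation. The import paths and signatures shown (Trainer, RegularizedEvolution, NasBench201SearchSpace, get_config_from_args) are assumptions and may differ between NASLib versions.

    # Minimal NASLib run sketch (assumed API; paths/signatures may vary by version).
    from naslib.defaults.trainer import Trainer
    from naslib.optimizers import RegularizedEvolution
    from naslib.search_spaces import NasBench201SearchSpace
    from naslib import utils

    config = utils.get_config_from_args()        # search settings from CLI/config file
    search_space = NasBench201SearchSpace()      # search space backed by a NAS benchmark

    optimizer = RegularizedEvolution(config)     # any NASLib optimizer can be used here
    optimizer.adapt_search_space(search_space)   # wire the optimizer to the search space

    trainer = Trainer(optimizer, config)
    trainer.search()                             # run the architecture search
    trainer.evaluate()                           # evaluate the best found architecture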

In this talk, I will present selected features of NASLib, focusing on zero-cost proxies and performance prediction. Participants will learn how to add new predictors, and will explore how well zero-cost proxies transfer between tasks and how they behave on objectives other than predictive accuracy.
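
To give a flavor of what adding a predictor involves, here is a hypothetical sketch of a custom performance predictor. It assumes NASLib exposes a Predictor base class with fit/query hooks (the exact interface and import path may differ between versions) and fits a simple ridge regression on numeric architecture encodings.

    # Hypothetical custom predictor sketch; the Predictor base-class interface
    # and its import path are assumptions and may differ between NASLib versions.
    import numpy as np
    from sklearn.linear_model import Ridge

    from naslib.predictors.predictor import Predictor

    class RidgePredictor(Predictor):
        """Toy predictor: ridge regression on flattened architecture encodings."""

        def __init__(self):
            super().__init__()
            self.model = Ridge(alpha=1.0)

        def fit(self, xtrain, ytrain, train_info=None):
            # xtrain: numeric encodings of architectures; ytrain: e.g. val. accuracies
            x = np.array([np.asarray(enc).flatten() for enc in xtrain])
            self.model.fit(x, np.asarray(ytrain))

        def query(self, xtest, info=None):
            # Predict performance for unseen architecture encodings
            x = np.array([np.asarray(enc).flatten() for enc in xtest])
            return self.model.predict(x)

A class like this can then be compared against NASLib's built-in predictors under the same train/query protocol.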


Bio

Gabi Kadlecová has been a PhD student at Charles University in Prague since 2021. She also works at the Institute of Computer Science of the Czech Academy of Sciences and is supervised by Roman Neruda. Her research focuses on Neural Architecture Search, specifically performance prediction and architecture embedding using graph neural networks. Recently, she has been exploring the use of zero-cost proxies for performance prediction. From June to August 2023, she visited the Machine Learning Lab at the University of Freiburg, led by Prof. Frank Hutter.