Invited II:
Meta-Learning for Neural Architecture Search

Abstract

Even though neural network architectures are known to be transferable across data sets, neural architecture search methods typically do not exploit any prior knowledge but rather start searching from scratch for each task. One approach to exploiting such prior knowledge is meta-learning. Meta-learning approaches aim at learning and leveraging knowledge across various tasks in order to speed up the learning of new tasks. This talk will first introduce the idea of meta-learning and then discuss how meta-learning can be applied to make neural architecture search approaches more efficient.

Bio: Dr. Thomas Elsken

Thomas Elsken is a Research Engineer at the Bosch Center for Artificial Intelligence. Thomas studied Mathematics at the University of Muenster and did his PhD on automated deep learning at the University of Freiburg and the Bosch Center for Artificial Intelligence under the supervision of Prof. Frank Hutter. Thomas' research interests lie in automated machine learning. His work focuses on automatically designing neural network architectures, a field also known as neural architecture search. More recently, Thomas has become interested in meta-learning and in how meta-learning algorithms can be combined with neural architecture search methods to make them more efficient and practical.