Hands-On: Bayesian Optimization with SMAC

Abstract

In this tutorial, we are excited to introduce you to our powerful hyperparameter optimization (HPO) tool called SMAC! SMAC, short for Sequential Model-based Algorithm Configuration, is built upon Bayesian Optimization (BO) techniques. It offers a clever intensification routine that efficiently compares different hyperparameter configurations, making it a valuable asset for optimizing your machine learning models.


During the tutorial, we will guide you through two main parts. In the first part, we will provide an in-depth understanding of SMAC, including its working principles and the components that make it an effective HPO tool. We will explain how SMAC leverages Bayesian Optimization to intelligently explore the hyperparameter space and find optimal configurations for your specific problem. Moreover, we will highlight SMAC's versatility, showcasing its capabilities both on classic black-box problems and in optimizing the training process of machine learning models.
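To give a rough feel for this black-box setting, the sketch below follows the quickstart pattern of SMAC3 (version 2.x) to minimize a simple synthetic function; the objective and its single hyperparameter are made up for illustration, and exact import paths and signatures may differ between SMAC and ConfigSpace versions.

from ConfigSpace import Configuration, ConfigurationSpace, Float
from smac import HyperparameterOptimizationFacade, Scenario


# Toy black-box objective: SMAC only observes configurations and the costs they return.
def objective(config: Configuration, seed: int = 0) -> float:
    x = config["x"]
    return (x - 2.0) ** 2


# Search space with a single continuous hyperparameter.
cs = ConfigurationSpace(seed=0)
cs.add_hyperparameters([Float("x", (-5.0, 5.0))])

# The Scenario bundles the search space with the optimization budget.
scenario = Scenario(cs, deterministic=True, n_trials=50)

# The HPO facade wires together the surrogate model, acquisition function, and intensifier.
smac = HyperparameterOptimizationFacade(scenario, objective)
incumbent = smac.optimize()
print(incumbent)

The returned incumbent is the best configuration found within the trial budget; in this toy example it should lie close to x = 2.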


The second part of the tutorial will be interactive and practical. We will guide you through the step-by-step process of using SMAC for hyperparameter optimization. Starting from identifying suitable methods for your particular problem, we will demonstrate how to set up the configuration space, defining the ranges and types of hyperparameters to be optimized. Additionally, we will provide insights into visualizing the optimization process, allowing you to gain a comprehensive understanding of how SMAC operates.
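As a small taste of what such a configuration space definition can look like, here is a sketch using the ConfigSpace library that SMAC builds on; the hyperparameter names and ranges are hypothetical (loosely modeled on a gradient-boosting setup), and the exact API (e.g. add_hyperparameters vs. add) depends on the ConfigSpace version.

from ConfigSpace import Categorical, ConfigurationSpace, Float, Integer

# Illustrative search space mixing continuous, integer, and categorical hyperparameters.
cs = ConfigurationSpace(seed=42)
cs.add_hyperparameters([
    Float("learning_rate", (1e-5, 1e-1), log=True),  # searched on a log scale
    Integer("n_estimators", (10, 500)),
    Categorical("booster", ["gbtree", "dart"]),
])
print(cs)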


To summarize, this tutorial will address the following key questions: How does SMAC work under the hood, and which components make it an effective HPO tool? Which of SMAC's methods suit your particular problem? How do you set up a configuration space, run the optimization, and visualize its progress?


By the end of this tutorial, you will have a solid grasp of SMAC's capabilities and be equipped with the knowledge and practical skills to effectively leverage it for hyperparameter optimization tasks.

Bio

Since 2020, Carolin Benjamins has been a PhD student in the research group led by Prof. Dr. Marius Lindauer at Leibniz University Hannover, which is part of the automl.org supergroup. Since then, she has also enjoyed collaborations with other research groups. Her focus is broadly on Automated Machine Learning, and more specifically on Dynamic Algorithm Configuration and Bayesian Optimization. Apart from the AutoML side, she is also interested in robotics and Contextual Reinforcement Learning, which might be a remnant of her Bachelor's and Master's studies in Mechatronics and Robotics at Leibniz University Hannover. She is part of the SMAC development team and has worked with SMAC on numerous publications. In general, she is driven by a love of automation and of making complex algorithms more accessible.

Alexander Tornede is a researcher in the field of Automated Machine Learning (AutoML). He completed his Master's degree with a focus on machine learning at Paderborn University in 2018 and recently defended his Ph.D. under the supervision of Prof. Dr. Eyke Hüllermeier, focusing on Algorithm Selection. Currently, Alexander is a Postdoctoral Researcher in the research group led by Prof. Dr. Marius Lindauer at Leibniz University Hannover, which is part of the automl.org supergroup. Since joining the group in September 2022, he has been actively involved in advancing interactive and explainable AutoML and in research on integrating large language models (LLMs) with AutoML. Furthermore, he leads the SMAC development team. In addition to his research endeavors, Alexander serves as one of three general chairs of COSEAL, an international group of researchers dedicated to Algorithm Selection and Configuration, and was involved in the organization of the AutoML Conference in 2022 and 2023.