CMSC 35401-2, Winter 2023: Topics in Machine Learning

Bayesian Optimization and Adaptive Experimental Design


Effective black-box optimization is a widespread challenge in many real-world applications, including recommender systems, computational biology, drug discovery, and hyperparameter tuning of large machine learning workflows. Bayesian optimization is a state-of-the-art technique and among the most popular approaches powering these modern applications.

This course opens with a three-week tutorial on Bayesian optimization, covering the core concepts and fundamental algorithms, followed by discussions of research papers that exhibit key practical challenges and recent advances. Students will read and review recent research papers, lead paper-discussion lectures (weeks 4-9), and conduct a research project in groups of 1-2 students.
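To make the core idea concrete, the canonical Bayesian optimization loop alternates between fitting a probabilistic surrogate to the observations and maximizing an acquisition function to pick the next evaluation. Below is a minimal sketch under illustrative assumptions (a 1-D objective, an RBF-kernel Gaussian process, and the expected-improvement acquisition over a candidate grid); the specific kernel, acquisition, and test function are examples chosen here, not prescriptions from the course.

```python
import math
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2):
    """Squared-exponential kernel between two arrays of 1-D points."""
    d = A[:, None] - B[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X, y, Xs, jitter=1e-8):
    """GP posterior mean and std. dev. at test points Xs (noiseless obs.)."""
    K = rbf_kernel(X, X) + jitter * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    mu = Ks.T @ np.linalg.solve(K, y)
    # diag(Kss) = 1 for the RBF kernel, so the predictive variance is:
    var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks), axis=0)
    return mu, np.sqrt(np.clip(var, 1e-12, None))

def expected_improvement(mu, sigma, best):
    """Closed-form EI for maximization: (mu-best)*Phi(z) + sigma*phi(z)."""
    z = (mu - best) / sigma
    cdf = 0.5 * (1.0 + np.array([math.erf(v / math.sqrt(2.0)) for v in z]))
    pdf = np.exp(-0.5 * z ** 2) / math.sqrt(2.0 * math.pi)
    return (mu - best) * cdf + sigma * pdf

def bayes_opt(f, n_init=3, n_iter=10, seed=0):
    """Run a basic BO loop on [0, 1] over a fixed candidate grid."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 201)
    X = rng.uniform(0.0, 1.0, n_init)          # random initial design
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, grid)    # 1. fit surrogate
        ei = expected_improvement(mu, sigma, y.max())
        x_next = grid[np.argmax(ei)]            # 2. maximize acquisition
        X = np.append(X, x_next)                # 3. evaluate and repeat
        y = np.append(y, f(x_next))
    return X[np.argmax(y)], y.max()

# Toy black-box objective (maximum at x = 0.6), used only for illustration.
x_best, y_best = bayes_opt(lambda x: -(x - 0.6) ** 2)
```

With only about a dozen function evaluations, the loop concentrates its queries near the optimum; the tutorial weeks cover why this works and how the surrogate and acquisition choices generalize.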


  • Lectures on Tu/Th 2:00pm-3:20pm in RY 277

  • Office hour (by appointment): Tu/Th 3:30pm-4:00pm in JCL 317

  • First lecture: 1/3/2023

  • Announcements on Canvas

  • Discussion via Ed Discussion


Course Requirements and Grading

  • 30%: Paper reviews and lecture presentation

  • 70%: Research project (Report + poster presentation)


Prerequisites

  • CMSC 25300/CMSC 25400/TTIC 31020 or equivalent, or permission from the instructor


Schedule (tentative)

  • Week 1: Introduction

  • Week 2: Practical challenges: Scalability & kernel learning

  • Week 3: Advanced BO methods and open problems

  • Week 4: Noisy & indirect observations

  • Week 5: Complex objectives

  • Week 6: Complex constraints

  • Week 7: Complex dynamics

  • Week 8: Massive search space

  • Week 9: Delayed feedback


Reference Books

  • Bayesian Optimization, by Roman Garnett, Cambridge University Press, January 2023

  • Active Learning, by Burr Settles, Morgan & Claypool Publishers, 2012 (accessible via UChicago IP addresses)