Session VI (May 16, 10:30am-12:00pm): Covering Arrays and Combinatorial Testing, organized by Ryan Lekivetz
Title: Design-based Hyperparameter Optimization for Machine Learning Models
Speaker: Rakhi Singh, Binghamton University
Abstract: Hyperparameters strongly influence the learning process of a machine learning model and, in turn, its prediction and classification accuracy. Tuning them is critical to model performance, but it often requires extensive manual experimentation. Existing methods for hyperparameter optimization, such as grid search or Bayesian optimization, are effective but usually computationally expensive. A design-based approach to automating hyperparameter optimization for neural networks was proposed by Shi et al. (2023, NEJSDS). In this talk, I will first extend their comparisons by including covering arrays as designs and random forests and gradient-boosted trees as modeling methods. From these comparisons, we conclude that covering arrays are strongly competitive with the other designs considered. Finally, I will introduce new design selection criteria for finding LASSO-based screening designs that usually perform even better than covering arrays.
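
As a rough illustration of the design-based idea (a sketch, not the speaker's implementation): a strength-2 covering array can stand in for a full hyperparameter grid, so that every pair of levels of every pair of hyperparameters is tried at least once in far fewer runs. The greedy construction, the hyperparameter levels, and the scikit-learn random-forest evaluation below are all illustrative assumptions, not taken from the talk.

import itertools
import math

def pair_key(i, a, j, b):
    # Canonical key for "factor i at level a, factor j at level b".
    return (i, a, j, b) if i < j else (j, b, i, a)

def greedy_pairwise_covering_array(levels):
    # Greedy construction of a strength-2 covering array: every pair of
    # levels of every pair of factors appears in at least one row.
    # `levels` is a list of per-factor level lists; rows hold level indices.
    k = len(levels)
    uncovered = {
        pair_key(i, a, j, b)
        for i, j in itertools.combinations(range(k), 2)
        for a in range(len(levels[i]))
        for b in range(len(levels[j]))
    }
    rows = []
    while uncovered:
        # Seed the row with one still-uncovered pair, so each new row
        # covers at least one new pair and the loop terminates.
        i0, a0, j0, b0 = next(iter(uncovered))
        row = [None] * k
        row[i0], row[j0] = a0, b0
        # Fill the remaining factors greedily by marginal pair coverage.
        for i in range(k):
            if row[i] is None:
                row[i] = max(
                    range(len(levels[i])),
                    key=lambda a: sum(
                        pair_key(i, a, j, row[j]) in uncovered
                        for j in range(k) if row[j] is not None
                    ),
                )
        uncovered -= {
            pair_key(i, row[i], j, row[j])
            for i, j in itertools.combinations(range(k), 2)
        }
        rows.append(row)
    return rows

if __name__ == "__main__":
    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Hypothetical hyperparameter levels, chosen only for illustration.
    grid = {
        "n_estimators": [50, 100, 200],
        "max_depth": [3, 6, None],
        "min_samples_split": [2, 5, 10],
        "max_features": ["sqrt", "log2"],
    }
    names = list(grid)
    levels = list(grid.values())

    design = greedy_pairwise_covering_array(levels)
    print(f"{len(design)} runs vs. {math.prod(map(len, levels))} in the full grid")

    # Score each design point by cross-validation, as a grid search would,
    # but over the much smaller covering-array design.
    X, y = load_breast_cancer(return_X_y=True)
    for row in design:
        params = {n: levels[j][row[j]] for j, n in enumerate(names)}
        score = cross_val_score(
            RandomForestClassifier(random_state=0, **params), X, y, cv=3
        ).mean()
        print(params, f"mean CV accuracy: {score:.3f}")

For the four factors above (3 x 3 x 3 x 2 levels), the full grid has 54 combinations, while a strength-2 covering array needs only on the order of ten rows; this gap is the source of the computational savings that motivates the design-based approach.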