HyperUCB: Hyperparameter optimization using contextual bandits

Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review


Setting the optimal hyperparameters of a learning algorithm is a crucial task. Common approaches such as grid search over the hyperparameter space or random sampling of hyperparameters require many configurations to be evaluated in order to perform well; hence they either yield suboptimal hyperparameter configurations or are expensive in terms of computational resources. As a remedy, Hyperband, an exploratory bandit-based algorithm, introduces an early-stopping strategy to quickly provide competitive configurations within a given resource budget, and it often outperforms Bayesian optimization approaches. However, Hyperband keeps sampling i.i.d. configurations for assessment without taking previous evaluations into account. We propose HyperUCB, a UCB extension of Hyperband that assesses the sampled configurations and evaluates only the promising ones. We compare our approach against Hyperband on MNIST data and show that it performs better in most cases.
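
The abstract describes the idea only at a high level. As an illustration, the sketch below shows one way a contextual UCB model could pre-screen Hyperband's randomly sampled configurations: a LinUCB-style linear model treats each hyperparameter configuration as its context vector, ranks candidates by an upper confidence bound, and only the top-ranked ones are actually trained; the observed validation score is then fed back into the model. All names, dimensions, and the toy evaluation function are assumptions made for this example and do not reproduce the authors' implementation; Hyperband's successive-halving schedule is also omitted for brevity.

import numpy as np

rng = np.random.default_rng(0)

class LinUCBScorer:
    """LinUCB-style scorer: a configuration's hyperparameters act as the
    context vector x; the reward is its observed validation accuracy.
    Illustrative sketch only, not the authors' implementation."""

    def __init__(self, dim, alpha=0.5, reg=1.0):
        self.alpha = alpha              # exploration strength
        self.A = reg * np.eye(dim)      # regularized Gram matrix
        self.b = np.zeros(dim)          # reward-weighted feature sum

    def score(self, x):
        A_inv = np.linalg.inv(self.A)
        theta = A_inv @ self.b          # ridge-regression weight estimate
        # predicted reward plus an upper-confidence exploration bonus
        return float(theta @ x + self.alpha * np.sqrt(x @ A_inv @ x))

    def update(self, x, reward):
        self.A += np.outer(x, x)
        self.b += reward * x

def sample_config():
    # hypothetical 3-dimensional configuration:
    # log10(learning rate), dropout rate, log2(batch size)
    return np.array([rng.uniform(-5, -1), rng.uniform(0.0, 0.5), rng.uniform(4, 8)])

def evaluate(config, budget):
    # stand-in for training on MNIST for `budget` epochs and returning
    # validation accuracy; here a noisy synthetic score is used instead
    return rng.uniform(0.90, 0.95) - 0.01 * abs(config[0] + 3.0)

scorer = LinUCBScorer(dim=3)
for bracket in range(5):
    candidates = [sample_config() for _ in range(20)]
    # unlike plain Hyperband, evaluate only the most promising samples
    promising = sorted(candidates, key=scorer.score, reverse=True)[:5]
    for cfg in promising:
        acc = evaluate(cfg, budget=1)
        scorer.update(cfg, acc)
        print(f"bracket {bracket}: config={np.round(cfg, 2)} acc={acc:.3f}")

In the paper itself, such a scoring step would be interleaved with Hyperband's brackets and increasing resource budgets; the snippet isolates only the UCB-based filtering of sampled configurations.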

Original language: English
Title of host publication: Machine Learning and Knowledge Discovery in Databases: International Workshops of ECML PKDD 2019, Würzburg, Germany, September 16–20, 2019, Proceedings, Part I
Editors: Peggy Cellier, Kurt Driessens
Number of pages: 7
Volume: 1
Place of publication: Cham
Publisher: Springer Nature AG
Publication date: 28.03.2020
Pages: 44-50
ISBN (print): 978-3-030-43822-7
ISBN (electronic): 978-3-030-43823-4
Publication status: Published - 28.03.2020
Event: 19th Joint European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases - 2019, Würzburg, Germany
Duration: 16.09.2019 – 20.09.2019
Conference number: 19
https://ecmlpkdd2019.org/submissions/researchAndADSTrack/