HyperUCB: Hyperparameter optimization using contextual bandits
Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review
Standard
Machine Learning and Knowledge Discovery in Databases: International Workshops of ECML PKDD 2019, Würzburg, Germany, September 16–20, 2019, Proceedings, Part I. ed. / Peggy Cellier; Kurt Driessens. Vol. 1. Cham: Springer Nature AG, 2020. p. 44-50 (Communications in Computer and Information Science; Vol. 1167).
RIS
TY - CHAP
T1 - HyperUCB
T2 - 19th Joint European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases - 2019
AU - Tavakol, Maryam
AU - Mair, Sebastian
AU - Morik, Katharina
N1 - Conference code: 19
PY - 2020/3/28
Y1 - 2020/3/28
N2 - Setting the optimal hyperparameters of a learning algorithm is a crucial task. Common approaches such as a grid search over the hyperparameter space or randomly sampling hyperparameters require many configurations to be evaluated in order to perform well. Hence, they either yield suboptimal hyperparameter configurations or are expensive in terms of computational resources. As a remedy, Hyperband, an exploratory bandit-based algorithm, introduces an early-stopping strategy to quickly provide competitive configurations given a resource budget, and it often outperforms Bayesian optimization approaches. However, Hyperband keeps sampling configurations i.i.d. for assessment without taking previous evaluations into account. We propose HyperUCB, a UCB extension of Hyperband which assesses the sampled configurations and only evaluates promising samples. We compare our approach on MNIST data against Hyperband and show that we perform better in most cases.
AB - Setting the optimal hyperparameters of a learning algorithm is a crucial task. Common approaches such as a grid search over the hyperparameter space or randomly sampling hyperparameters require many configurations to be evaluated in order to perform well. Hence, they either yield suboptimal hyperparameter configurations or are expensive in terms of computational resources. As a remedy, Hyperband, an exploratory bandit-based algorithm, introduces an early-stopping strategy to quickly provide competitive configurations given a resource budget, and it often outperforms Bayesian optimization approaches. However, Hyperband keeps sampling configurations i.i.d. for assessment without taking previous evaluations into account. We propose HyperUCB, a UCB extension of Hyperband which assesses the sampled configurations and only evaluates promising samples. We compare our approach on MNIST data against Hyperband and show that we perform better in most cases.
KW - Business informatics
KW - Hyperparameter optimization
KW - Multi-armed bandits
UR - http://www.scopus.com/inward/record.url?scp=85083719265&partnerID=8YFLogxK
U2 - 10.1007/978-3-030-43823-4_4
DO - 10.1007/978-3-030-43823-4_4
M3 - Article in conference proceedings
AN - SCOPUS:85083719265
SN - 978-3-030-43822-7
VL - 1
T3 - Communications in Computer and Information Science
SP - 44
EP - 50
BT - Machine Learning and Knowledge Discovery in Databases
A2 - Cellier, Peggy
A2 - Driessens, Kurt
PB - Springer Nature AG
CY - Cham
Y2 - 16 September 2019 through 20 September 2019
ER -
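
The abstract describes scoring sampled hyperparameter configurations with an upper confidence bound so that only promising ones are evaluated. The sketch below is an illustrative, hypothetical rendering of that idea using a LinUCB-style linear bandit; it is not the authors' implementation, and all function names (`ucb_scores`, `hyperucb_round`) and parameter choices are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def ucb_scores(X, A, b, alpha=1.0):
    """LinUCB-style score for each configuration (one row of X):
    estimated reward plus an exploration bonus."""
    theta = np.linalg.solve(A, b)      # ridge-regression reward estimate
    A_inv = np.linalg.inv(A)
    mean = X @ theta
    # bonus_i = alpha * sqrt(x_i^T A^{-1} x_i)
    bonus = alpha * np.sqrt(np.einsum("ij,jk,ik->i", X, A_inv, X))
    return mean + bonus

def hyperucb_round(sample_configs, evaluate, A, b,
                   n_samples=20, n_eval=5, alpha=1.0):
    """One round: sample configurations, keep only the UCB-promising
    ones for (expensive) evaluation, then update the bandit model."""
    X = sample_configs(n_samples)      # each row encodes one configuration
    top = np.argsort(-ucb_scores(X, A, b, alpha))[:n_eval]
    for i in top:
        r = evaluate(X[i])             # e.g. validation accuracy
        A += np.outer(X[i], X[i])      # rank-one update of the design matrix
        b += r * X[i]
    return X[top]
```

With `A` initialized to the identity and `b` to zeros, the first round's scores reduce to the pure exploration bonus, so early rounds behave like Hyperband's random sampling; as evaluations accumulate, the mean term steers selection toward configurations similar to those that scored well before.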