Keywords: Hyperparameter, Bayesian optimization, Computer science, Hyperparameter optimization, Machine learning, Speedup, Artificial intelligence, Set (abstract data type), Kernel (algebra), Random search, Suite, Bayesian probability, Mathematical optimization, Algorithm, Support vector machine, Mathematics, Archaeology, Combinatorics, History, Programming language, Operating system
Authors
Lisha Li, Kevin Jamieson, Giulia DeSalvo, Afshin Rostamizadeh, Ameet Talwalkar
Abstract
Performance of machine learning algorithms depends critically on identifying a good set of hyperparameters. While recent approaches use Bayesian optimization to adaptively select configurations, we focus on speeding up random search through adaptive resource allocation and early-stopping. We formulate hyperparameter optimization as a pure-exploration nonstochastic infinite-armed bandit problem where a predefined resource like iterations, data samples, or features is allocated to randomly sampled configurations. We introduce a novel algorithm, Hyperband, for this framework and analyze its theoretical properties, providing several desirable guarantees. Furthermore, we compare Hyperband with popular Bayesian optimization methods on a suite of hyperparameter optimization problems. We observe that Hyperband can provide over an order-of-magnitude speedup over our competitor set on a variety of deep-learning and kernel-based learning problems.
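The abstract's core idea, allocating a predefined resource to randomly sampled configurations and early-stopping the weak ones, can be illustrated with a short sketch. The Python code below follows the published Hyperband schedule (brackets of successive halving indexed by s, downsampling rate eta); `get_config` and `run_then_return_loss` are hypothetical stand-ins for the user's sampling and training routines, and the `max_resource`/`eta` defaults are illustrative choices, not prescribed values.

```python
import math

def hyperband(get_config, run_then_return_loss, max_resource=81, eta=3):
    """Minimal sketch of the Hyperband loop.

    get_config() -> one randomly sampled hyperparameter configuration
    run_then_return_loss(config, resource) -> validation loss after
        training `config` with the given resource budget
    max_resource: maximum resource (e.g. epochs) given to one configuration
    eta: downsampling rate of successive halving
    """
    # s_max brackets; the epsilon guards against floating-point log error.
    s_max = int(math.floor(math.log(max_resource) / math.log(eta) + 1e-9))
    budget = (s_max + 1) * max_resource  # total resource per bracket
    best = (float("inf"), None)          # (loss, config) seen so far

    for s in range(s_max, -1, -1):       # brackets trade off n vs. resource
        n = int(math.ceil(budget / max_resource * eta ** s / (s + 1)))
        r = max_resource * eta ** (-s)   # initial resource per configuration
        configs = [get_config() for _ in range(n)]

        for i in range(s + 1):           # successive halving within a bracket
            n_i = int(n * eta ** (-i))
            r_i = r * eta ** i
            losses = [run_then_return_loss(c, r_i) for c in configs]
            ranked = sorted(zip(losses, configs), key=lambda t: t[0])
            if ranked and ranked[0][0] < best[0]:
                best = ranked[0]
            # Keep only the top 1/eta fraction for the next, larger budget.
            configs = [c for _, c in ranked[: max(1, n_i // eta)]]

    return best

# Hypothetical usage: tune a single "learning rate" knob on a toy objective.
if __name__ == "__main__":
    import random

    def sample():
        return {"lr": 10 ** random.uniform(-4, 0)}

    def toy_loss(cfg, r):
        # Loss shrinks with resource and is minimized near lr = 1e-2.
        return (math.log10(cfg["lr"]) + 2) ** 2 + 1.0 / r

    print(hyperband(sample, toy_loss))
```

The outer loop is what distinguishes Hyperband from plain successive halving: each bracket commits to a different point on the "many configurations, little resource" versus "few configurations, much resource" trade-off, so no single early-stopping aggressiveness has to be chosen in advance.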