Computer science
Bayesian optimization
Gaussian process
Inference
Scalability
Kernel (algebra)
Interpolation (computer graphics)
Computation
Algorithm
Gaussian distribution
Machine learning
Artificial intelligence
Mathematics
Database
Physics
Combinatorics
Motion (physics)
Quantum mechanics
Authors
Samuel Stanton,Wesley J. Maddox,Ian Delbridge,Andrew Gordon Wilson
Source
Journal: Cornell University - arXiv
Date: 2021-01-01
Citations: 11
Identifier
DOI: 10.48550/arxiv.2103.01454
Abstract
Gaussian processes (GPs) provide a gold standard for performance in online settings, such as sample-efficient control and black box optimization, where we need to update a posterior distribution as we acquire data in a sequential fashion. However, updating a GP posterior to accommodate even a single new observation after having observed $n$ points incurs at least $O(n)$ computations in the exact setting. We show how to use structured kernel interpolation to efficiently recycle computations for constant-time $O(1)$ online updates with respect to the number of points $n$, while retaining exact inference. We demonstrate the promise of our approach in a range of online regression and classification settings, Bayesian optimization, and active sampling to reduce error in malaria incidence forecasting. Code is available at https://github.com/wjmaddox/online_gp.
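To make the cost argument concrete, below is a minimal NumPy sketch of one way constant-time online updates can be realized under a structured kernel interpolation (SKI) approximation $K_{XX} \approx W K_{ZZ} W^\top$: cache $A = W^\top W$ and $b = W^\top y$, and fold each new observation into those caches through its sparse interpolation weights, at a cost that depends on the inducing grid size $m$ but not on $n$. Prediction then needs only an $m \times m$ solve, via the push-through identity $W^\top (W K_{ZZ} W^\top + \sigma^2 I)^{-1} = (W^\top W K_{ZZ} + \sigma^2 I)^{-1} W^\top$. The 1D grid, linear interpolation weights, RBF kernel, and the names OnlineSKIGP and interp_weights are illustrative assumptions, not the authors' implementation; their code is in the linked repository.

```python
import numpy as np


def interp_weights(x, grid):
    """Sparse linear interpolation weights of a scalar x onto a 1D grid.

    At most two entries are nonzero, so the work is independent of the
    number of observations n. (Illustrative helper, not from the paper.)"""
    m = len(grid)
    w = np.zeros(m)
    x = np.clip(x, grid[0], grid[-1])
    i = min(int(np.searchsorted(grid, x)), m - 1)
    if i == 0:
        w[0] = 1.0
    else:
        left, right = grid[i - 1], grid[i]
        t = (x - left) / (right - left)
        w[i - 1], w[i] = 1.0 - t, t
    return w


class OnlineSKIGP:
    """Toy online GP regressor under a SKI-style approximation
    K_XX ~= W K_ZZ W^T (hypothetical class, for illustration only)."""

    def __init__(self, grid, lengthscale=0.2, noise=0.1):
        self.grid = grid
        m = len(grid)
        diff = grid[:, None] - grid[None, :]
        self.Kzz = np.exp(-0.5 * (diff / lengthscale) ** 2)  # RBF kernel on the grid
        self.noise = noise
        self.A = np.zeros((m, m))  # cached W^T W
        self.b = np.zeros(m)       # cached W^T y

    def update(self, x, y):
        """Absorb one observation with a rank-1 cache update. Only the entries
        indexed by w's (at most two) nonzeros change; this dense version does
        O(m^2) work for simplicity, but the cost is independent of n."""
        w = interp_weights(x, self.grid)
        self.A += np.outer(w, w)
        self.b += y * w

    def predict_mean(self, x_star):
        """Posterior mean w_*^T K_ZZ (A K_ZZ + sigma^2 I)^{-1} b:
        an m x m solve, independent of the number of observations n."""
        w_star = interp_weights(x_star, self.grid)
        m = len(self.grid)
        rhs = np.linalg.solve(self.A @ self.Kzz + self.noise ** 2 * np.eye(m), self.b)
        return w_star @ (self.Kzz @ rhs)


# Stream noisy observations of a sine and query the posterior mean.
gp = OnlineSKIGP(grid=np.linspace(0.0, 1.0, 50))
rng = np.random.default_rng(0)
for x in rng.uniform(0.0, 1.0, 200):
    gp.update(x, np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal())
print(gp.predict_mean(0.25))  # close to sin(pi / 2) = 1
```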