Deep Kernel Learning

Topics: Kernel (algebra), Computer science, Kernel methods, Artificial intelligence, Scalability, Gaussian processes, Deep learning, Inference, Kernel embedding of distributions, Machine learning, Algorithms, Gaussian distribution, Mathematics, Support vector machines, Discrete mathematics, Databases, Quantum mechanics, Physics
Authors: Andrew Gordon Wilson, Zhiting Hu, Ruslan Salakhutdinov, Eric P. Xing
Venue: International Conference on Artificial Intelligence and Statistics (AISTATS)
Date: 2016-05-02
Pages: 370-378
Citations: 323

Abstract
We introduce scalable deep kernels, which combine the structural properties of deep learning architectures with the non-parametric flexibility of kernel methods. Specifically, we transform the inputs of a spectral mixture base kernel with a deep architecture, using local kernel interpolation, inducing points, and structure-exploiting (Kronecker and Toeplitz) algebra for a scalable kernel representation. These closed-form kernels can be used as drop-in replacements for standard kernels, with benefits in expressive power and scalability. We jointly learn the properties of these kernels through the marginal likelihood of a Gaussian process. Inference and learning cost O(n) for n training points, and predictions cost O(1) per test point. On a large and diverse collection of applications, including a dataset with 2 million examples, we show improved performance over scalable Gaussian processes with flexible kernel learning models, and stand-alone deep architectures.
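The construction in the abstract admits a compact summary: inputs are first passed through a deep architecture g(x; w), and a base kernel is applied to the resulting features, giving the deep kernel k(x_i, x_j | θ, w) = k_base(g(x_i; w), g(x_j; w) | θ). All parameters (network weights w, kernel hyperparameters θ, noise variance σ²) are then learned jointly by maximizing the GP log marginal likelihood, log p(y | X) = -(1/2) y^T (K + σ² I)^{-1} y - (1/2) log |K + σ² I| - (n/2) log 2π. The sketch below illustrates this joint-training pipeline in plain PyTorch. It is not the authors' implementation, and it makes two labeled simplifications: an RBF base kernel stands in for the paper's spectral mixture kernel, and exact O(n³) Cholesky inference stands in for the O(n) KISS-GP machinery (local kernel interpolation, inducing points, Kronecker/Toeplitz algebra). Names such as DeepKernelGP and predict_mean, and the toy data, are hypothetical.

import math
import torch

class DeepKernelGP(torch.nn.Module):
    def __init__(self, input_dim, feature_dim=2):
        super().__init__()
        # Deep transform g(x; w): maps inputs into a learned feature space.
        self.net = torch.nn.Sequential(
            torch.nn.Linear(input_dim, 32), torch.nn.ReLU(),
            torch.nn.Linear(32, feature_dim),
        )
        # Base-kernel hyperparameters, stored on a log scale for positivity.
        self.log_lengthscale = torch.nn.Parameter(torch.zeros(()))
        self.log_outputscale = torch.nn.Parameter(torch.zeros(()))
        self.log_noise = torch.nn.Parameter(torch.tensor(-2.0))

    def kernel(self, x1, x2):
        # Deep kernel: k(x, x') = k_base(g(x; w), g(x'; w)), RBF base here.
        z1, z2 = self.net(x1), self.net(x2)
        sqdist = (z1.unsqueeze(1) - z2.unsqueeze(0)).pow(2).sum(-1)
        return self.log_outputscale.exp() * torch.exp(
            -0.5 * sqdist / self.log_lengthscale.exp() ** 2
        )

    def neg_log_marginal_likelihood(self, x, y):
        # -log p(y | X) for a zero-mean GP with Gaussian observation noise.
        n = x.size(0)
        K = self.kernel(x, x) + self.log_noise.exp() * torch.eye(n)
        L = torch.linalg.cholesky(K)
        alpha = torch.cholesky_solve(y.unsqueeze(-1), L)
        quad = 0.5 * (y * alpha.squeeze(-1)).sum()   # 0.5 * y^T K^{-1} y
        logdet = L.diagonal().log().sum()            # 0.5 * log |K|
        return quad + logdet + 0.5 * n * math.log(2 * math.pi)

    @torch.no_grad()
    def predict_mean(self, x_train, y_train, x_test):
        # Posterior mean: k(X*, X) (K + sigma^2 I)^{-1} y.
        K = self.kernel(x_train, x_train) \
            + self.log_noise.exp() * torch.eye(x_train.size(0))
        alpha = torch.cholesky_solve(y_train.unsqueeze(-1),
                                     torch.linalg.cholesky(K))
        return (self.kernel(x_test, x_train) @ alpha).squeeze(-1)

# Joint training of network weights and kernel hyperparameters on toy data.
torch.manual_seed(0)
x = torch.linspace(0, 1, 200).unsqueeze(-1)
y = torch.sin(12 * x).squeeze(-1) + 0.1 * torch.randn(200)

model = DeepKernelGP(input_dim=1)
opt = torch.optim.Adam(model.parameters(), lr=0.01)
for step in range(200):
    opt.zero_grad()
    loss = model.neg_log_marginal_likelihood(x, y)
    loss.backward()
    opt.step()

print(model.predict_mean(x, y, x[:5]))

For the O(n) regime the abstract describes, the same joint-training pattern is typically run with structured grid interpolation replacing the dense Cholesky solve; GPyTorch, for example, provides SpectralMixtureKernel and GridInterpolationKernel (KISS-GP) components that can wrap a feature extractor in this way.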