MNIST database
Dropout (neural networks)
Backpropagation
Artificial neural network
Computer science
Artificial intelligence
Reinforcement learning
Upper and lower bounds
Bayes' theorem
Energy (signal processing)
Machine learning
Mathematical optimization
Bayesian probability
Mathematics
Statistics
Mathematical analysis
Authors
Charles Blundell, Julien Cornebise, Koray Kavukcuoglu, Daan Wierstra
Source
Journal: Cornell University - arXiv
Date: 2015-01-01
Citations: 1340
Identifier
DOI: 10.48550/arxiv.1505.05424
Abstract
We introduce a new, efficient, principled and backpropagation-compatible algorithm for learning a probability distribution on the weights of a neural network, called Bayes by Backprop. It regularises the weights by minimising a compression cost, known as the variational free energy or the expected lower bound on the marginal likelihood. We show that this principled kind of regularisation yields comparable performance to dropout on MNIST classification. We then demonstrate how the learnt uncertainty in the weights can be used to improve generalisation in non-linear regression problems, and how this weight uncertainty can be used to drive the exploration-exploitation trade-off in reinforcement learning.
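In the paper's notation, the compression cost mentioned in the abstract is the variational free energy F(D, θ) = KL[q(w|θ) ‖ P(w)] − E_{q(w|θ)}[log P(D|w)], minimised over the variational parameters θ by ordinary gradient descent via the reparameterisation of the weight sample. Below is a minimal sketch of one Bayes by Backprop training step, assuming a diagonal Gaussian posterior with the paper's softplus parameterisation σ = log(1 + exp(ρ)), a standard-normal prior, a single Monte Carlo weight sample, and hypothetical layer sizes and minibatch data; it illustrates the idea and is not the authors' implementation.

```python
import torch
import torch.nn.functional as F

# Variational parameters of a diagonal Gaussian posterior q(w | mu, rho),
# with sigma = softplus(rho) keeping the standard deviation positive.
# Shapes are hypothetical: one weight matrix of a linear classifier.
mu = torch.zeros(784, 10, requires_grad=True)
rho = torch.full((784, 10), -3.0, requires_grad=True)

def variational_free_energy(x, y):
    # Reparameterisation trick: w = mu + sigma * eps, so gradients flow
    # to mu and rho through ordinary backpropagation.
    sigma = F.softplus(rho)
    eps = torch.randn_like(mu)
    w = mu + sigma * eps

    # Data-fit term: -log P(D | w), estimated with this single
    # Monte Carlo sample of the weights.
    logits = x @ w
    nll = F.cross_entropy(logits, y, reduction="sum")

    # Compression term: KL[q(w|theta) || P(w)] against a standard-normal
    # prior N(0, 1), in closed form for two diagonal Gaussians.
    kl = (-torch.log(sigma) + (sigma**2 + mu**2) / 2.0 - 0.5).sum()

    # Variational free energy (negative ELBO): complexity cost + data cost.
    return kl + nll

# One optimisation step on a hypothetical random minibatch.
opt = torch.optim.Adam([mu, rho], lr=1e-3)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
opt.zero_grad()
loss = variational_free_energy(x, y)
loss.backward()
opt.step()
```

After training, predictive uncertainty can be estimated by averaging predictions over several weight samples drawn from q(w|θ), which is how the learnt weight uncertainty supports the regression and exploration results the abstract describes.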