Keywords
Forcing (mathematics), Computer science, Treebank, Sampling (signal processing), MNIST database, Algorithm, Handwriting, Sample (material), Artificial intelligence, Sequence (biology), Machine learning, Speech recognition, Artificial neural network, Mathematics, Parsing, Mathematical analysis, Filter (signal processing), Chemistry, Biology, Genetics, Chromatography, Computer vision
Authors
Alex Lamb,Anirudh Goyal,Ying Zhang,Saizheng Zhang,Aaron Courville,Yoshua Bengio
Source
Journal: Neural Information Processing Systems
Date: 2016-01-01
Volume/issue: 29: 4601-4609
Citations: 300
Abstract
The Teacher Forcing algorithm trains recurrent networks by supplying observed sequence values as inputs during training and using the network’s own one-step-ahead predictions to do multi-step sampling. We introduce the Professor Forcing algorithm, which uses adversarial domain adaptation to encourage the dynamics of the recurrent network to be the same when training the network and when sampling from the network over multiple time steps. We apply Professor Forcing to language modeling, vocal synthesis on raw waveforms, handwriting generation, and image generation. Empirically we find that Professor Forcing acts as a regularizer, improving test likelihood on character level Penn Treebank and sequential MNIST. We also find that the model qualitatively improves samples, especially when sampling for a large number of time steps. This is supported by human evaluation of sample quality. Trade-offs between Professor Forcing and Scheduled Sampling are discussed. We produce T-SNEs showing that Professor Forcing successfully makes the dynamics of the network during training and sampling more similar.
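The contrast the abstract draws can be sketched in code: under Teacher Forcing the network's next input is the observed value, while during free-running sampling the network consumes its own prediction, so the hidden-state dynamics can drift apart over multiple steps. The toy RNN below (a minimal NumPy sketch, not the paper's implementation; the weights, dimensions, and tanh readout are illustrative assumptions) rolls out both modes and measures that drift, which is the mismatch Professor Forcing's discriminator is trained to detect.

```python
import numpy as np

rng = np.random.default_rng(0)
W_h = rng.normal(scale=0.5, size=(4, 4))   # hidden-to-hidden weights (illustrative)
W_x = rng.normal(scale=0.5, size=(4, 1))   # input-to-hidden weights
W_o = rng.normal(scale=0.5, size=(1, 4))   # hidden-to-output readout

def step(h, x):
    """One RNN transition; returns the next hidden state and a prediction."""
    h_next = np.tanh(W_h @ h + W_x @ x)
    y_pred = W_o @ h_next
    return h_next, y_pred

def run(observed, teacher_forcing):
    """Roll the RNN over a sequence, collecting hidden states."""
    h = np.zeros((4, 1))
    x = observed[0]
    states = []
    for t in range(1, len(observed)):
        h, y_pred = step(h, x)
        states.append(h)
        # Teacher Forcing feeds the ground truth back in;
        # free-running sampling feeds the model's own prediction.
        x = observed[t] if teacher_forcing else y_pred
    return np.stack(states)

seq = [np.array([[v]]) for v in (0.1, 0.8, -0.3, 0.5, 0.2)]
h_tf = run(seq, teacher_forcing=True)    # training-mode dynamics
h_fr = run(seq, teacher_forcing=False)   # sampling-mode dynamics

# Professor Forcing would train a discriminator to tell these two sets of
# hidden states apart and penalize the generator when it succeeds; here we
# just quantify how far the two dynamics drift.
gap = float(np.abs(h_tf - h_fr).max())
print(gap > 0.0)
```

After the first step the two rollouts receive different inputs, so the hidden trajectories separate; Professor Forcing's adversarial term pushes this gap toward something a discriminator cannot exploit.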