Regularization
Artificial neural network
Computer science
Artificial intelligence
Authors
Wojciech Zaremba, Ilya Sutskever, Oriol Vinyals
Source
Venue: arXiv (Cornell University)
Date: 2014-09
Citations: 2318
Identifier
DOI:10.48550/arxiv.1409.2329
Abstract
We present a simple regularization technique for Recurrent Neural Networks (RNNs) with Long Short-Term Memory (LSTM) units. Dropout, the most successful technique for regularizing neural networks, does not work well with RNNs and LSTMs. In this paper, we show how to correctly apply dropout to LSTMs, and show that it substantially reduces overfitting on a variety of tasks. These tasks include language modeling, speech recognition, image caption generation, and machine translation.
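The fix the abstract alludes to is to apply dropout only to the non-recurrent connections of the LSTM (its inputs, the connections between stacked layers, and its outputs), leaving the recurrent hidden-to-hidden path intact so the memory cells are not corrupted across time steps. Below is a minimal sketch of that idea, assuming PyTorch rather than the authors' original code; the class name `RegularizedLSTM` is illustrative, and the 650-unit / 0.5-dropout defaults echo the paper's medium configuration.

```python
# Minimal sketch (assumes PyTorch, not the authors' implementation):
# dropout is applied only to non-recurrent connections, never to the
# recurrent hidden-to-hidden path.
import torch
import torch.nn as nn

class RegularizedLSTM(nn.Module):
    # Hypothetical class; 650 units / 0.5 dropout echo the paper's medium setup.
    def __init__(self, vocab_size, hidden_size=650, num_layers=2, dropout=0.5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.drop = nn.Dropout(dropout)
        # nn.LSTM's `dropout` argument acts only between stacked layers,
        # leaving recurrent connections intact, as the paper prescribes.
        self.lstm = nn.LSTM(hidden_size, hidden_size, num_layers,
                            dropout=dropout, batch_first=True)
        self.decoder = nn.Linear(hidden_size, vocab_size)

    def forward(self, tokens, state=None):
        x = self.drop(self.embed(tokens))      # dropout on the LSTM input
        out, state = self.lstm(x, state)       # recurrent path untouched
        logits = self.decoder(self.drop(out))  # dropout on the LSTM output
        return logits, state

# Usage: batch size 20 and unroll length 35 mirror the paper's
# language-modeling setup.
model = RegularizedLSTM(vocab_size=10000)
tokens = torch.randint(0, 10000, (20, 35))
logits, _ = model(tokens)  # logits has shape (20, 35, 10000)
```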