Topics: Computer science, Hyperparameter, Artificial intelligence, Recurrent neural network, Task (project management), Symbol, Function (biology), Machine learning, Word (group theory), Artificial neural network, Theoretical computer science, Mathematics, Arithmetic, Evolutionary biology, Biology, Economics, Management, Geometry
Authors
Klaus Greff,Rupesh Kumar Srivastava,Jan Koutník,Bas R. Steunebrink,Jürgen Schmidhuber
Source
Journal: IEEE Transactions on Neural Networks and Learning Systems
[Institute of Electrical and Electronics Engineers]
Date: 2017-10-01
Volume/Issue: 28 (10): 2222-2232
Citations: 4190
Identifiers
DOI: 10.1109/tnnls.2016.2582924
Abstract
Several variants of the long short-term memory (LSTM) architecture for recurrent neural networks have been proposed since its inception in 1995. In recent years, these networks have become the state-of-the-art models for a variety of machine learning problems. This has led to a renewed interest in understanding the role and utility of various computational components of typical LSTM variants. In this paper, we present the first large-scale analysis of eight LSTM variants on three representative tasks: speech recognition, handwriting recognition, and polyphonic music modeling. The hyperparameters of all LSTM variants for each task were optimized separately using random search, and their importance was assessed using the powerful functional ANalysis Of VAriance framework. In total, we summarize the results of 5400 experimental runs ($\approx 15$ years of CPU time), which makes our study the largest of its kind on LSTM networks. Our results show that none of the variants can improve upon the standard LSTM architecture significantly, and demonstrate the forget gate and the output activation function to be its most critical components. We further observe that the studied hyperparameters are virtually independent and derive guidelines for their efficient adjustment.
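For reference, a minimal sketch of the vanilla LSTM update that the study treats as its baseline, written in notation along the lines of the paper's (assuming $x^t$ is the input, $y^{t-1}$ the previous block output, $c^t$ the cell state, $W$ and $R$ input and recurrent weight matrices, $p$ peephole weight vectors, $\sigma$ the logistic sigmoid, $g$ and $h$ pointwise activations, typically $\tanh$, and $\odot$ elementwise multiplication):

$$
\begin{aligned}
z^t &= g(W_z x^t + R_z y^{t-1} + b_z) &&\text{(block input)}\\
i^t &= \sigma(W_i x^t + R_i y^{t-1} + p_i \odot c^{t-1} + b_i) &&\text{(input gate)}\\
f^t &= \sigma(W_f x^t + R_f y^{t-1} + p_f \odot c^{t-1} + b_f) &&\text{(forget gate)}\\
c^t &= z^t \odot i^t + c^{t-1} \odot f^t &&\text{(cell state)}\\
o^t &= \sigma(W_o x^t + R_o y^{t-1} + p_o \odot c^t + b_o) &&\text{(output gate)}\\
y^t &= h(c^t) \odot o^t &&\text{(block output)}
\end{aligned}
$$

The abstract's headline finding maps directly onto two pieces of this update: the forget gate $f^t$, which controls how much of $c^{t-1}$ is carried forward, and the output activation $h$, which squashes the cell state before it is gated by $o^t$. Each of the eight studied variants removes or modifies one such component at a time relative to this baseline.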