Recurrent neural network
Computer science
Artificial intelligence
Feedforward
Probabilistic logic
Deep learning
Artificial neural network
Engineering
Control engineering
Authors
Robert DiPietro,Gregory D. Hager
Source
Journal: Elsevier eBooks
[Elsevier]
Date: 2020-01-01
Pages: 503-519
Citations: 59
Identifier
DOI:10.1016/b978-0-12-816176-0.00026-0
Abstract
Recurrent neural networks (RNNs) are a class of neural networks naturally suited to processing time-series data and other sequential data. Here we introduce recurrent neural networks as an extension of feedforward networks that allows the processing of variable-length (or even infinite-length) sequences, and we describe some of the most popular recurrent architectures in use, including long short-term memory (LSTM) and gated recurrent units (GRUs). In addition, various aspects surrounding RNNs are discussed in detail, including probabilistic models that are often realized using RNNs and applications of RNNs that have appeared within the MICCAI community.
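To illustrate the core idea the abstract describes, the following is a minimal sketch (not the chapter's code) of a vanilla recurrent cell: the same weights are applied at every time step, which is what lets an RNN consume sequences of any length while producing a fixed-size hidden state. All names and dimensions here are illustrative assumptions.

```python
import numpy as np

# Vanilla RNN recurrence: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b).
# Weight shapes are chosen arbitrarily for this illustration.
rng = np.random.default_rng(0)
input_dim, hidden_dim = 3, 4
W_xh = rng.standard_normal((hidden_dim, input_dim)) * 0.1
W_hh = rng.standard_normal((hidden_dim, hidden_dim)) * 0.1
b = np.zeros(hidden_dim)

def rnn_forward(xs):
    """Run the cell over a sequence xs of shape (T, input_dim); T may vary."""
    h = np.zeros(hidden_dim)
    for x in xs:
        # The same W_xh, W_hh, b are reused at every step.
        h = np.tanh(W_xh @ x + W_hh @ h + b)
    return h  # final hidden state summarizes the whole sequence

# Sequences of different lengths map to hidden states of the same size:
short = rnn_forward(rng.standard_normal((5, input_dim)))
long = rnn_forward(rng.standard_normal((50, input_dim)))
print(short.shape, long.shape)  # both (4,)
```

Gated architectures such as LSTM and GRU replace the single `tanh` update with gated updates that better preserve information over long sequences, but the weight-sharing structure shown here is the same.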