GRAformer: A gated residual attention transformer for multivariate time series forecasting

Residual · Multivariate statistics · Computer science · Series (stratigraphy) · Artificial intelligence · Transformer · Pattern recognition (psychology) · Machine learning · Econometrics · Statistics · Data mining · Mathematics · Algorithm · Engineering · Voltage · Electrical engineering · Geology · Paleontology
Authors
Chong Yang, Yutian Wang, Bin Yang, Jun Chen
Source
Journal: Neurocomputing [Elsevier]
Volume/Issue: 581, Article 127466
Identifier
DOI: 10.1016/j.neucom.2024.127466
Abstract

Recurrent Neural Networks (RNNs), particularly when equipped with output windows (a standard practice in contemporary time series forecasting), have shown proficiency in handling short-term dependencies. Nonetheless, RNNs can struggle to maintain hidden states over extended forecasting horizons, particularly in longer-term predictions, where larger hidden states and longer look-back windows can lead to gradient instability. In contrast, Transformer-based models, with their distinctive architecture designed to encode complex contextual relationships and enable parallel computation, are emerging as a popular alternative in this field. However, current research has mainly focused on modifying attention mechanisms, overlooking opportunities to improve the feedforward layer, which can lead to efficiency limitations. Moreover, prevailing methods often assume complete independence between channels, disregarding distinct features among variables and failing to fully leverage channel-specific information. To address these gaps, we propose an efficient transformer design for multivariate time series prediction. Our approach integrates two key components: (i) a gated residual attention unit that enhances predictive accuracy and computational efficiency, and (ii) a channel embedding technique that differentiates between series and boosts performance. Theoretically, we prove that our model exhibits recurrent dynamics introduced by the RNN layer. Through extensive experiments on real-world data, we demonstrate that our proposed method achieves competitive predictive accuracy compared to prior approaches while running faster than state-of-the-art transformers. Our code, data, and trained models are available at https://github.com/MythosAd/GRAformer.
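The abstract names two components without detailing them: a gated residual attention unit and a per-channel embedding. Below is a minimal PyTorch-style sketch of how such components are commonly built, not the authors' implementation (see the linked repository for that). The class names, the sigmoid gate on the residual path, and all shapes and hyperparameters are illustrative assumptions.

```python
# Minimal sketch, assuming PyTorch. Hypothetical names and design choices;
# not taken from the GRAformer codebase.
import torch
import torch.nn as nn


class GatedResidualAttention(nn.Module):
    """Self-attention whose residual contribution is scaled by a learned gate."""

    def __init__(self, d_model: int, n_heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.gate = nn.Linear(d_model, d_model)  # per-feature gate logits
        self.norm = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        attn_out, _ = self.attn(x, x, x)
        g = torch.sigmoid(self.gate(x))      # gate values in (0, 1)
        return self.norm(x + g * attn_out)   # gated residual connection


class ChannelEmbedding(nn.Module):
    """Adds a learned embedding per channel so series are distinguishable
    rather than treated as fully independent, interchangeable channels."""

    def __init__(self, n_channels: int, d_model: int):
        super().__init__()
        self.channel_emb = nn.Parameter(torch.randn(n_channels, d_model) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, d_model); broadcast embedding over the batch
        return x + self.channel_emb.unsqueeze(0)


# Usage example (shapes are arbitrary):
tokens = torch.randn(8, 7, 64)                      # 8 samples, 7 channels
x = ChannelEmbedding(n_channels=7, d_model=64)(tokens)
y = GatedResidualAttention(d_model=64)(x)           # (8, 7, 64)
```

One appeal of gating the residual path, in general, is that a gate near zero lets a layer behave almost like the identity, which can stabilize training of deep stacks; whether GRAformer uses exactly this formulation is not stated in the abstract.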