GRAformer: A gated residual attention transformer for multivariate time series forecasting

Authors
Chengcao Yang, Yutian Wang, Bin Yang, Jun Chen
Source
Journal: Neurocomputing [Elsevier BV]
Volume 581, Article 127466. Cited by: 9
Identifier
DOI: 10.1016/j.neucom.2024.127466
Abstract

Recurrent Neural Networks (RNNs), particularly when equipped with output windows (a standard practice in contemporary time series forecasting), handle short-term dependencies well. However, RNNs struggle to maintain hidden states over extended forecasting horizons: in longer-term prediction, larger hidden states and extended look-back windows can lead to gradient instability. In contrast, Transformer-based models, whose architecture encodes complex contextual relationships and enables parallel computation, have emerged as a popular alternative in this field. Yet current research has focused mainly on modifying attention mechanisms, overlooking opportunities to improve the feedforward layer and thereby limiting efficiency. Moreover, prevailing methods often assume complete independence between channels, disregarding the distinct characteristics of individual variables and failing to fully exploit channel-specific information. To address these gaps, we propose an efficient transformer design for multivariate time series prediction. Our approach integrates two key components: (i) a gated residual attention unit that improves predictive accuracy and computational efficiency, and (ii) a channel embedding technique that differentiates between series and boosts performance. Theoretically, we prove that our model exhibits the recurrent dynamics introduced by its RNN layer. Extensive experiments on real-world data show that the proposed method achieves predictive accuracy competitive with prior approaches while running faster than state-of-the-art transformers. Our code, data, and trained models are available at https://github.com/MythosAd/GRAformer.
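For readers who want a concrete picture of the two components named in the abstract, the sketch below is a minimal PyTorch rendering based only on that description. The module names, the sigmoid gating form, and the additive channel embedding are assumptions for illustration, not the paper's actual implementation (see the linked repository for that).

import torch
import torch.nn as nn


class GatedResidualAttention(nn.Module):
    """Sketch of a gated residual attention unit: standard multi-head
    self-attention whose contribution to the residual stream is scaled
    by a learned sigmoid gate (an assumed design, not the paper's)."""

    def __init__(self, d_model: int, n_heads: int, dropout: float = 0.1):
        super().__init__()
        self.attn = nn.MultiheadAttention(
            d_model, n_heads, dropout=dropout, batch_first=True
        )
        self.gate = nn.Linear(d_model, d_model)  # hypothetical gating projection
        self.norm = nn.LayerNorm(d_model)
        self.dropout = nn.Dropout(dropout)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        attn_out, _ = self.attn(x, x, x, need_weights=False)
        g = torch.sigmoid(self.gate(x))  # per-feature gate values in (0, 1)
        # The gate controls how much attention output enters the residual path.
        return self.norm(x + self.dropout(g * attn_out))


class ChannelEmbedding(nn.Module):
    """Sketch of a channel embedding: a learned vector per input series,
    added to that series' tokens so channels remain distinguishable."""

    def __init__(self, n_channels: int, d_model: int):
        super().__init__()
        self.emb = nn.Embedding(n_channels, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_channels, seq_len, d_model) -- tokens per channel
        idx = torch.arange(x.size(1), device=x.device)
        return x + self.emb(idx)[None, :, None, :]


if __name__ == "__main__":
    layer = GatedResidualAttention(d_model=64, n_heads=4)
    ce = ChannelEmbedding(n_channels=7, d_model=64)
    tokens = torch.randn(8, 7, 96, 64)   # batch, channels, tokens, dim
    tokens = ce(tokens)                  # inject channel identity
    out = layer(tokens.flatten(0, 1))    # attend within each channel
    print(out.shape)                     # torch.Size([56, 96, 64])

One plausible reading of "gated residual attention" is exactly this: the gate lets the model learn, per feature, how much attended context to mix into the residual stream. Likewise, a per-channel embedding gives each variable a learned identity, so channels are no longer treated as interchangeable even when processed independently.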