InParformer: Evolutionary Decomposition Transformers with Interactive Parallel Attention for Long-Term Time Series Forecasting

Authors
Haizhou Cao, Zhenhao Huang, Tiechui Yao, Jue Wang, Hong He, Yangang Wang
Source
Journal: Proceedings of the AAAI Conference on Artificial Intelligence [Association for the Advancement of Artificial Intelligence (AAAI)]
Volume/Issue: 37 (6): 6906-6915
Identifier
DOI: 10.1609/aaai.v37i6.25845
Abstract

Long-term time series forecasting (LTSF) provides substantial benefits for numerous real-world applications, while placing essential demands on a model's capacity to capture long-range dependencies. Recent Transformer-based models have significantly improved LTSF performance. It is worth noting that the Transformer, with its self-attention mechanism, was originally proposed to model language sequences, whose tokens (i.e., words) are discrete and highly semantic. Unlike language sequences, however, most time series are sequences of continuous numeric points. Time steps with temporal redundancy are weakly semantic, and leveraging time-domain tokens alone can hardly depict the overall properties of a time series (e.g., its overall trend and periodic variations). To address these problems, we propose a novel Transformer-based forecasting model named InParformer with an Interactive Parallel Attention (InPar Attention) mechanism. InPar Attention is designed to learn long-range dependencies comprehensively in both the frequency and time domains. To improve its learning capacity and efficiency, we further design several mechanisms, including query selection, key-value pair compression, and recombination. Moreover, InParformer is built with evolutionary seasonal-trend decomposition modules to enhance the extraction of intricate temporal patterns. Extensive experiments on six real-world benchmarks show that InParformer outperforms state-of-the-art forecasting Transformers.
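The abstract describes two ideas concretely enough to illustrate: attention applied in parallel over time-domain tokens and frequency-domain components, and a seasonal-trend decomposition of the input series. The PyTorch sketch below is a minimal illustration of those two general ideas only; it is not the authors' implementation, and it omits the query selection, key-value pair compression, recombination, and evolutionary decomposition mechanisms, for which the abstract gives no detail. The module names (SeriesDecomposition, ParallelTimeFreqAttention), the moving-average decomposition (a common Autoformer-style construction), and the concatenate-and-project fusion are all assumptions made here for illustration.

```python
# Hypothetical sketch of two ideas named in the abstract: parallel time/frequency
# attention and seasonal-trend decomposition. Not the authors' released code.
import torch
import torch.nn as nn


class SeriesDecomposition(nn.Module):
    """Split a series into trend (moving average) and seasonal (residual) parts."""

    def __init__(self, kernel_size: int):
        super().__init__()
        self.kernel_size = kernel_size
        self.avg = nn.AvgPool1d(kernel_size, stride=1, padding=0)

    def forward(self, x: torch.Tensor):
        # x: (batch, length, channels). Pad both ends by replicating the edge
        # values so the moving average preserves the original length.
        front = x[:, :1, :].repeat(1, (self.kernel_size - 1) // 2, 1)
        back = x[:, -1:, :].repeat(1, self.kernel_size // 2, 1)
        padded = torch.cat([front, x, back], dim=1)
        trend = self.avg(padded.transpose(1, 2)).transpose(1, 2)
        seasonal = x - trend
        return seasonal, trend


class ParallelTimeFreqAttention(nn.Module):
    """Attend over time-domain tokens and, in parallel, over frequency components
    obtained with an rFFT, then fuse the two branches with a linear projection."""

    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.time_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.freq_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, d_model)
        t_out, _ = self.time_attn(x, x, x)

        # Frequency branch: treat the real/imag parts of the rFFT along the time
        # axis as attention tokens.
        spec = torch.fft.rfft(x, dim=1)                       # complex (batch, L//2+1, d)
        spec_feat = torch.cat([spec.real, spec.imag], dim=1)  # real (batch, 2*(L//2+1), d)
        f_out, _ = self.freq_attn(spec_feat, spec_feat, spec_feat)

        # Reassemble complex components and map back to the time domain.
        half = spec.shape[1]
        f_complex = torch.complex(f_out[:, :half, :], f_out[:, half:, :])
        f_time = torch.fft.irfft(f_complex, n=x.shape[1], dim=1)

        return self.fuse(torch.cat([t_out, f_time], dim=-1))


if __name__ == "__main__":
    x = torch.randn(8, 96, 64)                    # (batch, length, d_model)
    seasonal, trend = SeriesDecomposition(25)(x)  # kernel size 25 is arbitrary here
    out = ParallelTimeFreqAttention(d_model=64, n_heads=4)(seasonal)
    print(out.shape)                              # torch.Size([8, 96, 64])
```

The frequency branch above is one plausible way to attend over spectral components (related constructions appear in FEDformer-style models); the paper's actual InPar Attention, including how the two branches interact, may differ in detail.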