Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting

Keywords: Transformer; computer science; encoder; dependency (UML); sequence (biology); algorithm; artificial intelligence; engineering; voltage; genetics; biology; operating system; electrical engineering
Authors
Haoyi Zhou, Shanghang Zhang, Jieqi Peng, Shuai Zhang, Jianxin Li, Hui Xiong, Wancai Zhang
Source
Journal: Proceedings of the ... AAAI Conference on Artificial Intelligence [Association for the Advancement of Artificial Intelligence (AAAI)]
Volume/Issue: 35 (12): 11106-11115 · Citations: 3759
Identifier
DOI: 10.1609/aaai.v35i12.17325
Abstract

Many real-world applications require the prediction of long sequence time-series, such as electricity consumption planning. Long sequence time-series forecasting (LSTF) demands a high prediction capacity of the model, i.e., the ability to efficiently capture precise long-range dependency coupling between output and input. Recent studies have shown the potential of the Transformer to increase prediction capacity. However, several severe issues prevent the Transformer from being directly applicable to LSTF, including quadratic time complexity, high memory usage, and an inherent limitation of the encoder-decoder architecture. To address these issues, we design an efficient Transformer-based model for LSTF, named Informer, with three distinctive characteristics: (i) a ProbSparse self-attention mechanism, which achieves O(L log L) time complexity and memory usage and has comparable performance on sequences' dependency alignment; (ii) self-attention distilling, which highlights dominating attention by halving the cascading layer input and efficiently handles extremely long input sequences; (iii) a generative-style decoder, which, while conceptually simple, predicts long time-series sequences in one forward operation rather than step by step, drastically improving the inference speed of long-sequence predictions. Extensive experiments on four large-scale datasets demonstrate that Informer significantly outperforms existing methods and provides a new solution to the LSTF problem.
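The abstract's headline mechanism, ProbSparse self-attention, can be made concrete with a short sketch: score each query's "sparsity" on a small random sample of keys, keep only the top u ≈ c·ln L most active queries for full attention, and let the remaining "lazy" queries fall back to a trivial output (here, the mean of V), which brings the cost down from O(L²) to O(L log L). The PyTorch code below is a minimal illustration under those assumptions, not the authors' released implementation; the function name `probsparse_attention`, the single-head shapes, and the sampling constant `factor` are illustrative choices.

```python
# Minimal ProbSparse self-attention sketch (single head, no masking).
# This is an illustrative reconstruction, not the authors' code.
import math
import torch

def probsparse_attention(Q, K, V, factor=5):
    """Attend with only the top-u 'active' queries.

    Q, K, V: (batch, length, d_model) tensors.
    factor:  sampling constant c, so u = c * ceil(ln L) queries are kept.
    """
    B, L, D = Q.shape
    u = min(L, max(1, factor * math.ceil(math.log(L))))         # queries kept
    sample_k = min(L, max(1, factor * math.ceil(math.log(L))))  # keys sampled

    # 1. Estimate each query's sparsity on a random key subset:
    #    M(q, K) ~ max_j(q k_j^T / sqrt(d)) - mean_j(q k_j^T / sqrt(d)).
    idx = torch.randint(0, L, (sample_k,))
    scores_sample = Q @ K[:, idx, :].transpose(-2, -1) / math.sqrt(D)
    sparsity = scores_sample.max(dim=-1).values - scores_sample.mean(dim=-1)

    # 2. Keep only the u queries with the largest measurement.
    top = sparsity.topk(u, dim=-1).indices                       # (B, u)
    Q_reduced = Q.gather(1, top.unsqueeze(-1).expand(-1, -1, D)) # (B, u, D)

    # 3. Full attention for the active queries only: O(u * L) = O(L log L).
    scores = Q_reduced @ K.transpose(-2, -1) / math.sqrt(D)      # (B, u, L)
    attn_out = scores.softmax(dim=-1) @ V                        # (B, u, D)

    # 4. Lazy queries fall back to the mean of V (a trivial attention
    #    distribution); the active rows are then scattered back in place.
    out = V.mean(dim=1, keepdim=True).expand(-1, L, -1).clone()
    out.scatter_(1, top.unsqueeze(-1).expand(-1, -1, D), attn_out)
    return out
```

Called as `probsparse_attention(x, x, x)` on a `(batch, length, d_model)` tensor, it returns a tensor of the same shape; only the selected queries receive a genuine attention distribution. The self-attention distilling step from characteristic (ii) is even simpler to sketch: a 1-D convolution followed by max-pooling halves the sequence length between encoder layers. The kernel sizes below are assumptions, not the paper's exact hyperparameters.

```python
import torch.nn as nn

class DistillLayer(nn.Module):
    """Halve the sequence length between encoder layers (illustrative)."""
    def __init__(self, d_model):
        super().__init__()
        self.conv = nn.Conv1d(d_model, d_model, kernel_size=3, padding=1)
        self.act = nn.ELU()
        self.pool = nn.MaxPool1d(kernel_size=3, stride=2, padding=1)

    def forward(self, x):          # x: (batch, length, d_model)
        x = x.transpose(1, 2)      # Conv1d expects (batch, channels, length)
        x = self.pool(self.act(self.conv(x)))
        return x.transpose(1, 2)   # (batch, ~length/2, d_model)
```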