Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting

Authors
Haoyi Zhou, Shanghang Zhang, Jieqi Peng, Shuai Zhang, Jianxin Li, Hui Xiong, Wancai Zhang
Source
Journal: Proceedings of the AAAI Conference on Artificial Intelligence [Association for the Advancement of Artificial Intelligence (AAAI)]
Volume/Issue: 35 (12): 11106-11115 · Cited by: 5425
Identifier
DOI: 10.1609/aaai.v35i12.17325
Abstract

Many real-world applications require the prediction of long sequence time-series, such as electricity consumption planning. Long sequence time-series forecasting (LSTF) demands a high prediction capacity of the model, which is the ability to capture precise long-range dependency coupling between output and input efficiently. Recent studies have shown the potential of Transformer to increase the prediction capacity. However, there are several severe issues with Transformer that prevent it from being directly applicable to LSTF, including quadratic time complexity, high memory usage, and the inherent limitation of the encoder-decoder architecture. To address these issues, we design an efficient Transformer-based model for LSTF, named Informer, with three distinctive characteristics: (i) a ProbSparse self-attention mechanism, which achieves O(L log L) time complexity and memory usage while delivering comparable performance on sequence dependency alignment; (ii) self-attention distilling, which highlights dominating attention by halving each cascading layer's input and efficiently handles extremely long input sequences; (iii) a generative-style decoder that, while conceptually simple, predicts long time-series sequences in one forward operation rather than step by step, drastically improving the inference speed of long-sequence predictions. Extensive experiments on four large-scale datasets demonstrate that Informer significantly outperforms existing methods and provides a new solution to the LSTF problem.
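Since the abstract's central efficiency claim is the O(L log L) ProbSparse self-attention, a toy sketch may make the idea concrete. The NumPy snippet below only illustrates the mechanism as described above (cheaply score each query on a sampled subset of keys, run full attention for the top-u "active" queries, and let the remaining "lazy" queries fall back to the mean of V); the function name, the log-scale sampling defaults, and the mean-of-V fallback are assumptions for this sketch, not the authors' released implementation.

```python
import numpy as np

def probsparse_attention(Q, K, V, u=None, sample_k=None):
    """Simplified sketch of a ProbSparse-style self-attention step.

    Q, K, V: arrays of shape (L, d). Only the top-u queries (ranked by a
    max-minus-mean score computed on a sampled subset of keys) receive full
    attention; the remaining queries reuse the mean of V as their context.
    With u and sample_k on the order of log L, the dominant cost is O(L log L).
    """
    L, d = Q.shape
    if u is None:
        u = max(1, int(np.ceil(np.log(L))))          # number of "active" queries
    if sample_k is None:
        sample_k = max(1, int(np.ceil(np.log(L))))   # keys sampled per query

    # Cheap sparsity score: how peaked each query's attention distribution is,
    # estimated on a random subset of keys.
    idx = np.random.choice(L, size=sample_k, replace=False)
    scores_sample = Q @ K[idx].T / np.sqrt(d)        # (L, sample_k)
    sparsity = scores_sample.max(axis=1) - scores_sample.mean(axis=1)

    # Full attention only for the u most active queries.
    top = np.argsort(-sparsity)[:u]
    scores = Q[top] @ K.T / np.sqrt(d)               # (u, L)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)

    # Lazy queries fall back to a coarse summary of V (assumption for this sketch).
    out = np.tile(V.mean(axis=0), (L, 1))
    out[top] = weights @ V
    return out

# Example usage with an input length of 96 and model dimension 64:
# ctx = probsparse_attention(np.random.randn(96, 64),
#                            np.random.randn(96, 64),
#                            np.random.randn(96, 64))
```

With L = 96 and u ≈ log L, only a handful of queries attend over all keys, which is where the claimed O(L log L) cost comes from; the real model stacks this with the distilling operation and the one-shot generative decoder described in the abstract.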