MEAformer: An all-MLP transformer with temporal external attention for long-term time series forecasting

Authors
Siyuan Huang, Yepeng Liu, Haoyi Cui, Fan Zhang, Jinjiang Li, Xiaofeng Zhang, Mingli Zhang, Caiming Zhang
Source
Journal: Information Sciences [Elsevier BV]
Volume/Issue: 669, Article 120605 · Citations: 5
Identifier
DOI: 10.1016/j.ins.2024.120605
Abstract

Transformer-based models have significantly improved performance in Long-term Time Series Forecasting (LTSF). These models employ various self-attention mechanisms to discover long-term dependencies. However, they are hampered by the quadratic cost and inherent permutation invariance of self-attention, and they focus primarily on relationships within a sequence while neglecting potential relationships between different sample sequences. This limits the ability and flexibility of self-attention in LTSF. In addition, the Transformer decoder outputs sequences autoregressively, leading to slow inference and error accumulation, especially over long horizons. To address these issues, we propose MEAformer, a model better suited to LTSF. MEAformer adopts a fully connected Multi-Layer Perceptron (MLP) architecture consisting of two types of layers: encoder layers and MLP layers. Unlike the encoder layers of most Transformer-based models, MEAformer replaces self-attention with temporal external attention, which explores potential relationships between different sample sequences in the training dataset. Compared to the quadratic complexity of self-attention, temporal external attention has efficient linear complexity. Encoder layers can be stacked to capture time-dependent relationships at different scales. Furthermore, MEAformer replaces the Transformer's intricate decoder with simpler MLP layers, which improves inference speed and generates the entire output sequence in a single pass, effectively mitigating error accumulation. MEAformer achieves state-of-the-art performance on six benchmark datasets covering five real-world domains: energy, transportation, economy, weather, and disease. Code is available at: https://github.com/huangsiyuan924/MEAformer.
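
The abstract describes two ideas that are easy to miss in prose: attention against a small learnable external memory (linear in sequence length, shared across samples) and a one-shot MLP head that emits the whole horizon instead of decoding step by step. Below is a minimal PyTorch sketch of both, not the authors' implementation (that is in the linked repository); class names, `mem_units`, the double-normalization step, and the layer layout are illustrative assumptions based on the general external-attention literature.

```python
import torch
import torch.nn as nn

class TemporalExternalAttention(nn.Module):
    """Queries attend to a small learnable external memory shared across
    all samples: O(L * S) cost for sequence length L and S memory slots,
    instead of the O(L^2) of self-attention."""
    def __init__(self, d_model: int, mem_units: int = 64):
        super().__init__()
        self.mk = nn.Linear(d_model, mem_units, bias=False)  # external "key" memory
        self.mv = nn.Linear(mem_units, d_model, bias=False)  # external "value" memory

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        attn = torch.softmax(self.mk(x), dim=-1)              # normalize over memory slots
        attn = attn / (attn.sum(dim=1, keepdim=True) + 1e-9)  # double normalization over time
        return self.mv(attn)                                  # (batch, seq_len, d_model)

class EncoderLayer(nn.Module):
    """Standard pre/post-norm residual block with external attention in
    place of self-attention; stack several to mix scales."""
    def __init__(self, d_model: int, mem_units: int = 64):
        super().__init__()
        self.attn = TemporalExternalAttention(d_model, mem_units)
        self.norm1 = nn.LayerNorm(d_model)
        self.ff = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                nn.Linear(4 * d_model, d_model))
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.norm1(x + self.attn(x))
        return self.norm2(x + self.ff(x))

class MEAformerSketch(nn.Module):
    """Stacked encoder layers plus a single-pass MLP head that emits the
    entire forecast horizon at once (no autoregressive decoding)."""
    def __init__(self, seq_len: int, horizon: int, n_vars: int,
                 d_model: int = 128, n_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(n_vars, d_model)
        self.layers = nn.ModuleList(EncoderLayer(d_model) for _ in range(n_layers))
        self.head = nn.Linear(seq_len * d_model, horizon * n_vars)
        self.horizon, self.n_vars = horizon, n_vars

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, n_vars) -> (batch, horizon, n_vars)
        h = self.embed(x)
        for layer in self.layers:
            h = layer(h)
        return self.head(h.flatten(1)).view(-1, self.horizon, self.n_vars)

# Usage: a 96-step lookback forecasting 336 steps of 7 variables in one pass.
model = MEAformerSketch(seq_len=96, horizon=336, n_vars=7)
y = model(torch.randn(32, 96, 7))   # -> shape (32, 336, 7)
```

Because the head maps the flattened encoder output directly to all `horizon` steps, no prediction is fed back as input, which is what removes the error accumulation of step-by-step decoding; the external memory is the only place attention weights live, so cost grows linearly with the lookback window.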