Computer science
Transformer
Data mining
Artificial intelligence
Series (stratigraphy)
Decoding methods
Time series
Pattern recognition (psychology)
Algorithm
Machine learning
Engineering
Biology
Electrical engineering
Paleontology
Voltage
Authors
Dazhao Du,Bing Su,Zhewei Wei
Identifier
DOI: 10.1109/icassp49357.2023.10096881
Abstract
In long-term time series forecasting, most Transformer-based methods adopt the standard point-wise attention mechanism, which not only has high complexity but also cannot explicitly capture predictive dependencies from contexts, since the corresponding key and value are transformed from the same point. This paper proposes a predictive Transformer-based model called Preformer. Preformer introduces a novel, efficient Multi-Scale Segment-Correlation mechanism that divides time series into segments and uses segment-wise correlation-based attention in place of point-wise attention. A multi-scale structure aggregates dependencies at different temporal scales and facilitates the selection of segment length. Preformer further designs a predictive paradigm for decoding, where the key and value come from two successive segments rather than the same segment. Experiments demonstrate that Preformer outperforms other Transformer-based models. The code is available at https://github.com/ddz16/Preformer.
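To make the mechanism concrete, here is a minimal PyTorch sketch of segment-wise correlation attention with the predictive key/value shift described in the abstract. It is an illustration of the idea, not the authors' implementation (see the linked repository for that): the scaled dot product between whole segments stands in for the paper's correlation measure, and the successor pairing of keys and values follows the "two successive segments" description.

```python
import torch
import torch.nn.functional as F

def segment_correlation_attention(q, k, v, seg_len):
    # q, k, v: (batch, length, d_model); length must be divisible by seg_len.
    B, L, D = q.shape
    n = L // seg_len

    # Fold point sequences into segments.
    qs = q.reshape(B, n, seg_len * D)          # query segments
    ks = k.reshape(B, n, seg_len * D)          # key segments
    vs = v.reshape(B, n, seg_len, D)           # value segments

    # Predictive pairing (an assumption based on the abstract): the value
    # attached to key segment i is the *next* segment i+1, so attention
    # retrieves what followed similar past contexts. The last key segment
    # has no successor and is dropped.
    ks, vs = ks[:, :-1], vs[:, 1:]

    # Segment-wise correlation scores: scaled dot products between whole
    # segments rather than individual points.
    scores = torch.einsum("bqd,bkd->bqk", qs, ks) / (seg_len * D) ** 0.5
    attn = F.softmax(scores, dim=-1)           # (B, n, n-1)

    # Weighted sum of successor value segments, unfolded back to points.
    out = torch.einsum("bqk,bksd->bqsd", attn, vs)
    return out.reshape(B, L, D)
```

A possible multi-scale usage, where averaging the outputs over several hypothetical segment lengths stands in for the paper's multi-scale aggregation structure:

```python
x = torch.randn(2, 96, 64)                     # (batch, length, d_model)
scales = (8, 16, 32)                           # hypothetical segment lengths
y = sum(segment_correlation_attention(x, x, x, s) for s in scales) / len(scales)
```

Note the complexity benefit the abstract alludes to: attention is computed over n = L / seg_len segments instead of L points, so the score matrix shrinks by a factor of roughly seg_len squared.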