Topics
Autoregressive model, Computer science, Probabilistic logic, Inference, Time series, Artificial intelligence, Transformer, State-space representation, Data mining, Machine learning, Mathematics, Algorithm, Econometrics, Physics, Quantum mechanics, Voltage
Authors
Junlong Tong, Liping Xie, Wankou Yang, Kanjian Zhang, Junsheng Zhao
Identifier
DOI: 10.1016/j.ins.2023.119410
Abstract
Time series forecasting is crucial for several fields, such as disaster warning, weather prediction, and energy consumption. Transformer-based models are considered to have revolutionized the field of time series forecasting. However, the autoregressive form of the Transformer gives rise to cumulative errors in the inference stage. Furthermore, the complex temporal patterns of time series make it harder for models to mine reliable temporal dependencies. In this paper, we propose a hierarchical Transformer with probabilistic decomposition representation, which provides a flexible framework for hierarchical and decomposable time series forecasts. The hierarchical mechanism uses the forecasting results of the Transformer as conditional information for a generative model, performing sequence-level forecasts to approximate the ground truth, which mitigates the cumulative error of the autoregressive Transformer. In addition, the conditional generative model encodes historical and predictive information into the latent space and reconstructs typical patterns from the latent space, including seasonality and trend terms. This process provides a flexible framework for separating complex patterns through the interaction of information in the latent space. Extensive experiments on several datasets demonstrate the effectiveness and robustness of the model, indicating that it compares favorably with the state of the art.
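The abstract's key motivation — that autoregressive inference accumulates error while sequence-level forecasting does not feed predictions back into the model — can be illustrated with a toy experiment. The sketch below is not the paper's model; it is a minimal numpy example (all data and the two-lag least-squares predictor are assumptions for illustration) contrasting iterative rollout with direct multi-step forecasting.

```python
import numpy as np

# Toy illustration (not the paper's model): iterative autoregressive rollout
# feeds its own predictions back as inputs, so one-step errors compound,
# whereas a direct (sequence-level) forecaster maps the observed history to
# each horizon step without feedback.
rng = np.random.default_rng(0)
t = np.arange(200)
series = np.sin(2 * np.pi * t / 24) + 0.1 * rng.standard_normal(200)

train, horizon = series[:150], 24

# One-step predictor: next value from the previous two values, by least squares.
X = np.stack([train[:-2], train[1:-1]], axis=1)
y = train[2:]
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# Autoregressive rollout: each prediction becomes an input for the next step.
window = list(train[-2:])
iterative = []
for _ in range(horizon):
    nxt = float(w @ np.array(window[-2:]))
    iterative.append(nxt)
    window.append(nxt)

# Direct multi-step: a separate least-squares map per horizon step, always
# conditioned on real observations (a stand-in for sequence-level forecasting).
direct = []
for h in range(1, horizon + 1):
    Xh = np.stack([train[:-h - 1], train[1:-h]], axis=1)
    yh = train[h + 1:]
    wh, *_ = np.linalg.lstsq(Xh, yh, rcond=None)
    direct.append(float(wh @ train[-2:]))

truth = series[150:150 + horizon]
mse_iter = float(np.mean((np.array(iterative) - truth) ** 2))
mse_direct = float(np.mean((np.array(direct) - truth) ** 2))
```

On this toy series both forecasters are reasonable; the point of the sketch is structural — only the rollout loop reuses its own outputs, which is the error-feedback path the paper's hierarchical, sequence-level mechanism is designed to avoid.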