Computer science
Artificial intelligence
Machine learning
Leverage (statistics)
Transformer
Autoencoder
Time series
Feature learning
Raw data
Multivariate statistics
Data mining
Pattern recognition (psychology)
Deep learning
Engineering
Programming language
Electrical engineering
Voltage
Authors
Zhe Li, Zhongwen Rao, Lujia Pan, Pengyun Wang, Zenglin Xu
Source
Journal: Cornell University - arXiv
Date: 2023-01-21
Citations: 14
Identifier
DOI: 10.48550/arxiv.2301.08871
Abstract
Multivariate time series forecasting has become an increasingly popular topic across many applications and scenarios. Recently, contrastive learning and Transformer-based models have achieved good performance on many long-term series forecasting tasks. However, several issues remain in existing methods. First, the training paradigm of contrastive learning is inconsistent with downstream prediction tasks, leading to inaccurate prediction results. Second, existing Transformer-based models, which rely on similar patterns in historical time series data to predict future values, tend to induce severe distribution shift and do not fully leverage the sequence information compared to self-supervised methods. To address these issues, we propose a novel framework named Ti-MAE, in which the input time series are assumed to follow an integrated distribution. In detail, Ti-MAE randomly masks out embedded time series data and learns an autoencoder to reconstruct them at the point level. Ti-MAE adopts mask modeling (rather than contrastive learning) as the auxiliary task and bridges existing representation learning and generative Transformer-based methods, reducing the gap between upstream and downstream forecasting tasks while preserving the utilization of the original time series data. Experiments on several public real-world datasets demonstrate that our masked autoencoding framework can learn strong representations directly from raw data, yielding better performance in time series forecasting and classification tasks.
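As a rough illustration of the point-level mask-and-reconstruct idea the abstract describes, below is a minimal sketch in PyTorch. The class name TiMAESketch, the helper logic, the layer sizes, and the 75% mask ratio are illustrative assumptions for a generic masked time-series autoencoder, not the authors' released implementation.

```python
# A minimal sketch of point-level masked time-series autoencoding in the
# spirit of Ti-MAE. All hyperparameters and names here are assumptions;
# positional encodings (which a real implementation would add before
# masking) are omitted for brevity.
import torch
import torch.nn as nn


class TiMAESketch(nn.Module):
    def __init__(self, n_vars, d_model=64, n_heads=4, depth=2, mask_ratio=0.75):
        super().__init__()
        self.mask_ratio = mask_ratio
        self.embed = nn.Linear(n_vars, d_model)           # point-wise embedding
        self.mask_token = nn.Parameter(torch.zeros(1, 1, d_model))
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True), depth)
        self.decoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True), 1)
        self.head = nn.Linear(d_model, n_vars)            # reconstruct raw points

    def forward(self, x):                                 # x: (batch, time, n_vars)
        tokens = self.embed(x)
        b, t, d = tokens.shape
        n_keep = int(t * (1 - self.mask_ratio))
        # sample a random subset of time points to keep visible
        perm = torch.rand(b, t, device=x.device).argsort(dim=1)
        keep = perm[:, :n_keep]
        visible = torch.gather(tokens, 1, keep.unsqueeze(-1).expand(-1, -1, d))
        encoded = self.encoder(visible)
        # scatter encoded tokens back; fill masked slots with the mask token
        full = self.mask_token.expand(b, t, d).clone()
        full.scatter_(1, keep.unsqueeze(-1).expand(-1, -1, d), encoded)
        recon = self.head(self.decoder(full))
        # reconstruction loss on masked positions only
        masked = torch.ones(b, t, device=x.device, dtype=torch.bool)
        masked.scatter_(1, keep, False)
        loss = ((recon - x) ** 2)[masked].mean()
        return loss, recon
```

In this sketch, the high mask ratio forces the encoder to infer temporal structure rather than interpolate between neighboring points, and because the auxiliary task is reconstruction of raw values (mask modeling rather than contrastive pairs), the pretraining objective stays close to the downstream forecasting task.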