Multivariate statistics
Computer science
Unsupervised learning
Regression
Time series
Feature learning
Reuse
Artificial intelligence
Data mining
Transformer
Machine learning
Engineering
Statistics
Electrical engineering
Mathematics
Voltage
Waste management
Authors
George Zerveas, Srideepika Jayaraman, Dhaval Patel, Anuradha Bhamidipaty, Carsten Eickhoff
Source
Venue: Knowledge Discovery and Data Mining
Date: 2021-08-12
Citations: 458
Identifier
DOI: 10.1145/3447548.3467401
Abstract
We present a novel framework for multivariate time series representation learning based on the transformer encoder architecture. The framework includes an unsupervised pre-training scheme, which can offer substantial performance benefits over fully supervised learning on downstream tasks, both with but even without leveraging additional unlabeled data, i.e., by reusing the existing data samples. Evaluating our framework on several public multivariate time series datasets from various domains and with diverse characteristics, we demonstrate that it performs significantly better than the best currently available methods for regression and classification, even for datasets which consist of only a few hundred training samples. Given the pronounced interest in unsupervised learning for nearly all domains in the sciences and in industry, these findings represent an important landmark, presenting the first unsupervised method shown to push the limits of state-of-the-art performance for multivariate time series regression and classification.
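The unsupervised pre-training scheme described in the abstract hides portions of each multivariate series and trains the transformer encoder to reconstruct them, which is how existing samples can be reused without extra unlabeled data. A minimal NumPy sketch of such a masked-value objective follows; the geometric segment lengths, masking ratio, and the zero-filled placeholder standing in for the encoder are illustrative assumptions, not the authors' exact configuration:

```python
import numpy as np

def geometric_mask(length, n_vars, masking_ratio=0.15, mean_mask_len=3, rng=None):
    """Sample a boolean mask per variable: True means the value is hidden
    from the model. Masked segment lengths follow a geometric distribution
    (the specific parameters here are illustrative assumptions)."""
    rng = rng or np.random.default_rng(0)
    mask = np.zeros((length, n_vars), dtype=bool)
    for v in range(n_vars):
        t = 0
        while t < length:
            if rng.random() < masking_ratio:
                seg = rng.geometric(1.0 / mean_mask_len)  # segment length >= 1
                mask[t:t + seg, v] = True
                t += seg
            else:
                t += 1
    return mask

def masked_mse(x, x_hat, mask):
    """Pre-training loss: mean squared error computed only on the masked
    positions, so the encoder is trained to impute the hidden values."""
    return ((x - x_hat) ** 2)[mask].mean()

# Toy example. A real setup would feed x_in through the transformer encoder;
# here a zero array stands in for the encoder's reconstruction.
x = np.random.default_rng(1).standard_normal((100, 6))  # (time steps, variables)
mask = geometric_mask(*x.shape)
x_in = np.where(mask, 0.0, x)                  # masked input to the encoder
loss = masked_mse(x, np.zeros_like(x), mask)   # reconstruction vs. target
```

After pre-training with this objective, the encoder's representations can be fine-tuned for the downstream regression and classification tasks the abstract evaluates.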