Computer science
Anomaly detection
Transformer
Encoder
Time series
Convolutional neural network
Timestamp
Pattern recognition
Algorithm
Benchmark
Convolution
Anomaly
Artificial intelligence
Data mining
Artificial neural network
Machine learning
Real-time computing
Authors
Jina Kim,Hyeongwon Kang,Pilsung Kang
Identifier
DOI:10.1016/j.engappai.2023.105964
Abstract
Time-series anomaly detection is the task of detecting data that do not follow the normal data distribution among continuously collected data. It is used for system maintenance in various industries; hence, studies on time-series anomaly detection are being carried out actively. Most methodologies are based on Long Short-Term Memory (LSTM) and Convolutional Neural Network (CNN) architectures to model the temporal structure of time-series data. In this study, we propose an unsupervised, prediction-based time-series anomaly detection methodology using the Transformer, which outperforms LSTM and CNN in learning dynamic patterns of sequential data through its self-attention mechanism. The prediction model consists of an encoder comprising multiple Transformer encoder layers and a decoder that includes a 1D convolution layer. The output representation of each Transformer layer is accumulated in the encoder to obtain a representation with multi-level, rich information, and the decoder fuses this representation through a 1D convolution operation. Consequently, the model can make predictions that account for both the global trend and the local variability of the input time series. The anomaly score is defined as the difference between the predicted and the actual value at the corresponding timestamp, under the assumption that the trained model produces predictions that follow the normal data distribution. Finally, data with an anomaly score above a threshold are detected as anomalies. Experiments on benchmark datasets show that the proposed method outperforms the baselines.
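The scoring step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the score is the absolute difference between the prediction and the actual value at each timestamp, and a point is flagged when the score exceeds a threshold. The `pick_threshold` helper (a high quantile of scores on normal validation data) is a common heuristic assumed here; the abstract does not specify how the threshold is chosen.

```python
import numpy as np

def anomaly_scores(actual, predicted):
    """Anomaly score per timestamp: |predicted - actual|,
    following the definition given in the abstract."""
    return np.abs(np.asarray(actual, dtype=float) - np.asarray(predicted, dtype=float))

def pick_threshold(validation_scores, q=0.99):
    """Hypothetical threshold rule (assumption, not from the paper):
    a high quantile of scores computed on normal validation data."""
    return np.quantile(np.asarray(validation_scores, dtype=float), q)

def detect_anomalies(actual, predicted, threshold):
    """Flag timestamps whose anomaly score exceeds the threshold."""
    return anomaly_scores(actual, predicted) > threshold
```

For example, with actual values `[1.0, 1.0, 5.0]` and predictions `[1.1, 0.9, 1.0]`, only the third timestamp has a large score (4.0) and would be flagged under a threshold of 1.0. The prediction model itself (the multi-layer Transformer encoder with a 1D-convolution decoder) would supply the `predicted` array.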