Keywords
Transformer, Process (computing), Deep learning, Computer science, Process engineering, Artificial intelligence, Engineering, Electrical engineering, Voltage, Programming language
Authors
Zhenchao Wei, Xu Ji, Li Zhou, Yagu Dang, Yiyang Dai
Identifier
DOI:10.1016/j.psep.2022.09.039
Abstract
Deep learning is a powerful tool for feature representation, and many methods based on convolutional neural networks (CNNs) and recurrent neural networks (RNNs) have been applied to fault diagnosis for chemical processes. However, unlike attention mechanisms, these networks are inefficient at extracting features with long-term dependencies. The Transformer employs a self-attention mechanism and a sequence-to-sequence model originally designed for natural language processing (NLP), and it has attracted significant attention in recent years due to its great success in NLP fields. Fault diagnosis of a chemical process is a task based on multivariate time series, which resemble text sequences but place a greater focus on long-term dependencies. This paper proposes a modified Transformer model, called Target Transformer, which includes not only a self-attention mechanism but also a target-attention mechanism for chemical process fault diagnosis. The Tennessee Eastman (TE) process was used to evaluate the method's performance.
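To make the idea in the abstract concrete, below is a minimal PyTorch sketch of a Transformer-style classifier for multivariate process windows: a Transformer encoder applies self-attention across time steps, and a learned "target" query then attends over the encoded sequence to pool it into a single vector for fault classification. This is an illustrative assumption of how a target-attention head could work, not the authors' implementation; the layer sizes, the 52 TE process variables, and the 21 fault classes (normal plus 20 faults) are likewise assumptions for the sketch.

```python
import torch
import torch.nn as nn

class TargetAttentionClassifier(nn.Module):
    """Hypothetical sketch: Transformer encoder + target-attention pooling
    for fault classification on multivariate time-series windows."""

    def __init__(self, n_vars=52, d_model=64, n_heads=4, n_layers=2, n_classes=21):
        super().__init__()
        # Project the per-step process variables into the model dimension.
        self.input_proj = nn.Linear(n_vars, d_model)
        # Standard self-attention encoder over the time axis.
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        # Learned target query: one assumed form of "target attention",
        # pooling the whole sequence into a single classification vector.
        self.target_query = nn.Parameter(torch.randn(1, 1, d_model))
        self.target_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, time, n_vars)
        h = self.encoder(self.input_proj(x))             # (batch, time, d_model)
        q = self.target_query.expand(x.size(0), -1, -1)  # (batch, 1, d_model)
        pooled, _ = self.target_attn(q, h, h)            # target attends to all steps
        return self.classifier(pooled.squeeze(1))        # (batch, n_classes) logits

# Toy usage: 8 windows of 100 time steps over 52 process variables.
model = TargetAttentionClassifier()
logits = model(torch.randn(8, 100, 52))
print(logits.shape)  # torch.Size([8, 21])
```

The design choice the sketch highlights is the one the abstract emphasizes: self-attention lets every time step interact with every other step regardless of distance, which is where CNNs and RNNs struggle with long-term dependencies, while the target query gives the classifier a single task-specific summary of the window.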