Keywords
Generalization, transfer learning, computer science, divergence (linguistics), metric (unit), artificial intelligence, machine learning, domain (mathematical analysis), measure (data warehouse), ensemble learning, key (lock), Kullback-Leibler divergence, time series, data modeling, data mining, mathematics, engineering, mathematical analysis, philosophy, database, linguistics, computer security, operations management
Authors
Jilun Tian, Yuchen Jiang, Jiusi Zhang, Shimeng Wu, Hao Luo
Identifier
DOI:10.1109/tim.2023.3273676
Abstract
Data-driven remaining useful life (RUL) prediction is critical for industrial devices. Classic machine learning methods rest on the assumption that the training and test sets are independent and identically distributed (IID), which does not hold under multiple working conditions. Transfer learning is a key technique for relaxing the IID assumption, but it is in turn limited by the need for knowledge of the target-domain data distribution. This paper proposes a novel transfer ensemble learning (TEL) framework that effectively utilizes source-domain information and improves the model's generalization to an unknown target domain. The framework relies mainly on metric learning and adopts the Kullback-Leibler (KL) divergence to measure differences between data distributions. A domain dissimilarity metric is proposed to ensure that sub-models trained on similar datasets have a greater impact on the result. To verify the performance of the framework, a real filtering system from the PHM 2020 competition is used. Meanwhile, a bidirectional long short-term memory (Bi-LSTM) model is used to fully exploit the information in the time-series data. Experimental results show that the proposed TEL-Bi-LSTM method outperforms existing machine learning methods.
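The core idea described above — weighting ensemble sub-models by how dissimilar their source-domain distribution is from the target, measured with KL divergence — can be sketched as follows. This is a minimal illustration under stated assumptions: the function names, the discrete-histogram representation of domains, and the softmax-over-negative-divergence weighting are all assumptions for illustration, not the paper's exact formulation of its domain dissimilarity metric.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for two discrete distributions given as probability lists.

    A small eps guards against log(0) when a bin is empty.
    """
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def ensemble_weights(source_dists, target_dist):
    """Assign each source-domain sub-model a weight that shrinks as the KL
    divergence between the target distribution and that source distribution
    grows, so sub-models from similar domains influence the result more.

    Softmax over the negative divergences is one plausible normalization;
    the paper's actual weighting scheme may differ.
    """
    divs = [kl_divergence(target_dist, s) for s in source_dists]
    scores = [math.exp(-d) for d in divs]
    total = sum(scores)
    return [s / total for s in scores]

def tel_predict(sub_model_preds, weights):
    """Combine sub-model RUL predictions as a weighted average."""
    return sum(w * p for w, p in zip(weights, sub_model_preds))
```

For example, with two source domains where the first matches the target histogram closely and the second does not, `ensemble_weights` gives the first sub-model the larger weight, and `tel_predict` pulls the combined RUL estimate toward that sub-model's output.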