Authors
Yingqin Liang,Wentao Mao,Chao-Szu Wu
Identifier
DOI:10.1177/1748006x231223777
Abstract
Online remaining useful life (RUL) prediction has been addressed with deep transfer learning, but it still faces the following challenges: (1) data collected under actual operating conditions are often incomplete and unlabeled; (2) condition-monitoring data arrive as a stream with unknown distribution; and (3) the distribution of degradation data varies over time. To address these challenges, an unsupervised incremental transfer learning approach with knowledge distillation (KD) is proposed. First, a recursive time-series prediction model is built to generate pseudo labels. Second, an online KD network is constructed to achieve unsupervised domain adaptation. Finally, an incremental updating mechanism is designed within the online KD network to perform online RUL prediction using the pseudo labels. Comparative experiments on the IEEE PHM Challenge 2012 rolling-bearing dataset show that the proposed method is computationally inexpensive and can achieve effective dynamic prediction using only sequentially collected data, providing a practical RUL prediction solution for rotating machinery in open environments.
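The three steps summarized in the abstract (pseudo-label generation by recursive extrapolation, teacher-to-student distillation, and incremental updating on streaming samples) can be illustrated with a deliberately minimal sketch. This is not the authors' implementation: the linear models, the `alpha` blending weight, and all function names below are hypothetical stand-ins chosen only to make the control flow of online KD with pseudo labels concrete.

```python
import random


class LinearModel:
    """Toy stand-in for the teacher/student networks in the paper."""

    def __init__(self, w=0.0, b=0.0):
        self.w, self.b = w, b

    def predict(self, x):
        return self.w * x + self.b

    def sgd_step(self, x, target, lr=0.1):
        # One incremental (online) gradient step on squared error.
        err = self.predict(x) - target
        self.w -= lr * err * x
        self.b -= lr * err


def pseudo_label(history, horizon=1):
    """Recursive one-step extrapolation used as a pseudo label
    when true RUL labels are unavailable (hypothetical rule)."""
    if len(history) < 2:
        return history[-1]
    slope = history[-1] - history[-2]
    return history[-1] + slope * horizon


def online_kd_update(teacher, student, x, history, alpha=0.5, lr=0.1):
    """Blend the (frozen) teacher's output with the pseudo label,
    then update the student incrementally on the streamed sample."""
    target = alpha * teacher.predict(x) + (1 - alpha) * pseudo_label(history)
    student.sgd_step(x, target, lr=lr)
    return target


if __name__ == "__main__":
    random.seed(0)
    teacher = LinearModel(w=2.0, b=1.0)  # pretrained source-domain model
    student = LinearModel()              # online model adapted on the stream
    history = []
    for _ in range(2000):                # sequentially collected data
        x = random.random()
        history.append(teacher.predict(x))
        online_kd_update(teacher, student, x, history, alpha=1.0)
    print(student.w, student.b)          # approaches the teacher's parameters
```

With `alpha=1.0` the student distills purely from the teacher; lowering `alpha` shifts the training signal toward the pseudo labels, which is the regime the paper targets when the source model alone no longer matches the drifting degradation distribution.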