Forgetting
Computer science
Metric (unit)
Distillation
Feature (linguistics)
Fault (geology)
Sample (material)
Artificial intelligence
Machine learning
Process (computing)
Data mining
Pattern recognition (psychology)
Engineering
Philosophy
Linguistics
Operations management
Chemistry
Organic chemistry
Chromatography
Seismology
Geology
Operating system
Authors
Qilang Min, Juanjuan He, Piaoyao Yu, Yanwei Fu
Source
Journal: IEEE Access
[Institute of Electrical and Electronics Engineers]
Date: 2023-01-01
Volume: 11, Pages: 46015-46025
Citations: 1
Identifiers
DOI: 10.1109/access.2023.3274481
Abstract
Incremental learning-based fault diagnosis (IFD) systems are widely used because they can handle continually updated fault data and fault types. However, catastrophic forgetting remains the most pressing challenge facing IFD. This paper proposes an incremental fault diagnosis method based on metric feature distillation (MFD) and an improved sample memory to address this problem. First, metric feature distillation is designed by combining metric learning with feature distillation: it uses a distillation loss and a triplet loss to constrain the network parameters of the old and new tasks to the same feasible region, which effectively alleviates catastrophic forgetting. Then, for scenarios where only a small amount of data can be stored, an improved sample memory strategy called the center and hard sample memory (CAHM) is introduced to further reduce catastrophic forgetting. It better preserves the global information of the data, reducing the forgetting of old-data information that must be retained during training. Experimental results on the CWRU and MFPT datasets verify the effectiveness of the proposed method.
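The two ideas summarized above (a combined feature-distillation plus triplet objective, and an exemplar memory that keeps both class-center samples and hard samples) can be illustrated with a short sketch. The following is a minimal PyTorch-style sketch under stated assumptions: the MSE-based distillation term, the margin, the weighting alpha, the per-class budget, and the helper names mfd_loss and cahm_select are illustrative choices, not the authors' exact formulation.

import torch
import torch.nn.functional as F

def mfd_loss(old_feats, new_feats, anchor, positive, negative,
             margin=1.0, alpha=0.5):
    # Feature distillation: keep the current model's features close to the
    # frozen old model's features so old-task knowledge is retained.
    distill = F.mse_loss(new_feats, old_feats.detach())
    # Triplet (metric-learning) loss: pull embeddings of the same fault type
    # together and push embeddings of different fault types apart.
    triplet = F.triplet_margin_loss(anchor, positive, negative, margin=margin)
    # Weighted combination constrains the old and new tasks jointly.
    return alpha * distill + (1.0 - alpha) * triplet

def cahm_select(feats, labels, per_class=10):
    # CAHM-style exemplar selection (assumed rule): for each class, keep the
    # samples nearest the class center (preserving global information) plus
    # the hardest samples farthest from it.
    n_center = per_class // 2
    n_hard = per_class - n_center
    keep = []
    for c in labels.unique():
        idx = (labels == c).nonzero(as_tuple=True)[0]
        center = feats[idx].mean(dim=0)
        dist = (feats[idx] - center).norm(dim=1)
        order = dist.argsort()
        keep.append(idx[order[:n_center]])   # closest to the class center
        keep.append(idx[order[-n_hard:]])    # hardest (farthest) samples
    return torch.cat(keep)

In use, old_feats would come from the frozen pre-update model and new_feats from the model being trained on the newly added fault classes, while cahm_select would be applied to stored features of the old classes to choose which exemplars to keep before each incremental step.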