Invariant (physics)
Computer science
Artificial intelligence
Pattern recognition (psychology)
Machine learning
Algorithm
Mathematics
Mathematical physics
Programming language
Authors
Zhenling Mo, Zijun Zhang, Qiang Miao, Kwok‐Leung Tsui
Identifier
DOI: 10.1109/TNNLS.2025.3531214
Abstract
Incorrect labels, as well as the discrepancy between training- and test-domain data distributions, can significantly degrade the effectiveness of supervised data-driven models in machine fault diagnosis applications. This challenge can be characterized as the noisy label-domain generalization (NL-DG) problem. In this article, extended invariant risk minimization (EIRM) is developed, which incorporates flat-minima seeking to address the NL-DG challenge. The ability to handle NL-DG is realized by shifting the gradient-penalty base from the dummy classifier to the entire model. EIRM is shown to be closely related to locating a flat minimum, which is crucial for label noise (LN) robustness and model generalization. Analyses of function smoothness and algorithm convergence are offered to understand EIRM from a theoretical perspective. An efficient implementation of EIRM is also developed to construct the fault diagnosis model. The EIRM-based fault diagnosis method is compared with strong benchmarks on multiple NL-DG tasks using actuator and gearbox fault datasets. Results indicate that the EIRM-based method is, on average, more effective than the benchmarks. The code is available at https://github.com/mozhenling/doge-eirm.
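The core idea described in the abstract, penalizing the norm of each environment's risk gradient taken with respect to the entire model's parameters rather than only a dummy classifier, can be illustrated with a minimal sketch. This is not the authors' implementation (that is in the linked repository); it is a toy example using a linear model, squared loss, and a finite-difference optimizer, with all dataset shapes and the penalty weight `lam` chosen for illustration only.

```python
import numpy as np

def risk_and_grad(w, X, y):
    # Mean-squared-error risk of a linear model and its gradient w.r.t. w.
    err = X @ w - y
    risk = np.mean(err ** 2)
    grad = 2.0 * X.T @ err / len(y)
    return risk, grad

def eirm_objective(w, envs, lam):
    # ERM term: average risk across environments.
    # Penalty term: squared norm of each environment's risk gradient taken
    # w.r.t. ALL model parameters (the "entire model" shift the abstract
    # describes), which steers w toward a point that is simultaneously a
    # stationary (flat) point of every environment's risk.
    risks, penalty = [], 0.0
    for X, y in envs:
        r, g = risk_and_grad(w, X, y)
        risks.append(r)
        penalty += np.sum(g ** 2)
    return np.mean(risks) + lam * penalty

# Two toy environments sharing the same labeling rule but with
# different input scales (a crude stand-in for a domain shift).
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0])
envs = []
for scale in (1.0, 1.5):
    X = rng.normal(scale=scale, size=(200, 2))
    y = X @ w_true + rng.normal(scale=0.1, size=200)
    envs.append((X, y))

# Plain gradient descent on the penalized objective, with the objective's
# gradient approximated by central finite differences for simplicity.
w = np.zeros(2)
for _ in range(500):
    grad = np.zeros_like(w)
    for i in range(2):
        e = np.zeros(2)
        e[i] = 1e-5
        grad[i] = (eirm_objective(w + e, envs, lam=0.1)
                   - eirm_objective(w - e, envs, lam=0.1)) / 2e-5
    w -= 0.05 * grad

print(np.round(w, 2))  # close to w_true, the invariant predictor
```

In a neural-network setting the finite differences would be replaced by automatic differentiation, and the penalty gradient requires differentiating through the per-environment gradient norm (a second-order term), which is where the paper's flat-minima interpretation comes from.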