Marginal distribution
Conditional probability distribution
Joint probability distribution
Generalization
Computer science
Fault (geology)
Joint (building)
Transfer of learning
Artificial intelligence
Similarity (geometry)
Pattern recognition (psychology)
Distribution (mathematics)
Data mining
Domain (mathematical analysis)
Machine learning
Algorithm
Mathematics
Random variable
Statistics
Engineering
Civil engineering
Mathematical analysis
Seismology
Image (mathematics)
Geology
Authors
Changqing Shen,Xu Wang,Dong Wang,Yongxiang Li,Jun Zhu,Mingming Gong
Identifiers
DOI:10.1109/tim.2021.3055786
Abstract
An inconsistent distribution between training and testing data, caused by complicated and changeable machine working conditions, hinders the wide application of traditional deep learning to machine fault diagnosis. In a target domain in which labeled samples are not available (testing data), transfer learning can adopt a relevant source domain (training data) to identify the similarity between the two domains and subsequently mitigate the negative effects of a domain shift. Previous studies on transfer learning mainly focused on decreasing the marginal distribution distance between two different domains or narrowing the conditional distribution distance, even though marginal and conditional distributions make different contributions to transfer tasks. The relative importance of the two distributions is difficult to assess dynamically and quantitatively. To align the two distributions (joint distribution) of two different domains, in this article, we propose a dynamic joint distribution alignment network (DJDAN) to evaluate the relative importance of marginal and conditional distributions dynamically and quantitatively. Furthermore, compared with common metrics that use pseudo labels to calculate the conditional distribution distance, the proposed DJDAN uses soft pseudo labels to more accurately measure the conditional distribution discrepancy between different domains. Extensive experiments reveal the superiority and generalization of the proposed DJDAN for bearing fault diagnosis under different working conditions.
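The core idea of the abstract — weighting marginal against conditional distribution distances with a dynamically computed balance factor, and using soft pseudo labels for the conditional term — can be sketched as follows. This is a minimal NumPy illustration under simplifying assumptions (a linear-kernel MMD-style distance between domain means, and a balance factor `mu` defined as the marginal distance's share of the total); it is not the authors' implementation, and the function names and the exact form of `mu` are hypothetical.

```python
import numpy as np

def marginal_distance(Xs, Xt):
    # Squared distance between domain mean embeddings
    # (a linear-kernel MMD-style estimate of marginal discrepancy).
    return float(np.sum((Xs.mean(axis=0) - Xt.mean(axis=0)) ** 2))

def conditional_distance(Xs, Ys, Xt, soft_Yt):
    # Class-wise mean distance. Target class means are weighted by
    # soft pseudo labels (class probabilities), mirroring the paper's
    # use of soft pseudo labels instead of hard ones.
    n_classes = soft_Yt.shape[1]
    d = 0.0
    for c in range(n_classes):
        mu_s = Xs[Ys == c].mean(axis=0)                 # labeled source class mean
        w = soft_Yt[:, c] / (soft_Yt[:, c].sum() + 1e-8)
        mu_t = (Xt * w[:, None]).sum(axis=0)            # soft-weighted target class mean
        d += float(np.sum((mu_s - mu_t) ** 2))
    return d / n_classes

def dynamic_joint_distance(Xs, Ys, Xt, soft_Yt):
    # Combine the two distances with a dynamic balance factor mu in [0, 1]:
    # the larger a discrepancy is, the more weight its alignment term gets.
    d_m = marginal_distance(Xs, Xt)
    d_c = conditional_distance(Xs, Ys, Xt, soft_Yt)
    mu = d_m / (d_m + d_c + 1e-8)
    return mu * d_m + (1.0 - mu) * d_c, mu
```

In a full network this combined distance would be added to the classification loss and minimized over the feature extractor, with `mu` recomputed every epoch so the relative importance of the two alignments adapts as training progresses.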