Computer science
Pooling
Data mining
Encoder
Data integration
Artificial intelligence
Machine learning
Distributed computing
Operating system
Authors
Bo Wang, Shuai Zhao, Qian Zhao, Yang Bai
Identifier
DOI:10.1038/s41598-025-10124-9
Abstract
Data-driven intelligent fault diagnosis methods have become essential for ensuring the reliability and stability of mechanical systems. However, their practical application is often hindered by the scarcity of labeled samples and the absence of effective multi-source information fusion strategies, which together limit the accuracy of existing fault diagnosis frameworks. To address these challenges, we propose a novel auto-embedding Transformer, named EDformer, tailored for few-shot fault diagnosis with multi-source information. First, the multi-source information is fed into a novel encoder-decoder to extract high-quality embeddings, mitigating the challenges posed by limited samples in real-world engineering applications. Next, a cross-attention architecture built on Transformer neural networks is introduced to enable efficient multi-modal data integration by highlighting key correlations between sensing devices while suppressing superfluous information. In the final stage, the architecture combines global max pooling and global average pooling to refine feature abstraction and improve robustness to data variations. The effectiveness of the proposed framework is validated through comprehensive evaluations on two heterogeneous datasets. Diagnostic results demonstrate that EDformer surpasses contemporary approaches in both classification accuracy and stability, particularly under data scarcity. Visualization tools such as t-SNE and ROC curves further confirm its ability to distinguish fault categories and capture critical fault-related features.
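Below is a minimal, hypothetical PyTorch sketch of the fusion stages described in the abstract: per-source embeddings are combined through bidirectional cross-attention between two sensor streams, and the fused sequence is summarized with global max pooling and global average pooling before classification. All module names, dimensions, and the layer layout are illustrative assumptions, not the authors' released EDformer implementation.

```python
# Illustrative sketch only; not the authors' EDformer code.
import torch
import torch.nn as nn


class CrossAttentionFusion(nn.Module):
    """Fuses embeddings from two sensor streams with bidirectional cross-attention,
    then summarizes them with global max + average pooling (assumed layout)."""

    def __init__(self, d_model: int = 64, n_heads: int = 4, n_classes: int = 10):
        super().__init__()
        # Each stream attends to the other, emphasizing cross-sensor correlations.
        self.attn_a_to_b = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.attn_b_to_a = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        # Classifier head consumes the concatenated max- and average-pooled features.
        self.head = nn.Linear(2 * d_model, n_classes)

    def forward(self, emb_a: torch.Tensor, emb_b: torch.Tensor) -> torch.Tensor:
        # emb_a, emb_b: (batch, seq_len, d_model) embeddings from two sources.
        fused_a, _ = self.attn_a_to_b(emb_a, emb_b, emb_b)  # stream A queries stream B
        fused_b, _ = self.attn_b_to_a(emb_b, emb_a, emb_a)  # stream B queries stream A
        fused = self.norm(torch.cat([fused_a, fused_b], dim=1))  # (batch, 2*seq, d_model)
        gmp = fused.max(dim=1).values   # global max pooling over the sequence axis
        gap = fused.mean(dim=1)         # global average pooling over the sequence axis
        return self.head(torch.cat([gmp, gap], dim=-1))


if __name__ == "__main__":
    model = CrossAttentionFusion()
    a = torch.randn(8, 32, 64)  # e.g. embeddings derived from sensor 1
    b = torch.randn(8, 32, 64)  # e.g. embeddings derived from sensor 2
    print(model(a, b).shape)    # torch.Size([8, 10])
```

Concatenating the max- and average-pooled vectors gives the classifier both peak and averaged activations, which is one common way to make the pooled representation less sensitive to variation in individual time steps; the paper's exact pooling and head configuration may differ.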