Auto-embedding transformer under multi-source information fusion for few-shot fault diagnosis.
Journal:
Scientific Reports
Published Date:
Jul 20, 2025
Abstract
Data-driven intelligent fault diagnosis methods have become essential for ensuring the reliability and stability of mechanical systems. However, their practical application is often hindered by the scarcity of labeled samples and the absence of effective multi-source information fusion strategies, which together limit the accuracy of existing fault diagnosis frameworks. To address these challenges, we propose a novel auto-embedding transformer, named EDformer, tailored to multi-source information fusion under few-shot fault diagnosis. First, the multi-source information is fed into a novel encoder-decoder module to extract high-quality embeddings, mitigating the difficulties posed by limited samples in real-world engineering applications. Next, a cross-attention architecture built on Transformer neural networks is proposed to integrate multi-modal data efficiently, highlighting key correlations between sensing devices while suppressing redundant information. In the final stage, the architecture combines global max pooling and global average pooling to refine feature abstraction and improve robustness to data variations. The effectiveness of the proposed framework is validated through comprehensive evaluations on two heterogeneous datasets. Diagnostic results demonstrate that EDformer surpasses contemporary approaches in both classification accuracy and stability, particularly under data scarcity. Visualization tools such as t-SNE and ROC curves further confirm its ability to distinguish fault categories and capture critical fault-related features.
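To make the pipeline described in the abstract concrete, the following is a minimal sketch of how an EDformer-style model could be assembled: a per-source encoder-decoder that produces embeddings (trainable with a reconstruction loss when labels are scarce), a cross-attention block that fuses two sensor streams, and a head that concatenates global max pooling and global average pooling before classification. The use of PyTorch, the class and parameter names, and all layer sizes are assumptions for illustration, not the authors' released implementation.

# Hypothetical sketch of an EDformer-style fusion pipeline (assumed PyTorch).
import torch
import torch.nn as nn


class SourceEmbedder(nn.Module):
    """Encoder-decoder style auto-embedding for one sensor stream."""

    def __init__(self, in_dim: int, d_model: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, d_model), nn.GELU())
        # The decoder reconstructs the input so the embedding can also be
        # trained with a reconstruction loss under few-shot conditions.
        self.decoder = nn.Linear(d_model, in_dim)

    def forward(self, x):  # x: (batch, seq_len, in_dim)
        z = self.encoder(x)
        recon = self.decoder(z)
        return z, recon


class CrossAttentionFusion(nn.Module):
    """Fuse two sensor embeddings by letting one stream attend to the other."""

    def __init__(self, d_model: int = 64, n_heads: int = 4, n_classes: int = 10):
        super().__init__()
        self.cross_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(d_model)
        # Global max pooling and global average pooling are concatenated
        # before the final classification layer.
        self.classifier = nn.Linear(2 * d_model, n_classes)

    def forward(self, z_a, z_b):  # each: (batch, seq_len, d_model)
        fused, _ = self.cross_attn(query=z_a, key=z_b, value=z_b)
        fused = self.norm(fused + z_a)        # residual connection
        gmp = fused.max(dim=1).values         # global max pooling over time
        gap = fused.mean(dim=1)               # global average pooling over time
        return self.classifier(torch.cat([gmp, gap], dim=-1))


# Usage with two synthetic single-channel streams (e.g., two vibration sensors).
emb_a, emb_b = SourceEmbedder(in_dim=1), SourceEmbedder(in_dim=1)
fusion = CrossAttentionFusion()
x_a, x_b = torch.randn(8, 128, 1), torch.randn(8, 128, 1)
z_a, _ = emb_a(x_a)
z_b, _ = emb_b(x_b)
logits = fusion(z_a, z_b)
print(logits.shape)  # torch.Size([8, 10])

The sketch keeps the two ideas the abstract emphasizes: cross-attention lets one sensing device query another so only correlated information is propagated, and the dual pooling head retains both peak and average responses of the fused features.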