Mutual-learning based self-supervised knowledge distillation framework for remaining useful life prediction under variable working condition-induced domain shift scenarios

Zhuohang Chen, Jinglong Chen, Zijun Liu and Yulang Liu

Reliability Engineering and System Safety, 2025, vol. 264, issue PA

Abstract: Domain shifts induced by variable working conditions, including both multiple steady and time-varying working conditions, result in inconsistent degradation patterns and pose significant challenges for remaining useful life (RUL) prediction. To address this issue, we propose a self-supervised knowledge distillation framework based on mutual learning for RUL prediction under variable working conditions. The proposed framework employs a teacher-student architecture, facilitating knowledge transfer through self-supervised pseudo-labels. A mutual-learning-based training strategy is developed to prevent over-adaptation to the source domain and promote domain generalization. Additionally, during student model training, a feature-level domain adversarial training strategy is implemented to improve cross-domain feature decoupling and ensure the learning of domain-invariant features. These two components complement each other: adversarial learning aligns marginal distributions (variable working conditions), while pseudo-label learning refines conditional alignment (normal and fast degradation stages), allowing the model to adapt more effectively to complex degradation scenarios. Furthermore, we incorporate a sparse attention mechanism for efficient feature extraction, significantly reducing computational complexity while maintaining robust performance. RUL prediction experiments under multiple steady conditions and under time-varying conditions are carried out on two life-cycle bearing datasets, respectively. Comparative results demonstrate the superiority and practicality of the proposed method.
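
The abstract describes a teacher-student scheme that combines self-supervised pseudo-label distillation, mutual learning, and feature-level domain adversarial training. The sketch below is a minimal, hypothetical PyTorch rendering of how those pieces could fit together in one training step, assuming a simple MLP backbone in place of the sparse-attention encoder, a gradient-reversal domain classifier, and MSE-based distillation losses; all network sizes, loss weights, and names are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the mutual-learning distillation idea from the abstract;
# sizes, losses, and the gradient-reversal branch are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer for feature-level domain adversarial training."""
    @staticmethod
    def forward(ctx, x, lambd):
        ctx.lambd = lambd
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Reverse (and scale) the gradient flowing back into the feature extractor.
        return -ctx.lambd * grad_output, None


class RULNet(nn.Module):
    """Shared backbone plus RUL regression head (stand-in for the attention encoder)."""
    def __init__(self, in_dim=64, hid=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid), nn.ReLU(),
                                     nn.Linear(hid, hid), nn.ReLU())
        self.rul_head = nn.Linear(hid, 1)

    def forward(self, x):
        feat = self.encoder(x)
        return feat, self.rul_head(feat).squeeze(-1)


def train_step(teacher, student, domain_clf, opt, x_src, y_src, x_tgt, lambd=0.1):
    """One step: supervised loss on the labelled source domain, pseudo-label
    mutual learning on the unlabelled target domain, and a domain adversarial
    loss on the student features."""
    opt.zero_grad()

    # Supervised RUL regression on the source domain.
    feat_s_src, pred_s_src = student(x_src)
    loss_sup = F.mse_loss(pred_s_src, y_src)

    # Teacher produces pseudo-labels on the target domain; mutual learning
    # penalises both distillation directions so neither network dominates.
    feat_t_tgt, pseudo = teacher(x_tgt)
    feat_s_tgt, pred_s_tgt = student(x_tgt)
    loss_mutual = (F.mse_loss(pred_s_tgt, pseudo.detach()) +
                   F.mse_loss(pseudo, pred_s_tgt.detach()))

    # Domain classifier tries to tell source from target features; the reversed
    # gradient pushes the student toward domain-invariant representations.
    feats = torch.cat([feat_s_src, feat_s_tgt], dim=0)
    domain_y = torch.cat([torch.zeros(len(x_src)), torch.ones(len(x_tgt))]).long()
    domain_logits = domain_clf(GradReverse.apply(feats, lambd))
    loss_adv = F.cross_entropy(domain_logits, domain_y)

    loss = loss_sup + loss_mutual + lambd * loss_adv
    loss.backward()
    opt.step()
    return loss.item()
```

In this sketch a single optimizer over teacher, student, and domain-classifier parameters, e.g. torch.optim.Adam(list(teacher.parameters()) + list(student.parameters()) + list(domain_clf.parameters())), reflects the mutual-learning idea that both networks are updated from each other's predictions; the actual framework may weight or schedule these losses differently.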

Keywords: Remaining useful life prediction; Variable working conditions; Knowledge distillation; Domain adversarial learning
Date: 2025

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0951832025005605
Full text for ScienceDirect subscribers only



Persistent link: https://EconPapers.repec.org/RePEc:eee:reensy:v:264:y:2025:i:pa:s0951832025005605

DOI: 10.1016/j.ress.2025.111359


Reliability Engineering and System Safety is currently edited by Carlos Guedes Soares

More articles in Reliability Engineering and System Safety from Elsevier
Bibliographic data for series maintained by Catherine Liu.

Handle: RePEc:eee:reensy:v:264:y:2025:i:pa:s0951832025005605