
Instance-Based Transfer Learning-Improved Battery State-of-Health Estimation with Self-Attention Mechanism

Renjun He, Chunxiao Wang, Chun Yin, Shang Yang, Yifan Wang, Yuanpeng Fang, Kai Chen and Jiusi Zhang
Additional contact information
Renjun He: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Chunxiao Wang: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Chun Yin: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Shang Yang: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Yifan Wang: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Yuanpeng Fang: AVIC Chengdu Aircraft Design & Research Institute, Chengdu 610091, China
Kai Chen: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China
Jiusi Zhang: School of Automation Engineering, University of Electronic Science and Technology of China, Chengdu 611731, China

Energies, 2025, vol. 18, issue 21, 1-19

Abstract: Battery state-of-health (SOH) estimation has attracted considerable attention in industrial energy systems. In conventional data-driven methods, scarce target data and source data drawn from a different distribution can lead to poorly trained models. To tackle this problem, this paper combines instance-based transfer learning (ITL) and an interpretable self-attention mechanism (SAM) with the fitting ability of long short-term memory (LSTM) networks to improve SOH estimation performance. ITL re-weights the temporal instances of the training set to give greater influence to target-like data, which relaxes the independent and identically distributed (IID) assumption. The SAM further enhances estimation performance by re-weighting spatial features, and its behavior can be interpreted through detailed visualization. During training, the pre-trained multi-layer LSTM model is fine-tuned on target data to make full use of target information. The proposed method outperforms the compared algorithms on transfer tasks and is validated on real-world cross-domain operating-condition datasets.
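
The abstract describes three ingredients: re-weighting source instances, self-attention over learned features, and fine-tuning a pre-trained LSTM on target data. As a rough illustration of how such pieces typically fit together (not the authors' code; the full text is linked below), here is a minimal PyTorch sketch. The class and function names, the single-head attention over the LSTM hidden sequence, and the layer-freezing scheme are all assumptions; the paper's actual architecture and ITL weighting rule may differ.

    # Hypothetical sketch, not the paper's implementation: LSTM + self-attention
    # SOH estimator with an instance-weighted loss and a fine-tuning step.
    import torch
    import torch.nn as nn

    class SOHEstimator(nn.Module):
        def __init__(self, n_features: int, hidden: int = 64, num_layers: int = 2):
            super().__init__()
            # Multi-layer LSTM backbone, as named in the abstract.
            self.lstm = nn.LSTM(n_features, hidden, num_layers=num_layers,
                                batch_first=True)
            # Assumed form of the SAM: single-head self-attention over the
            # LSTM hidden sequence.
            self.attn = nn.MultiheadAttention(hidden, num_heads=1, batch_first=True)
            self.head = nn.Linear(hidden, 1)

        def forward(self, x):                 # x: (batch, time, n_features)
            h, _ = self.lstm(x)               # (batch, time, hidden)
            ctx, attn_w = self.attn(h, h, h)  # re-weighted features + attention map
            return self.head(ctx[:, -1]).squeeze(-1), attn_w

    def weighted_mse(pred, target, w):
        # ITL-style loss: w up-weights source instances that resemble the
        # target domain (uniform here; the paper's weighting rule is not shown).
        return (w * (pred - target) ** 2).mean()

    # Pre-training step on re-weighted source data (toy tensors).
    model = SOHEstimator(n_features=4)
    x = torch.randn(8, 50, 4)                 # 8 sequences, 50 cycles, 4 features
    y = torch.rand(8)                         # toy SOH labels in [0, 1]
    w = torch.ones(8)                         # instance weights from the ITL step
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    pred, _ = model(x)
    weighted_mse(pred, y, w).backward()
    opt.step()

    # Fine-tuning sketch: freeze the LSTM and adapt the attention layer and
    # regression head on the scarce target data.
    for name, p in model.named_parameters():
        p.requires_grad = not name.startswith("lstm")

Under these assumptions, the returned attention map (attn_w) can be plotted as a heat map over cycle positions, one common way to obtain the kind of interpretability visualization the abstract mentions.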

Keywords: instance transfer learning; batteries’ state-of-health estimation; long short-term memory; self-attention mechanism
JEL-codes: Q Q0 Q4 Q40 Q41 Q42 Q43 Q47 Q48 Q49
Date: 2025

Downloads: (external link)
https://www.mdpi.com/1996-1073/18/21/5672/pdf (application/pdf)
https://www.mdpi.com/1996-1073/18/21/5672/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jeners:v:18:y:2025:i:21:p:5672-:d:1782030

Energies is currently edited by Ms. Cassie Shen

More articles in Energies from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 