Transformer-Based Transfer Learning for Battery State-of-Health Estimation
Alessandro Giuliano,
Yuandi Wu,
John Yawney and
Stephen Andrew Gadsden
Additional contact information
Alessandro Giuliano: Intelligent and Cognitive Engineering Laboratory, McMaster University, Hamilton, ON L8S 4L8, Canada
Yuandi Wu: Intelligent and Cognitive Engineering Laboratory, McMaster University, Hamilton, ON L8S 4L8, Canada
John Yawney: Intelligent and Cognitive Engineering Laboratory, McMaster University, Hamilton, ON L8S 4L8, Canada
Stephen Andrew Gadsden: Intelligent and Cognitive Engineering Laboratory, McMaster University, Hamilton, ON L8S 4L8, Canada
Energies, 2025, vol. 18, issue 20, 1-21
Abstract:
The accurate prediction of battery state of health (SOH) has become an important research topic in recent years, given the surge in electric vehicle production. Dynamically assessing a battery's current SOH helps predict how long it will last during the next discharge cycle, which feeds directly into an electric vehicle's range calculations. Data-driven approaches based on machine learning models have been successful in estimating SOH accurately. Within this research topic, few studies have explored the transfer learning capabilities of such models as a way to improve performance and reduce the computational cost of training. This paper compares the ability of different machine learning models to adapt to diverse battery working conditions, as well as their transfer learning capabilities across batteries with different electrochemical compositions. A new transformer-based model is proposed for the SOH estimation problem. The results show that the proposed transformer improves its prediction performance through transfer learning compared with the same model trained exclusively on the target dataset. When pre-trained on the NASA dataset and fine-tuned on the Oxford dataset, the transformer achieved an average RMSE of 0.01461, outperforming the best model trained exclusively on the target data (an ANN with an RMSE of 0.01747) by 17%. In addition to improving on target-only training, the model also outperforms a competing transformer model from the literature, which reported an RMSE of 0.90170 on a similar cross-composition transfer task.
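For readers who want the gist of the transfer learning setup the abstract describes, the sketch below shows the standard pre-train/fine-tune recipe in PyTorch: train a transformer SOH regressor on a source chemistry, then continue training the same weights on the target chemistry at a lower learning rate. Everything beyond that high-level recipe (feature count, layer sizes, epochs, learning rates, and the synthetic stand-ins for the NASA and Oxford datasets) is an illustrative assumption, not the paper's configuration.

```python
# Minimal sketch of the pre-train / fine-tune transfer-learning recipe from
# the abstract. Architecture and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

class SOHTransformer(nn.Module):
    """Transformer encoder mapping a window of cycle features to an SOH value."""
    def __init__(self, n_features=4, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)             # regression head

    def forward(self, x):                             # x: (batch, window, features)
        z = self.encoder(self.embed(x))               # self-attention over cycles
        return self.head(z.mean(dim=1)).squeeze(-1)   # pool, then predict SOH

def make_loader(n, window=32, n_features=4):
    """Hypothetical stand-in for a real battery dataset (e.g. V, I, T, capacity)."""
    x = torch.randn(n, window, n_features)
    y = torch.rand(n)                                 # SOH as a fraction in [0, 1]
    return DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True)

def fit(model, loader, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    mse = nn.MSELoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            mse(model(x), y).backward()
            opt.step()

source, target = make_loader(512), make_loader(128)
model = SOHTransformer()

# 1) Pre-train on the source chemistry (the NASA cells in the paper).
fit(model, source, epochs=50, lr=1e-3)

# 2) Fine-tune the *same* weights on the target chemistry (the Oxford cells)
#    at a lower learning rate, rather than training from scratch.
fit(model, target, epochs=10, lr=1e-4)

# Evaluate with RMSE, the metric reported in the abstract.
model.eval()
with torch.no_grad():
    x, y = next(iter(target))
    rmse = torch.sqrt(nn.functional.mse_loss(model(x), y))
print(f"target RMSE: {rmse.item():.5f}")
```

A common variant of this recipe freezes the encoder layers during fine-tuning and updates only the regression head; the abstract does not specify which layers the paper leaves trainable.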
Keywords: attention mechanism; battery; electric vehicles; state of health; transfer learning; transformers (search for similar items in EconPapers)
JEL-codes: Q Q0 Q4 Q40 Q41 Q42 Q43 Q47 Q48 Q49 (search for similar items in EconPapers)
Date: 2025
Downloads: (external link)
https://www.mdpi.com/1996-1073/18/20/5439/pdf (application/pdf)
https://www.mdpi.com/1996-1073/18/20/5439/ (text/html)
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:gam:jeners:v:18:y:2025:i:20:p:5439-:d:1772076
Energies is currently edited by Ms. Cassie Shen
More articles in Energies from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.