Predictive pretrained transformer (PPT) for real-time battery health diagnostics

Jingyuan Zhao, Zhenghong Wang, Yuyan Wu and Andrew F. Burke

Applied Energy, 2025, vol. 377, issue PD, No S0306261924021299

Abstract: Modeling and forecasting the evolution of battery systems involve complex interactions across physical, chemical, and electrochemical processes, influenced by diverse usage demands and dynamic operational patterns. In this study, we developed a predictive pre-trained Transformer (PPT) model with 1,871,114 parameters to improve the identification of both short-term and long-term patterns in time-series data. This is achieved by integrating convolutional layers with probabilistic sparse self-attention mechanisms, which together improve prediction accuracy and efficiency in diagnosing battery health. Moreover, the customized hybrid-model fusion supports parallel computing and employs transfer learning, reducing computational costs while improving scalability and adaptability. This enables precise real-time health estimation across battery cycles. We validated the method on a public dataset of 203 commercial lithium iron phosphate (LFP)/graphite batteries charged at rates ranging from 1C to 8C. Using only partial charge data (from 80 % state of charge to the maximum charging voltage: 3.6 V for LFP batteries, 4.2 V for ternary batteries) and avoiding complex feature engineering, the model achieved error metrics below 0.3 % for root mean square error (RMSE), weighted mean absolute percentage error (WMAPE), and mean absolute error (MAE), with an R2 of 98.9 %. Generalization was further demonstrated across 36 different testing protocols, encompassing 23,480 cycles over the entire life cycle, with a total inference time of 9.88 s during the testing phases. Further experiments on 30 nickel cobalt aluminum (NCA) batteries and 36 nickel cobalt manganese (NCM) batteries, covering different battery types and operational scenarios, yielded RMSE, WMAPE, and MAE all below 0.9 %, with R2 values of 94.1 % and 94.4 %, respectively. These findings highlight the potential of customized deep transfer neural networks to improve diagnostic accuracy, accelerate training, and strengthen generalization in real-time applications.
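The sketch below is included to make the abstract's architecture concrete: a 1-D convolutional front end (short-term patterns) feeding a Transformer encoder (long-term dependencies), with the reported error metrics computed alongside. It is a minimal PyTorch sketch under stated assumptions, not the authors' PPT: standard multi-head attention stands in for the paper's probabilistic sparse self-attention, and the class name, layer sizes, and input channels are illustrative.

# Minimal sketch (not the authors' PPT): convolutional front end plus
# Transformer encoder for sequence-to-one battery health estimation.
# Standard multi-head attention substitutes for the paper's probabilistic
# sparse self-attention; all sizes are illustrative assumptions.
import torch
import torch.nn as nn

class ConvTransformerSketch(nn.Module):
    def __init__(self, in_channels=3, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        # Convolutional layers capture short-term local patterns.
        self.conv = nn.Sequential(
            nn.Conv1d(in_channels, d_model, kernel_size=3, padding=1),
            nn.GELU(),
            nn.Conv1d(d_model, d_model, kernel_size=3, padding=1),
            nn.GELU(),
        )
        # Self-attention layers capture long-term dependencies.
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)
        self.head = nn.Linear(d_model, 1)  # scalar health estimate (e.g. SOH)

    def forward(self, x):
        # x: (batch, seq_len, in_channels) partial-charge measurements
        h = self.conv(x.transpose(1, 2)).transpose(1, 2)  # (B, L, d_model)
        h = self.encoder(h)                               # (B, L, d_model)
        return self.head(h.mean(dim=1)).squeeze(-1)       # pool over time

def metrics(y_true, y_pred):
    # The four error metrics reported in the abstract.
    err = y_true - y_pred
    rmse = torch.sqrt(torch.mean(err ** 2))
    mae = torch.mean(torch.abs(err))
    wmape = torch.sum(torch.abs(err)) / torch.sum(torch.abs(y_true))
    r2 = 1 - torch.sum(err ** 2) / torch.sum((y_true - y_true.mean()) ** 2)
    return rmse, mae, wmape, r2

# Example: 8 sequences of 128 charge samples with 3 channels (V, I, T).
# model = ConvTransformerSketch()
# soh = model(torch.randn(8, 128, 3))  # -> shape (8,)

Under the transfer-learning scheme the abstract outlines, a model of this shape pretrained on the LFP cells could plausibly be fine-tuned on the NCA/NCM cells by reusing the convolutional and encoder weights and re-fitting only the output head.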

Keywords: Battery; Health; Transformer; Convolution; Self-attention; Transfer learning
Date: 2025

Downloads: http://www.sciencedirect.com/science/article/pii/S0306261924021299
Full text for ScienceDirect subscribers only.

Persistent link: https://EconPapers.repec.org/RePEc:eee:appene:v:377:y:2025:i:pd:s0306261924021299

Ordering information: This journal article can be ordered from
http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/bibliographic

DOI: 10.1016/j.apenergy.2024.124746

Applied Energy is currently edited by J. Yan

More articles in Applied Energy from Elsevier
Bibliographic data for series maintained by Catherine Liu.

 
Handle: RePEc:eee:appene:v:377:y:2025:i:pd:s0306261924021299