Time Series Foundation Model for Improved Transformer Load Forecasting and Overload Detection
Yikai Hou,
Chao Ma,
Xiang Li,
Yinggang Sun,
Haining Yu and
Zhou Fang
Additional contact information
Yikai Hou: School of Computer Science and Technology, Harbin University of Science and Technology, Harbin 150000, China
Chao Ma: School of Computer Science and Technology, Harbin University of Science and Technology, Harbin 150000, China
Xiang Li: School of Cyberspace Security, Harbin Institute of Technology, Harbin 150000, China
Yinggang Sun: School of Cyberspace Security, Harbin Institute of Technology, Harbin 150000, China
Haining Yu: School of Cyberspace Security, Harbin Institute of Technology, Harbin 150000, China
Zhou Fang: Heilongjiang Province Cyberspace Research Center, Harbin 150000, China
Energies, 2025, vol. 18, issue 3, 1-15
Abstract:
Simple load forecasting and overload prediction models, such as LSTM and XGBoost, cannot keep pace with the growing volume of data in power systems. Recently, various foundation models (FMs) for time series analysis have been proposed; these scale to large numbers of time series variables and to datasets across domains. However, their simple pre-training setup leaves FMs ill-suited to complex downstream tasks. Handling real-world tasks effectively depends on additional data, i.e., covariates, and on prior knowledge. Incorporating these through structural modifications to an FM is not feasible, as it would disrupt the pre-trained weights. To address this issue, this paper proposes FreqMixer, a frequency-domain mixer framework that enhances the task-specific analytical capabilities of FMs. FreqMixer is an auxiliary network alongside the backbone FM that takes covariates as input; it has the same number of layers as the backbone and exchanges information with it at each layer, allowing prior knowledge to be incorporated without altering the backbone’s structure. In experiments, FreqMixer demonstrates high efficiency and performance, reducing MAPE by 23.65%, recall by 87%, and precision by 72% in transformer load forecasting during the Spring Festival, while improving precision by 192.09% and accuracy by 14% in the corresponding overload prediction, all while processing data from over 160 transformers with only 1M additional parameters.
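The abstract outlines FreqMixer’s design: an auxiliary network that embeds covariates, mixes them in the frequency domain, and exchanges information with a frozen backbone FM at every layer. The PyTorch sketch below illustrates one plausible reading of that description; the class names, tensor shapes, per-frequency complex weights, and the additive injection into the backbone’s hidden state are all assumptions of this sketch, not details confirmed by the paper.

```python
import torch
import torch.nn as nn

class FreqMixerLayer(nn.Module):
    """One frequency-domain mixing layer (hypothetical sketch).

    Transforms covariate features with an rFFT, scales each frequency
    bin by a learnable complex weight, and transforms back.
    """
    def __init__(self, d_model: int, seq_len: int):
        super().__init__()
        n_freq = seq_len // 2 + 1  # number of rFFT frequency bins
        # Learnable complex weights, stored as (real, imag) pairs.
        self.weight = nn.Parameter(torch.randn(n_freq, d_model, 2) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model), real-valued
        spec = torch.fft.rfft(x, dim=1)                   # (batch, n_freq, d_model)
        spec = spec * torch.view_as_complex(self.weight)  # per-frequency scaling
        return torch.fft.irfft(spec, n=x.size(1), dim=1)  # back to the time domain

class FreqMixer(nn.Module):
    """Auxiliary network running alongside a frozen backbone FM.

    One mixing layer per backbone layer; at each depth the mixed covariate
    features are added to the backbone's hidden state, so covariates and
    prior knowledge enter without modifying the pre-trained weights.
    """
    def __init__(self, n_layers: int, d_model: int, seq_len: int, d_cov: int):
        super().__init__()
        self.embed = nn.Linear(d_cov, d_model)  # lift covariates to model width
        self.layers = nn.ModuleList(
            [FreqMixerLayer(d_model, seq_len) for _ in range(n_layers)]
        )

    def forward(self, backbone_layers, hidden, covariates):
        # hidden: (batch, seq_len, d_model); covariates: (batch, seq_len, d_cov)
        cov = self.embed(covariates)
        for fm_layer, mix in zip(backbone_layers, self.layers):
            cov = mix(cov)                   # frequency-domain covariate mixing
            hidden = fm_layer(hidden + cov)  # inject covariate signal per layer
        return hidden

# Toy usage with identity layers standing in for a frozen backbone:
backbone = [nn.Identity() for _ in range(4)]
mixer = FreqMixer(n_layers=4, d_model=32, seq_len=96, d_cov=8)
hidden = torch.randn(2, 96, 32)     # backbone hidden state for 2 transformers
cov = torch.randn(2, 96, 8)         # calendar covariates, e.g., holiday flags
out = mixer(backbone, hidden, cov)  # (2, 96, 32)
```

Per-frequency complex weights keep each auxiliary layer small, roughly (seq_len/2 + 1) × d_model complex parameters, which is consistent in spirit with the paper’s claim of only about 1M additional parameters; the exact parameterization here is an assumption.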
Keywords: transformer load forecasting; parameter-efficient fine-tuning; multitask learning; foundation model; spectral analysis; time series forecasting
JEL-codes: Q Q0 Q4 Q40 Q41 Q42 Q43 Q47 Q48 Q49
Date: 2025
Downloads:
https://www.mdpi.com/1996-1073/18/3/660/pdf (application/pdf)
https://www.mdpi.com/1996-1073/18/3/660/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jeners:v:18:y:2025:i:3:p:660-:d:1581178
Energies is currently edited by Ms. Agatha Cao
More articles in Energies from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.