KALFormer: Knowledge-augmented attention learning for long-term time series forecasting with transformer
Xing Dong, Qianwei Yang, Wenbo Cheng and Yun Zhang
PLOS ONE, 2026, vol. 21, issue 1, 1-17
Abstract:
Time series forecasting remains a fundamental yet challenging task due to its inherent non-linear dynamics, inter-variable dependencies, and long-term temporal correlations. Existing approaches often struggle to jointly capture local temporal continuity and global contextual relationships, particularly under complex external influences. To overcome these limitations, we propose KALFormer, a knowledge-augmented attention learning Transformer framework that integrates sequential modeling with external information fusion. KALFormer enhances spatiotemporal representation and contextual reasoning by combining Long Short-Term Memory (LSTM) encoders, Transformer-based self-attention mechanisms, and knowledge-aware modules. Extensive experiments on six public benchmark datasets demonstrate that KALFormer achieves an average improvement of 8.4% in MSE and MAE over representative baseline models, highlighting its robustness, interpretability, and reliability for long-term time series forecasting. The source code is available at https://github.com/dxpython/KALFormer.
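The abstract describes a hybrid design: an LSTM encoder for local temporal continuity, Transformer self-attention for global context, and a knowledge-aware module that fuses external information. The PyTorch sketch below only illustrates that general pattern under assumed names and sizes (KALFormerSketch, a gated fusion layer, a 96-step horizon); it is not the authors' implementation, which is available at the GitHub link above.

# Hypothetical sketch of a KALFormer-style hybrid; all module names, shapes, and the
# gated-fusion choice are assumptions, not the paper's actual architecture.
import torch
import torch.nn as nn


class KALFormerSketch(nn.Module):
    """LSTM encoder -> Transformer self-attention -> fusion with external knowledge features."""

    def __init__(self, n_vars: int, n_knowledge: int, d_model: int = 64,
                 n_heads: int = 4, n_layers: int = 2, horizon: int = 96):
        super().__init__()
        # Local temporal continuity: recurrent encoding of the multivariate series.
        self.lstm = nn.LSTM(n_vars, d_model, batch_first=True)
        # Global context: stacked self-attention over the LSTM hidden states.
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=4 * d_model,
                                           batch_first=True)
        self.attn = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Knowledge-aware fusion: project external covariates and gate them into the sequence.
        self.know_proj = nn.Linear(n_knowledge, d_model)
        self.gate = nn.Linear(2 * d_model, d_model)
        # Forecast head: map the final fused state to the prediction horizon, per variable.
        self.head = nn.Linear(d_model, horizon * n_vars)
        self.horizon, self.n_vars = horizon, n_vars

    def forward(self, x: torch.Tensor, knowledge: torch.Tensor) -> torch.Tensor:
        # x: (batch, lookback, n_vars); knowledge: (batch, lookback, n_knowledge)
        h, _ = self.lstm(x)                        # (B, T, d_model)
        h = self.attn(h)                           # (B, T, d_model)
        k = self.know_proj(knowledge)              # (B, T, d_model)
        g = torch.sigmoid(self.gate(torch.cat([h, k], dim=-1)))
        fused = g * h + (1.0 - g) * k              # gated external-information fusion
        out = self.head(fused[:, -1])              # forecast from the last fused state
        return out.view(-1, self.horizon, self.n_vars)


if __name__ == "__main__":
    model = KALFormerSketch(n_vars=7, n_knowledge=4)
    x = torch.randn(8, 336, 7)   # e.g. an ETT-style input: 336-step lookback, 7 variables
    k = torch.randn(8, 336, 4)   # external covariates such as calendar features
    print(model(x, k).shape)     # torch.Size([8, 96, 7])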
Date: 2026
Downloads: (external link)
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0338052 (text/html)
https://journals.plos.org/plosone/article/file?id= ... 38052&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0338052
DOI: 10.1371/journal.pone.0338052