Transformers and Long Short-Term Memory Transfer Learning for GenIV Reactor Temperature Time Series Forecasting

Stella Pantopoulou, Anthonie Cilliers, Lefteri H. Tsoukalas and Alexander Heifetz
Additional contact information
Stella Pantopoulou: Nuclear Science and Engineering Division, Argonne National Laboratory, Argonne, IL 60439, USA
Anthonie Cilliers: Kairos Power, Alameda, CA 94501, USA
Lefteri H. Tsoukalas: School of Nuclear Engineering, Purdue University, West Lafayette, IN 47906, USA
Alexander Heifetz: Nuclear Science and Engineering Division, Argonne National Laboratory, Argonne, IL 60439, USA

Energies, 2025, vol. 18, issue 9, 1-18

Abstract: Automated monitoring of the coolant temperature can enable autonomous operation of generation IV reactors (GenIV), thus reducing their operating and maintenance costs. Automation can be accomplished with machine learning (ML) models trained on historical sensor data. However, the performance of ML usually depends on the availability of a large amount of training data, which is difficult to obtain for GenIV, as this technology is still under development. We propose the use of transfer learning (TL), which involves utilizing knowledge across different domains, to compensate for this lack of training data. TL can be used to create pre-trained ML models with data from small-scale research facilities, which can then be fine-tuned to monitor GenIV reactors. In this work, we develop pre-trained Transformer and long short-term memory (LSTM) networks by training them on temperature measurements from thermal hydraulic flow loops operating with water and Galinstan fluids at room temperature at Argonne National Laboratory. The pre-trained models are then fine-tuned and re-trained with minimal additional data to predict the time series of high-temperature measurements obtained from the Engineering Test Unit (ETU) at Kairos Power. The performance of the LSTM and Transformer networks is investigated by varying the size of the lookback window and the forecast horizon. The results of this study show that LSTM networks have lower prediction errors than Transformers, but LSTM errors increase more rapidly with increasing lookback window size and forecast horizon than Transformer errors.
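
The transfer-learning workflow summarized in the abstract can be illustrated with a short sketch. The code below is a minimal, hypothetical example and not the authors' implementation: an LSTM forecaster is pre-trained on a source-domain temperature series and then fine-tuned on a small target-domain series; the network size, learning rates, epoch counts, and the placeholder data are all assumptions, while the lookback window and forecast horizon correspond to the parameters varied in the study.

# Minimal transfer-learning sketch (hypothetical, not the paper's code).
# Pre-train an LSTM forecaster on abundant source-domain data, then
# fine-tune it on a small target-domain dataset.
import torch
import torch.nn as nn

LOOKBACK, HORIZON = 30, 5  # lookback window and forecast horizon (tunable)

class LSTMForecaster(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden,
                            num_layers=2, batch_first=True)
        self.head = nn.Linear(hidden, HORIZON)

    def forward(self, x):                 # x: (batch, LOOKBACK, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # predict the next HORIZON steps

def make_windows(series):
    # Slice a 1-D temperature series into (lookback, horizon) training pairs.
    xs, ys = [], []
    for i in range(len(series) - LOOKBACK - HORIZON + 1):
        xs.append(series[i:i + LOOKBACK])
        ys.append(series[i + LOOKBACK:i + LOOKBACK + HORIZON])
    x = torch.tensor(xs, dtype=torch.float32).unsqueeze(-1)
    y = torch.tensor(ys, dtype=torch.float32)
    return x, y

def train(model, x, y, epochs, lr):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()

# Placeholder series standing in for source-domain (flow-loop) and
# target-domain (ETU-like) temperature data.
source = torch.sin(torch.linspace(0, 50, 2000)).tolist()
target = torch.sin(torch.linspace(0, 5, 200)).tolist()

model = LSTMForecaster()
train(model, *make_windows(source), epochs=50, lr=1e-3)  # pre-training
train(model, *make_windows(target), epochs=10, lr=1e-4)  # fine-tuning

In this sketch the fine-tuning step simply continues training all weights at a lower learning rate; freezing the LSTM layers and updating only the output head would be another common variant of the same idea.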

Keywords: transfer learning; transformer; LSTM; time series forecasting; temperature sensing
JEL-codes: Q Q0 Q4 Q40 Q41 Q42 Q43 Q47 Q48 Q49
Date: 2025

Downloads: (external link)
https://www.mdpi.com/1996-1073/18/9/2286/pdf (application/pdf)
https://www.mdpi.com/1996-1073/18/9/2286/ (text/html)

Persistent link: https://EconPapers.repec.org/RePEc:gam:jeners:v:18:y:2025:i:9:p:2286-:d:1646200

Energies is currently edited by Ms. Agatha Cao

More articles in Energies from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Handle: RePEc:gam:jeners:v:18:y:2025:i:9:p:2286-:d:1646200