A Multi-Scale Fusion Convolutional Network for Time-Series Silicon Prediction in Blast Furnaces

Qiancheng Hao, Wenjing Liu, Wenze Gao and Xianpeng Wang
Additional contact information
Qiancheng Hao: School of Information Science and Engineering, Northeastern University, Shenyang 110819, China
Wenjing Liu: School of Metallurgy, Northeastern University, Shenyang 110819, China
Wenze Gao: School of Mechanical Engineering and Automation, Northeastern University, Shenyang 110819, China
Xianpeng Wang: Key Laboratory of Data Analytics and Optimization for Smart Industry, Northeastern University, Ministry of Education, Shenyang 110819, China

Mathematics, 2025, vol. 13, issue 8, 1-21

Abstract: In steel production, the blast furnace is a critical element. In this process, precisely controlling the temperature of the molten iron is indispensable for attaining efficient operations and high-grade products. This temperature is often indirectly reflected by the silicon content in the hot metal. However, due to the dynamic nature and inherent delays of the ironmaking process, real-time prediction of silicon content remains a significant challenge, and traditional methods often suffer from insufficient prediction accuracy. This study presents a novel Multi-Scale Fusion Convolutional Neural Network (MSF-CNN) to accurately predict the silicon content during the blast furnace smelting process, addressing the limitations of existing data-driven approaches. The proposed MSF-CNN model extracts temporal features at two distinct scales. The first scale utilizes a Convolutional Block Attention Module, which captures local temporal dependencies by focusing on the most relevant features across adjacent time steps. The second scale employs a Multi-Head Self-Attention mechanism to model long-term temporal dependencies, overcoming the inherent delay issues in the blast furnace process. By combining these two scales, the model effectively captures both short-term and long-term temporal dependencies, thereby enhancing prediction accuracy and real-time applicability. Validation using real blast furnace data demonstrates that MSF-CNN outperforms recurrent neural network models such as Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). Compared with LSTM and the GRU, MSF-CNN reduces the Root Mean Square Error (RMSE) by approximately 22% and 21%, respectively, and improves the Hit Rate (HR) by over 3.5% and 4%, highlighting its superiority in capturing complex temporal dependencies. These results indicate that the MSF-CNN adapts better to the blast furnace’s dynamic variations and inherent delays, achieving significant improvements in prediction precision and robustness compared to state-of-the-art recurrent models.
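
The two-scale design described in the abstract (a local CBAM-gated convolutional branch fused with a multi-head self-attention branch feeding a regression head) can be illustrated with a short PyTorch sketch. The class name MSFCNNSketch, the hidden size, kernel sizes, mean-pooling fusion, and the single-output head below are illustrative assumptions, not the authors' published configuration.

```python
# Minimal sketch of the multi-scale fusion idea, assuming PyTorch.
# Layer sizes and the exact CBAM variant are assumptions, not the paper's settings.
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """CBAM-style channel gate: weights feature channels by global statistics."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):                       # x: (batch, channels, time)
        avg = self.mlp(x.mean(dim=-1))          # average-pooled over time
        mx = self.mlp(x.amax(dim=-1))           # max-pooled over time
        gate = torch.sigmoid(avg + mx).unsqueeze(-1)
        return x * gate


class TemporalAttention(nn.Module):
    """CBAM spatial branch adapted to 1-D sequences: weights adjacent time steps."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv1d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                       # x: (batch, channels, time)
        stats = torch.cat([x.mean(dim=1, keepdim=True),
                           x.amax(dim=1, keepdim=True)], dim=1)
        gate = torch.sigmoid(self.conv(stats))  # (batch, 1, time)
        return x * gate


class MSFCNNSketch(nn.Module):
    """Two scales: local CBAM-gated convolutions + multi-head self-attention."""
    def __init__(self, n_features: int, hidden: int = 64, heads: int = 4):
        super().__init__()
        # Scale 1: local temporal features over adjacent time steps.
        self.local = nn.Sequential(
            nn.Conv1d(n_features, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
        )
        self.channel_att = ChannelAttention(hidden)
        self.temporal_att = TemporalAttention()
        # Scale 2: long-range dependencies via multi-head self-attention.
        self.embed = nn.Linear(n_features, hidden)
        self.self_att = nn.MultiheadAttention(hidden, heads, batch_first=True)
        # Fusion and regression head for the silicon-content target.
        self.head = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(),
                                  nn.Linear(hidden, 1))

    def forward(self, x):                       # x: (batch, time, n_features)
        local = self.local(x.transpose(1, 2))   # (batch, hidden, time)
        local = self.temporal_att(self.channel_att(local)).mean(dim=-1)
        tokens = self.embed(x)
        long_range, _ = self.self_att(tokens, tokens, tokens)
        fused = torch.cat([local, long_range.mean(dim=1)], dim=-1)
        return self.head(fused).squeeze(-1)     # predicted silicon content


# Example: a batch of 8 windows, 32 time steps, 10 process variables (hypothetical shapes).
model = MSFCNNSketch(n_features=10)
print(model(torch.randn(8, 32, 10)).shape)      # torch.Size([8])
```

In this sketch the CBAM-gated convolution supplies the short-range features across adjacent time steps, while the self-attention branch covers the long delays of the process; the abstract's RMSE and Hit Rate comparisons against LSTM and GRU would be computed on held-out furnace data.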

Keywords: silicon content prediction; convolutional block attention module; self-attention mechanism; temporal dependencies
JEL-codes: C
Date: 2025

Downloads:
https://www.mdpi.com/2227-7390/13/8/1347/pdf (application/pdf)
https://www.mdpi.com/2227-7390/13/8/1347/ (text/html)



Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:13:y:2025:i:8:p:1347-:d:1638590


Mathematics is currently edited by Ms. Emma He

More articles in Mathematics from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.

 
Handle: RePEc:gam:jmathe:v:13:y:2025:i:8:p:1347-:d:1638590