Low-Carbon Economic Dispatch of Integrated Energy Systems for Electricity, Gas, and Heat Based on Deep Reinforcement Learning
Xiaojuan Lu,
Yaohui Zhang,
Duojin Fan,
Jiawei Wei and
Xiaoying Yu
Additional contact information
Xiaojuan Lu: School of Automation and Electrical Engineering, Lanzhou Jiaotong University, Lanzhou 730070, China
Yaohui Zhang: School of Automation and Electrical Engineering, Lanzhou Jiaotong University, Lanzhou 730070, China
Duojin Fan: Research Institute of Photothermal Energy Storage, Lanzhou Jiaotong University, Lanzhou 730070, China
Jiawei Wei: School of Automation and Electrical Engineering, Lanzhou Jiaotong University, Lanzhou 730070, China
Xiaoying Yu: School of Automation and Electrical Engineering, Lanzhou Jiaotong University, Lanzhou 730070, China
Sustainability, 2025, vol. 17, issue 20, 1-20
Abstract:
Against the backdrop of China’s “dual-carbon” goals, developing the energy internet is an inevitable trend in the country’s low-carbon energy transition. This paper proposes an operation mode for a hydrogen-coupled electrothermal integrated energy system (HCEH-IES) and optimizes the system’s source-side structure by combining carbon-trading policy with low-carbon technology, tapping its carbon-reduction potential and improving both the renewable-energy consumption rate and the system’s level of decarbonization. Furthermore, to address the operation optimization problem of this flexible electricity–gas–heat integrated energy system, a deep reinforcement learning method based on the Soft Actor–Critic (SAC) algorithm is proposed. The method adaptively learns control strategies through interaction between an intelligent agent and the energy system, enabling continuous action control of the multi-energy-flow system while handling the uncertainties caused by source–load fluctuations of wind power, photovoltaics, and multi-energy loads. Finally, historical data are used to train the agent, and the scheduling strategies obtained with the SAC and DDPG (deep deterministic policy gradient) algorithms are compared. The results show that the SAC-based strategy is more economical, comes close to the CPLEX day-ahead optimal scheduling result, and is better suited to the dynamic optimal scheduling of integrated energy systems in real-world scenarios.
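A minimal, hypothetical sketch of the kind of continuous-action SAC training loop described in the abstract is given below. Since the article’s implementation and data are not reproduced here, the environment is a toy stand-in: the class ToyDispatchEnv, its state/action layout, load and renewable profiles, and reward weights are illustrative assumptions rather than the actual HCEH-IES model, and the off-the-shelf SAC implementation from stable-baselines3 is used in place of the authors’ own code.

    # Illustrative sketch only: the environment is a stand-in, not the HCEH-IES
    # model from the paper. State/action layout, bounds, and reward weights are
    # hypothetical placeholders.
    import numpy as np
    import gymnasium as gym
    from gymnasium import spaces
    from stable_baselines3 import SAC

    class ToyDispatchEnv(gym.Env):
        """Hypothetical single-bus electricity-heat dispatch environment."""

        def __init__(self, horizon=24):
            super().__init__()
            self.horizon = horizon
            # Observation: [hour fraction, electric load, heat load, wind, pv], normalized.
            self.observation_space = spaces.Box(low=0.0, high=1.0, shape=(5,), dtype=np.float32)
            # Action: normalized set-points, e.g. [CHP output, electric boiler, storage power].
            self.action_space = spaces.Box(low=-1.0, high=1.0, shape=(3,), dtype=np.float32)

        def _observe(self):
            t = self.t / self.horizon
            # Placeholder profiles; a real study would replay historical source-load data.
            load_e = 0.6 + 0.3 * np.sin(2 * np.pi * t)
            load_h = 0.5 + 0.2 * np.cos(2 * np.pi * t)
            wind = np.clip(0.4 + 0.3 * self.rng.standard_normal(), 0.0, 1.0)
            pv = max(0.0, np.sin(np.pi * t)) * 0.8
            return np.array([t, load_e, load_h, wind, pv], dtype=np.float32)

        def reset(self, *, seed=None, options=None):
            super().reset(seed=seed)
            self.rng = np.random.default_rng(seed)
            self.t = 0
            self.state = self._observe()
            return self.state, {}

        def step(self, action):
            t, load_e, load_h, wind, pv = self.state
            chp, boiler, storage = (action + 1.0) / 2.0  # map [-1, 1] -> [0, 1]
            # Toy balance errors and a made-up cost/penalty trade-off.
            imbalance_e = abs(load_e - (chp + wind + pv + 0.5 * storage))
            imbalance_h = abs(load_h - (0.8 * chp + boiler))
            fuel_cost = 0.6 * chp + 0.4 * boiler
            reward = -(fuel_cost + 5.0 * (imbalance_e + imbalance_h))
            self.t += 1
            terminated = self.t >= self.horizon
            self.state = self._observe()
            return self.state, float(reward), terminated, False, {}

    env = ToyDispatchEnv()
    model = SAC("MlpPolicy", env, learning_rate=3e-4, verbose=0)
    model.learn(total_timesteps=10_000)            # train over many simulated dispatch days
    obs, _ = env.reset(seed=0)
    action, _ = model.predict(obs, deterministic=True)  # dispatch set-points for one step

In this sketch the continuous action vector plays the role of the dispatch set-points, and the negative of operating cost plus imbalance penalties serves as the reward the agent learns to maximize through repeated interaction, mirroring the adaptive learning process the abstract describes.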
Keywords: integrated energy systems; low-carbon economic dispatch; deep reinforcement learning; soft actor–critic; optimal energy management
JEL-codes: O13 Q Q0 Q2 Q3 Q5 Q56
Date: 2025
Downloads: (external link)
https://www.mdpi.com/2071-1050/17/20/9040/pdf (application/pdf)
https://www.mdpi.com/2071-1050/17/20/9040/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jsusta:v:17:y:2025:i:20:p:9040-:d:1769713
Sustainability is currently edited by Ms. Alexandra Wu
More articles in Sustainability from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager.