EconPapers    
Economics at your fingertips  
 

Enhancing cyber-resilience in integrated energy system scheduling with demand response using deep reinforcement learning

Yang Li, Wenjie Ma, Yuanzheng Li, Sen Li, Zhe Chen and Mohammad Shahidehpour

Applied Energy, 2025, vol. 379, issue C, No S0306261924022141

Abstract: Optimally scheduling multi-energy flow is an effective method to utilize renewable energy sources (RES) and improve the stability and economy of integrated energy systems (IES). However, the stable demand-supply balance of IES faces challenges from uncertainties arising from RES and loads, as well as the increasing impact of cyber-attacks that accompanies the adoption of advanced information and communication technologies. To address these challenges, this paper proposes an innovative model-free resilient scheduling method based on state-adversarial deep reinforcement learning (DRL) for integrated demand response (IDR)-enabled IES. The proposed method designs an IDR program to explore the interaction capability of electricity-gas-heat flexible loads. Additionally, a state-adversarial Markov decision process (SA-MDP) model characterizes the energy scheduling problem of IES under cyber-attack, incorporating cyber-attacks as adversaries directly into the scheduling process. A state-adversarial soft actor–critic (SA-SAC) algorithm is proposed to mitigate the impact of cyber-attacks on the scheduling strategy, integrating adversarial training into the learning process to defend against cyber-attacks. Simulation results demonstrate that the method adequately addresses the uncertainties arising from RES and loads, mitigates the impact of cyber-attacks on the scheduling strategy, and ensures a stable demand supply for various energy sources. Moreover, the proposed method demonstrates resilience against cyber-attacks: compared to the original soft actor–critic (SAC) algorithm, it achieves a 10% improvement in economic performance under cyber-attack scenarios.
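The core idea of the SA-MDP described above is that an adversary perturbs the agent's observed state within a bounded set before the policy acts. The following is a minimal illustrative sketch, not the authors' implementation: it uses a toy linear critic and a finite-difference sign attack (FGSM-style) to craft a bounded worst-case observation. The state layout, the value function, and the attack budget are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
STATE_DIM = 4    # illustrative, e.g. [RES output, elec load, gas load, heat load]
EPSILON = 0.05   # attack budget: max per-component state perturbation (assumed)

# Toy critic weights, standing in for a trained SAC value network.
W = rng.normal(size=STATE_DIM)

def value(state):
    """Toy critic: linear value estimate of a state."""
    return float(W @ state)

def adversarial_state(state, eps=EPSILON):
    """Perturb each state component by +/- eps in the direction that lowers
    the value estimate, approximating the SA-MDP state adversary.
    The gradient is estimated by finite differences."""
    h = 1e-6
    grad = np.zeros_like(state)
    for i in range(len(state)):
        bumped = state.copy()
        bumped[i] += h
        grad[i] = (value(bumped) - value(state)) / h
    return state - eps * np.sign(grad)  # step against the value gradient

s = rng.normal(size=STATE_DIM)
s_adv = adversarial_state(s)
print(np.max(np.abs(s_adv - s)) <= EPSILON + 1e-12)  # True: within budget
print(value(s_adv) <= value(s))                      # True: value non-increasing
```

In adversarial training as the abstract describes it, such perturbed observations would be fed back to the agent during learning so the resulting policy remains stable when cyber-attacks corrupt its inputs.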

Keywords: Integrated energy system; Demand response; Cyber-attack; Deep reinforcement learning; Dynamic pricing mechanism; Cyber-resilient scheduling (search for similar items in EconPapers)
Date: 2025

Downloads: (external link)
http://www.sciencedirect.com/science/article/pii/S0306261924022141
Full text for ScienceDirect subscribers only

Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.

Export reference: BibTeX RIS (EndNote, ProCite, RefMan) HTML/Text

Persistent link: https://EconPapers.repec.org/RePEc:eee:appene:v:379:y:2025:i:c:s0306261924022141

Ordering information: This journal article can be ordered from
http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/bibliographic

DOI: 10.1016/j.apenergy.2024.124831

Access Statistics for this article

Applied Energy is currently edited by J. Yan

More articles in Applied Energy from Elsevier
Bibliographic data for series maintained by Catherine Liu.

 
Page updated 2025-05-25
Handle: RePEc:eee:appene:v:379:y:2025:i:c:s0306261924022141