Safe reinforcement learning for real-time automatic control in a smart energy-hub
Dawei Qiu,
Zihang Dong,
Xi Zhang,
Yi Wang and
Goran Strbac
Applied Energy, 2022, vol. 309, issue C, No S030626192101638X
Abstract:
Nowadays, multi-energy systems are receiving particular attention from the smart grid community owing to the high flexibility they can offer by integrating multiple energy carriers. In this regard, the energy hub is known as a flexible and efficient platform for supplying energy demands with an acceptable level of affordability and reliability by relying on various energy production, storage and conversion facilities. Given the increasing penetration of renewable energy sources to promote a low-carbon energy transition, accurate economic and environmental assessment of the energy hub, along with a real-time automatic energy management scheme, has become a challenging task due to the high variability of renewable energy sources. Furthermore, conventional model-based optimization approaches, which require full knowledge of the employed mathematical operating models and accurate uncertainty distributions, may become impractical for real-world applications. In this context, this paper proposes a model-free safe deep reinforcement learning method for the optimal control of a renewable-based energy hub operating with multiple energy carriers while satisfying the physical constraints of the energy hub operation model. The main objective of this work is to minimize the system energy cost and carbon emissions by considering the various energy components. The proposed deep reinforcement learning method is trained and tested on a real-world dataset to validate its superior performance in reducing energy cost, carbon emissions, and computational time with respect to state-of-the-art deep reinforcement learning and optimization-based approaches. Moreover, the effectiveness of the proposed method in handling the operational constraints is evaluated on both the training and test environments. Finally, the generalization performance of the learnt energy management scheme, as well as its sensitivity to storage flexibility and carbon price, is also examined in the case studies.
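For context on the "safe" aspect described above, a common way to satisfy operating constraints in a model-free setting is to relax them into the learning objective via a Lagrange multiplier that is updated alongside the policy. The sketch below illustrates this primal-dual idea on a hypothetical one-step battery-dispatch problem; it is not the paper's algorithm, and every quantity in it (prices, demand, discharge cap, violation budget, the Gaussian policy) is an assumption made purely for illustration.

```python
# Minimal, illustrative sketch of Lagrangian-relaxation safe RL on a toy
# one-step "energy hub" dispatch problem. NOT the paper's algorithm; all
# prices, limits and the Gaussian policy are hypothetical assumptions.
import numpy as np

rng = np.random.default_rng(0)

ENERGY_PRICE = 0.20      # $/kWh paid for grid imports (assumed)
CARBON_PRICE = 0.05      # $/kg CO2 (assumed)
EMISSION_RATE = 0.40     # kg CO2 emitted per imported kWh (assumed)
DEMAND = 5.0             # fixed hourly demand, kWh (assumed)
DISCHARGE_CAP = 3.0      # physical limit on battery discharge, kWh (assumed)
VIOLATION_BUDGET = 0.05  # tolerated average constraint violation, kWh (assumed)

def step(discharge):
    """One-step environment: grid import covers whatever the battery does not."""
    grid_import = max(DEMAND - discharge, 0.0)
    cost = (ENERGY_PRICE + CARBON_PRICE * EMISSION_RATE) * grid_import
    violation = max(discharge - DISCHARGE_CAP, 0.0)  # constraint: discharge <= cap
    return cost, violation

# Gaussian policy over the discharge action, parameterised by its mean,
# and a Lagrange multiplier that prices constraint violations.
mu, sigma = 0.0, 0.5
lam = 0.0
lr_mu, lr_lam = 0.02, 0.05

for _ in range(5000):
    a = mu + sigma * rng.standard_normal()   # sample a dispatch action
    cost, violation = step(a)
    penalised = cost + lam * violation       # Lagrangian of the constrained objective
    # REINFORCE-style estimate of d E[penalised] / d mu (no baseline, for brevity).
    grad_mu = penalised * (a - mu) / sigma**2
    mu -= lr_mu * grad_mu                    # primal step: reduce the penalised cost
    # Dual step: raise lambda while violations exceed the budget, lower it otherwise.
    lam = max(0.0, lam + lr_lam * (violation - VIOLATION_BUDGET))

print(f"mean discharge {mu:.2f} kWh (cap {DISCHARGE_CAP} kWh), lambda {lam:.2f}")
```

In this toy setting the dual update raises the price of violating the discharge limit whenever sampled violations exceed the tolerated budget, so the learned dispatch is pushed back inside the physical limit without any explicit model of the hub.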
Keywords: Multi-energy system; Energy hub; Safe reinforcement learning; Carbon emission; Renewable energy
Date: 2022
Citations: View citations in EconPapers (19)
Downloads: http://www.sciencedirect.com/science/article/pii/S030626192101638X (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:appene:v:309:y:2022:i:c:s030626192101638x
Ordering information: This journal article can be ordered from
http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/bibliographic
DOI: 10.1016/j.apenergy.2021.118403
Applied Energy is currently edited by J. Yan
Bibliographic data for series maintained by Catherine Liu.