Pre-Attention Mechanism and Convolutional Neural Network Based Multivariate Load Prediction for Demand Response
Zheyu He (),
Rongheng Lin,
Budan Wu,
Xin Zhao and
Hua Zou
Additional contact information
Zheyu He: State Key Laboratory of Networking and Switching Technology, School of Computer Science (National Pilot Software Engineering School), Beijing University of Posts and Telecommunications, Beijing 100876, China
Rongheng Lin: State Key Laboratory of Networking and Switching Technology, School of Computer Science (National Pilot Software Engineering School), Beijing University of Posts and Telecommunications, Beijing 100876, China
Budan Wu: State Key Laboratory of Networking and Switching Technology, School of Computer Science (National Pilot Software Engineering School), Beijing University of Posts and Telecommunications, Beijing 100876, China
Xin Zhao: Economic & Research Institute, State Grid Shandong Electric Power Company, Jinan 250021, China
Hua Zou: State Key Laboratory of Networking and Switching Technology, School of Computer Science (National Pilot Software Engineering School), Beijing University of Posts and Telecommunications, Beijing 100876, China
Energies, 2023, vol. 16, issue 8, 1-13
Abstract:
The construction of smart grids has greatly changed the power grid pattern and power supply structure. For the power system, reasonable power planning and demand response are necessary to ensure stable operation. Accurate load prediction is the basis for realizing demand response in the power system. This paper proposes a Pre-Attention-CNN-GRU model (PreAttCG), which combines a convolutional neural network (CNN) and a gated recurrent unit (GRU) and applies an attention mechanism in front of the whole model. The PreAttCG model accepts historical load data and more than nine other factors (including temperature, wind speed, humidity, etc.) as input. The attention layer and CNN layer effectively extract the features and weights of each factor. Load forecasting is then performed by the prediction layer, which consists of a stacked GRU. The model is verified on real-world industrial load data from a German dataset and a Chinese dataset. The results show that the PreAttCG model performs better (a 3–5% improvement in MAPE) than both an LSTM with only load input and an LSTM with all factors. Additionally, the experiments show that the attention mechanism can effectively extract the weights of the relevant factors affecting the load data.
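The pipeline the abstract describes (pre-attention over the input factors, a convolutional feature extractor, then a stacked-GRU prediction layer) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the layer sizes, the softmax form of the attention, the single-unit GRU, and all weight values are assumptions chosen for brevity.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)

# Toy multivariate input: 24 time steps, 10 factors
# (historical load, temperature, wind speed, humidity, ...).
T, F = 24, 10
x = rng.normal(size=(T, F))

# 1) Pre-attention: score each input factor and re-weight the input
#    before the rest of the model sees it (hypothetical weight matrix).
W_att = rng.normal(size=(F, F)) * 0.1
weights = softmax(x @ W_att)            # (T, F) factor weights per step
x_att = x * weights

# 2) CNN layer: a width-3 convolution along the time axis extracts
#    local features from the re-weighted factors (one filter shown).
kernel = rng.normal(size=(3, F)) * 0.1
conv = np.array([(x_att[t:t + 3] * kernel).sum() for t in range(T - 2)])

# 3) Prediction layer: a GRU over the convolved sequence
#    (single unit with fixed toy gains, for readability).
h = 0.0
for c in conv:
    z = 1.0 / (1.0 + np.exp(-(0.5 * c + 0.5 * h)))   # update gate
    r = 1.0 / (1.0 + np.exp(-(0.3 * c + 0.3 * h)))   # reset gate
    h_tilde = np.tanh(0.8 * c + 0.8 * (r * h))       # candidate state
    h = (1.0 - z) * h + z * h_tilde

# 4) Linear head maps the final hidden state to the load forecast.
forecast = 2.0 * h + 0.1
print(float(forecast))
```

In the paper the attention weights learned in step 1 are what let the model expose how strongly each exogenous factor influences the load, which is the interpretability result the experiments report.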
Keywords: load prediction; attention; convolutional neural network; gated recurrent unit (search for similar items in EconPapers)
JEL-codes: Q Q0 Q4 Q40 Q41 Q42 Q43 Q47 Q48 Q49 (search for similar items in EconPapers)
Date: 2023
References: View references in EconPapers View complete reference list from CitEc
Citations: View citations in EconPapers (1)
Downloads: (external link)
https://www.mdpi.com/1996-1073/16/8/3446/pdf (application/pdf)
https://www.mdpi.com/1996-1073/16/8/3446/ (text/html)
Related works:
This item may be available elsewhere in EconPapers: Search for items with the same title.
Persistent link: https://EconPapers.repec.org/RePEc:gam:jeners:v:16:y:2023:i:8:p:3446-:d:1123574
Energies is currently edited by Ms. Agatha Cao
More articles in Energies from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager ().