Development and Evaluation of a Smart Charging Strategy for an Electric Vehicle Fleet Based on Reinforcement Learning
Felix Tuchnitz, Niklas Ebell, Jonas Schlund and Marco Pruckner
Applied Energy, 2021, vol. 285, issue C, No S0306261920317566
Abstract:
Governments are currently subsidizing growth in the electric car market and the associated infrastructure in order to accelerate the transition to more sustainable mobility. To avoid the grid overload that results from simultaneously charging too many electric vehicles, there is a need for smart charging coordination systems. In this paper, we propose a charging coordination system based on Reinforcement Learning using an artificial neural network as a function approximator. Taking into account the baseload present in the power grid, a central agent creates forward-looking, coordinated charging schedules for an electric vehicle fleet of any size. In contrast to optimization-based charging strategies, system dynamics such as future arrivals, departures, and energy consumption do not have to be known beforehand. We implement and compare a range of parameter variants that differ in terms of the reward function and prioritized experience replay. Subsequently, we use a case study to compare our Reinforcement Learning algorithm with several other charging strategies. The Reinforcement Learning-based charging coordination system is shown to perform very well. All electric vehicles have enough energy for their next trip on departure and charging is carried out almost exclusively during the load valleys at night. Compared with an uncontrolled charging strategy, the Reinforcement Learning algorithm reduces the variance of the total load by 65%. The performance of our Reinforcement Learning concept comes close to that of an optimization-based charging strategy. However, an optimization algorithm needs to know certain information beforehand, such as the vehicle’s departure time and its energy requirement on arriving at the charging station. Our novel Reinforcement Learning-based charging coordination system therefore offers a flexible, easily adaptable, and scalable approach for an electric vehicle fleet under realistic operating conditions.
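To illustrate the evaluation metric the abstract refers to (variance of the total load, i.e. baseload plus charging load), the sketch below compares an uncontrolled charging profile against a valley-filling one on a hypothetical 8-step load profile. The profiles and numbers are invented for illustration only and are not the paper's data or its Reinforcement Learning algorithm.

```python
import statistics

def load_variance(baseload, charging):
    """Population variance of the total load (baseload + charging) over all time steps."""
    total = [b + c for b, c in zip(baseload, charging)]
    return statistics.pvariance(total)

# Hypothetical baseload with a night-time valley (first four steps) and an evening peak.
baseload = [2.0, 2.0, 3.0, 3.0, 6.0, 7.0, 7.0, 6.0]

# Uncontrolled: vehicles charge immediately on arrival, on top of the evening peak.
uncontrolled = [0.0, 0.0, 0.0, 0.0, 2.0, 2.0, 2.0, 2.0]

# Valley filling: the same total charging energy shifted into the load valley.
valley_filled = [2.0, 2.0, 2.0, 2.0, 0.0, 0.0, 0.0, 0.0]

v_unc = load_variance(baseload, uncontrolled)
v_val = load_variance(baseload, valley_filled)
reduction = 1.0 - v_val / v_unc  # fraction of total-load variance removed by valley filling
```

In this toy setting, shifting the charging load into the valley sharply reduces the variance of the total load, which is the effect the paper quantifies (a 65% reduction versus uncontrolled charging in its case study).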
Keywords: Machine Learning; Reinforcement Learning; Electric vehicle; Smart charging; Valley filling; Load balancing
Date: 2021
Citations: 30 (as listed in EconPapers)
Downloads: http://www.sciencedirect.com/science/article/pii/S0306261920317566 (full text for ScienceDirect subscribers only)
Persistent link: https://EconPapers.repec.org/RePEc:eee:appene:v:285:y:2021:i:c:s0306261920317566
Ordering information: This journal article can be ordered from
http://www.elsevier.com/wps/find/journaldescription.cws_home/405891/bibliographic
DOI: 10.1016/j.apenergy.2020.116382
Applied Energy is currently edited by J. Yan
Bibliographic data for series maintained by Catherine Liu.