Intelligent Energy Management Control for Extended Range Electric Vehicles Based on Dynamic Programming and Neural Network
Lihe Xi,
Xin Zhang,
Chuanyang Sun,
Zexing Wang,
Xiaosen Hou and
Jibao Zhang
Additional contact information
Lihe Xi, Xin Zhang, Chuanyang Sun, Xiaosen Hou and Jibao Zhang: Beijing Key Laboratory of Powertrain for New Energy Vehicle, Beijing Jiaotong University, Beijing 100044, China
Zexing Wang: Beijing Electric Vehicle Co., Ltd., Beijing 102606, China
Energies, 2017, vol. 10, issue 11, 1-18
Abstract:
An extended range electric vehicle (EREV) can absorb more clean energy from the electric grid if it arrives at the charging station with its battery energy near the lower bound, while consuming as little gasoline as possible during the trip is a common goal of most energy management controllers. To achieve both objectives, an intelligent energy management controller for the EREV based on dynamic programming and neural networks (IEMC_NN) is proposed. The control objectives are formulated as a cost function, and the split ratio of the power demand between the range extender and the battery is optimized offline by dynamic programming (DP). The online controllers are neural networks (NN) trained on the DP results; three controllers, trained on three typical driving-cycle lengths, constitute the controller library of IEMC_NN. To select an appropriate NN controller for a given trip, a selection module is developed based on the remaining battery energy and the driving distance to the charging station. The performance of IEMC_NN is validated under three simulation conditions: the target driving distance is known, the target driving distance is unknown, and the destination is changed during the trip. Simulation results show that IEMC_NN achieves better fuel economy than the charge-depleting/charge-sustaining (CD/CS) strategy. More significantly, with known driving distance information, the battery SOC controlled by IEMC_NN just reaches its lower bound as the EREV arrives at the charging station, and this remains feasible when the driver changes the destination during the trip.
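As a rough illustration of the offline DP stage described in the abstract, the Python sketch below minimizes total fuel over a discretized drive cycle, treating the range-extender power at each step as the decision variable and the battery SOC as the state. The fuel-rate curve, battery capacity, grids and SOC bounds are illustrative assumptions, not the paper's calibrated vehicle model.

import numpy as np

# Minimal sketch of an offline DP power-split optimization, assuming a
# simplified quasi-static model: at each step the power demand p_dem[k] is
# split between the range extender (p_ext) and the battery (p_dem - p_ext).
# All numbers below are illustrative placeholders.

DT = 1.0                                   # time step [s]
E_BATT = 10.0 * 3.6e6                      # usable battery energy [J] (assumed 10 kWh)
SOC_GRID = np.linspace(0.20, 0.95, 76)     # discretized state of charge
P_EXT_GRID = np.linspace(0.0, 30e3, 31)    # candidate extender powers [W]
SOC_MIN = 0.25                             # required SOC lower bound at the charging station

def fuel_rate(p_ext):
    """Assumed convex fuel-rate curve [g/s] for the range extender."""
    return 0.05 + 4.5e-5 * p_ext + 3.0e-10 * p_ext**2 if p_ext > 0 else 0.0

def next_soc(soc, p_batt):
    """Battery SOC update over one step (simple energy-counting surrogate)."""
    return soc - p_batt * DT / E_BATT

def dp_power_split(p_dem):
    """Backward DP: minimize total fuel, requiring terminal SOC >= SOC_MIN."""
    n = len(p_dem)
    cost_to_go = np.where(SOC_GRID >= SOC_MIN, 0.0, np.inf)  # terminal penalty
    policy = np.zeros((n, SOC_GRID.size))
    for k in range(n - 1, -1, -1):
        new_cost = np.full(SOC_GRID.size, np.inf)
        for i, soc in enumerate(SOC_GRID):
            for p_ext in P_EXT_GRID:
                soc_next = next_soc(soc, p_dem[k] - p_ext)
                if not (SOC_GRID[0] <= soc_next <= SOC_GRID[-1]):
                    continue
                j = np.argmin(np.abs(SOC_GRID - soc_next))    # nearest-grid lookup
                c = fuel_rate(p_ext) * DT + cost_to_go[j]
                if c < new_cost[i]:
                    new_cost[i], policy[k, i] = c, p_ext
        cost_to_go = new_cost
    return policy  # optimal extender power for each (time step, SOC grid point)

In the framework described by the authors, power-split trajectories obtained this way would serve as training data for the neural-network controllers, and the online selection module then chooses among the three trained controllers according to the remaining battery energy and the distance left to the charging station.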
Keywords: energy management strategy; extended range electric vehicle; dynamic programming; neural network; state of charge
JEL-codes: Q Q0 Q4 Q40 Q41 Q42 Q43 Q47 Q48 Q49
Date: 2017
Citations: 9 (as indexed by EconPapers)
Downloads: (external link)
https://www.mdpi.com/1996-1073/10/11/1871/pdf (application/pdf)
https://www.mdpi.com/1996-1073/10/11/1871/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jeners:v:10:y:2017:i:11:p:1871-:d:118931