by Staff Writers Beijing (SPX) Jul 03, 2015
The plug-in hybrid electric bus (PHEB) is now widely used as public transportation in many Chinese cities. Compared with a conventional bus, it can achieve better fuel economy because it draws electric energy from the grid, which is considerably cheaper than fossil fuel.

In recent years, many approaches have been applied to the energy management problem, which is formulated via optimal control theory, including dynamic programming (DP), fuzzy logic control, Pontryagin's Minimum Principle, and model predictive control. If these techniques are to be applied online, the control strategy must incorporate some form of driving-cycle prediction. To this end, several modeling methods have been proposed that estimate the fuel consumption cost function with a Markov chain giving the transition probabilities of a set of torque demands, and then solve the cost function with stochastic dynamic programming (SDP).

Given the characteristics of city-bus driving cycles, their regularities can readily be extracted from collected historical data, so SDP appears to be the most appropriate algorithm for optimizing the energy management of a PHEB. However, designing an optimal energy management strategy with SDP faces two challenges. First, the cost function of the SDP algorithm is built with a basic discretization that takes a constant value over each discretization interval. Second, the discretization approach suffers from the "curse of dimensionality".

This paper describes an alternative approach for finding a control strategy with a stochastic Markov model of PHEB energy management, in which the cost function is approximated directly without resorting to discretization. Because a statistical learning method is introduced, it is not necessary to know all of the parameters of the Markov decision process (MDP) model, and the approximation reduces the computational burden of the problem.

The PHEB structure discussed in the paper is a typical single-shaft parallel configuration (Figure 1 in the paper). For such a PHEB, a cost function of fuel and electricity consumption based on the MDP is presented. A learning method is then proposed to search for the minimum of this cost function and to obtain the optimal control strategy simultaneously. In the proposed method, a simpler function approximates the cost function, and a linear regression step makes the problem much easier to solve. Moreover, sample data are easy to obtain because a PHEB runs on a fixed route many times.

The driving cycle used for simulation runs from Yudong station to Nanping station in Chongqing and includes 32 bus stops. Quantitatively, simulation results with three strategies, CDCS (charge-depleting/charge-sustaining), the proposed MDP strategy, and DP, show that the energy consumption of the MDP strategy is higher than that of the standard DP algorithm but significantly lower than that of the CDCS strategy. A test on a real PHEB was also carried out to verify the applicability of the proposed method.

Sun Y, Chen Z, Yan B J, et al. A learning method for energy optimization of the plug-in hybrid electric bus. Sci China Tech Sci, doi: 10.1007/s11431-015-5852-x
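The paper does not publish its implementation, so the short Python sketch below only illustrates the general idea described above: the cost-to-go of a torque-demand Markov chain is fitted with a small linear model via least-squares regression instead of being stored in a dense discretized table. All names and numbers here (N_TORQUE, N_SOC, the stage_cost and next_soc models, and the random P_torque transition matrix) are hypothetical placeholders, not values or code from the study.

```python
# Minimal sketch of fitted value iteration with a linear cost-function
# approximation for a torque-demand Markov chain (illustrative assumptions only).
import numpy as np

rng = np.random.default_rng(0)

N_TORQUE, N_SOC, N_ACTIONS = 8, 10, 5   # hypothetical grid sizes
GAMMA = 0.95                            # discount factor

# Torque-demand transition probabilities; the paper estimates these from
# historical data of a fixed bus route, here they are random placeholders.
P_torque = rng.dirichlet(np.ones(N_TORQUE), size=N_TORQUE)

def features(torque, soc):
    """Low-order polynomial features: the cost-to-go is a linear model, not a lookup table."""
    t, s = torque / N_TORQUE, soc / N_SOC
    return np.array([1.0, t, s, t * s, t ** 2, s ** 2])

def stage_cost(torque, soc, action):
    """Illustrative one-step cost combining fuel use and electricity use."""
    engine_share = action / (N_ACTIONS - 1)
    fuel = engine_share * (torque + 1)  # more engine torque -> more fuel
    electric = (1.0 - engine_share) * (torque + 1) * (1 + (N_SOC - soc) / N_SOC)
    return 0.7 * fuel + 0.3 * electric

def next_soc(soc, torque, action):
    """Battery state of charge drops faster when the motor covers more of the demand."""
    drain = (1.0 - action / (N_ACTIONS - 1)) * (torque / N_TORQUE)
    return int(np.clip(soc - round(2 * drain), 0, N_SOC - 1))

# Fitted value iteration: regress the Bellman backup onto the linear features,
# so no dense discretized value table is ever stored.
theta = np.zeros(features(0, 0).size)
states = [(t, s) for t in range(N_TORQUE) for s in range(N_SOC)]

for _ in range(200):
    X, y = [], []
    for torque, soc in states:
        q_values = []
        for a in range(N_ACTIONS):
            soc2 = next_soc(soc, torque, a)
            # Expectation over the stochastic torque-demand transition.
            ev = sum(P_torque[torque, t2] * (features(t2, soc2) @ theta)
                     for t2 in range(N_TORQUE))
            q_values.append(stage_cost(torque, soc, a) + GAMMA * ev)
        X.append(features(torque, soc))
        y.append(min(q_values))
    # Ordinary least-squares regression replaces the per-state table update.
    theta, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)

def policy(torque, soc):
    """Greedy engine/motor torque split under the learned approximate cost-to-go."""
    best_a, best_q = 0, float("inf")
    for a in range(N_ACTIONS):
        soc2 = next_soc(soc, torque, a)
        ev = sum(P_torque[torque, t2] * (features(t2, soc2) @ theta) for t2 in range(N_TORQUE))
        q = stage_cost(torque, soc, a) + GAMMA * ev
        if q < best_q:
            best_a, best_q = a, q
    return best_a

print("Engine-torque level at medium demand, half SOC:", policy(4, 5))
```

The point of the sketch is the design choice the article highlights: the Bellman backup is projected onto a handful of continuous features by linear regression rather than tabulated over a fine discretization, which is what lets this style of method sidestep the curse of dimensionality.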