Abstract
The traditional model-based energy management strategy (EMS) for regenerative braking energy storage systems (RBESS) is becoming obsolete in the face of increasingly complex and uncertain operating conditions in railway power systems (RPS). In this paper, a model-free deep reinforcement learning (DRL) method is proposed. First, the multi-objective energy management problem for RBESS is formulated to concurrently achieve regenerative braking energy (RBE) utilization and power demand shaving of the RPS. Then, this problem is modeled as a Markov decision process (MDP) to be solved by the DRL-based method. Specifically, the RBESS controller is modeled as an agent that interacts with the environment, modeled as the RPS integrated with the RBESS. To guide the agent in learning optimal strategies for multiple energy management objectives across different time scales, a multistage reward function comprising a step reward and a final reward is designed. Based on these elements, the double deep Q-learning algorithm is applied to train the agent and optimize the EMS. Finally, the proposed DRL-based EMS is tested on the OPAL-RT experimental platform using field load data. Case studies demonstrate that the proposed method outperforms traditional rule-based and optimization-based methods by over 5% in the energy management objective.
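The abstract's training step rests on the standard double deep Q-learning target, which decouples action selection (online network) from action evaluation (target network). The sketch below is a minimal, hypothetical illustration of that target computation, not the authors' implementation; the helper name, the toy Q-values, and the scalar `reward` are all assumptions for illustration.

```python
import numpy as np

def double_dqn_target(reward, gamma, q_online_next, q_target_next, done):
    """Double DQN bootstrap target (hypothetical helper, not the paper's code).

    The greedy action is chosen with the online network's Q-values,
    but its value is read from the target network, which reduces the
    overestimation bias of vanilla deep Q-learning.
    """
    if done:
        # Terminal transition: no bootstrapped future value.
        return reward
    a_star = int(np.argmax(q_online_next))          # selection: online net
    return reward + gamma * q_target_next[a_star]   # evaluation: target net

# Toy example: the online net prefers action 1, which the target net
# values at 2.0, so y = 1.0 + 0.9 * 2.0 = 2.8.
y = double_dqn_target(reward=1.0, gamma=0.9,
                      q_online_next=np.array([0.5, 3.0]),
                      q_target_next=np.array([4.0, 2.0]),
                      done=False)
```

In the paper's setting, `reward` would come from the multistage reward function (the step reward each control interval, plus the final reward at the end of an episode), with the RPS-plus-RBESS simulation acting as the environment.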
© 2025 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission.
| Original language | English |
|---|---|
| Journal | IEEE Transactions on Transportation Electrification |
| DOIs | |
| Publication status | Online published - 10 Jan 2025 |
Research Keywords
- deep reinforcement learning
- energy management
- energy storage system
- railway power system
- regenerative braking energy
Title: 'Multi-timescale Reward-based DRL Energy Management for Regenerative Braking Energy Storage System'