A Learning and Operation Planning Method for Uber Energy Storage System: Order Dispatch
Research output: Journal Publications and Reviews › RGC 21 - Publication in refereed journal › peer-review
Author(s)
Tao, Yuechuan; Qiu, Jing; Lai, Shuying
Detail(s)
Original language | English |
---|---|
Pages (from-to) | 23070-23083 |
Journal / Publication | IEEE Transactions on Intelligent Transportation Systems |
Volume | 23 |
Issue number | 12 |
Online published | 26 Aug 2022 |
Publication status | Published - Dec 2022 |
Externally published | Yes |
Abstract
With the increasing penetration of intermittent renewable energy resources, electricity distribution networks may face many challenges in terms of system security and reliability. In this context, mobile power sources can provide various distribution network services, including load leveling, peak shaving, voltage regulation, and emergency backup. Unlike a stationary energy storage system (SESS), mobile power sources offer advantages in mobility and flexibility. In the current literature, the dispatch of mobile power sources relies on optimization-based day-ahead scheduling, which lacks the capability to respond to emerging power problems. To enable mobile power sources to provide on-demand local services efficiently, an uber energy storage system (UESS) is presented based on a learning and planning integrated approach. First, each bus generates a UESS service order. Then, a centralized platform dispatches the orders to the UESSs by solving a planning problem. To solve the dispatch problem efficiently, we convert the optimization to a bipartite graph problem with low complexity. With the assistance of deep reinforcement learning, the value of each order dispatch action is learned, and the weight of each edge in the bipartite graph equals the corresponding action value. The proposed method is verified in case studies. Simulation results reveal that the daily cost savings and the finish rate of the proposed method are around $1562 and 15% higher, respectively, than those of the nearest rule. Compared with the case without UESS, the voltage violation and power loss issues are alleviated, and the profit of the system operator can be increased by 16.7%. Compared with SESS, the load curtailment cost with UESS can be reduced by 21.9k under contingency, which indicates that the resilience of the system is enhanced. Compared with a mobile energy storage system (MESS), UESS can better realize load restoration under emergencies. © 2022 IEEE.
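The core idea in the abstract — casting order dispatch as a maximum-weight bipartite matching whose edge weights are learned action values — can be sketched as follows. This is an illustrative toy, not the paper's implementation: the matrix `Q` stands in for the deep-RL action values (here hypothetical constants), and the matching is solved by brute force, which is only viable for tiny instances.

```python
from itertools import permutations

# Hypothetical action values Q[i][j]: learned value of dispatching
# service order i to UESS unit j. In the paper these come from deep
# reinforcement learning; here they are illustrative constants.
Q = [
    [4.0, 1.0, 2.5],  # order 0
    [2.0, 3.5, 1.0],  # order 1
    [1.5, 2.0, 3.0],  # order 2
]

def dispatch(values):
    """Maximum-weight bipartite matching by exhaustive search.

    Returns (assignment, total_value), where assignment[i] is the
    UESS unit matched to order i. Brute force over permutations is
    fine for this tiny example; real instances would use an
    assignment solver (e.g. the Hungarian algorithm).
    """
    n = len(values)
    best_total, best_assign = float("-inf"), None
    for perm in permutations(range(n)):
        total = sum(values[i][perm[i]] for i in range(n))
        if total > best_total:
            best_total, best_assign = total, perm
    return best_assign, best_total

assign, total = dispatch(Q)
print(assign, total)  # → (0, 1, 2) 10.5
```

The separation the abstract describes is visible here: learning produces the edge weights, while planning reduces to a standard matching problem over them, which keeps the online dispatch step cheap.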
Research Area(s)
- Bipartite graph, deep reinforcement learning, order dispatch, uber energy storage system
Citation Format(s)
A Learning and Operation Planning Method for Uber Energy Storage System: Order Dispatch. / Tao, Yuechuan; Qiu, Jing; Lai, Shuying.
In: IEEE Transactions on Intelligent Transportation Systems, Vol. 23, No. 12, 12.2022, p. 23070-23083.