TY - JOUR
T1 - Synergistic Optimization of Virtual-Shared Energy Storage in Renewables-Rich Microgrids via Asynchronous Curriculum Reinforcement Learning
AU - Hua, Haochen
AU - Gong, Jiakai
AU - Chen, Xingying
AU - Gu, Chenghong
AU - Lei, Shunbo
AU - Yu, Kun
AU - Liu, Di
AU - Hua, Weiqi
PY - 2025/12/12
Y1 - 2025/12/12
N2 - Demand-side energy storage and flexible loads are crucial for enhancing the stability and economy of microgrid operation. However, integrated uncertainties and heterogeneous demand-side resources complicate coordinated scheduling, resulting in suboptimal utilization efficiency. This paper proposes a novel curriculum reinforcement learning architecture for the collaborative scheduling of shared energy storage and flexible loads. Flexible loads are modeled as virtual energy storage to increase the regulation capability of the system. A semi-coupled asynchronous optimization method based on reinforcement learning is proposed to solve the complex scheduling problem. Furthermore, an adaptive curriculum learning method is designed to improve algorithm performance in uncertain environments by adaptively adjusting the difficulty of training tasks. Simulation results show that the proposed energy scheduling method significantly improves the microgrid operator’s revenue compared with baseline methods, improving convergence speed by 34.92% and economy by 6.71% over non-curriculum learning.
AB - Demand-side energy storage and flexible loads are crucial for enhancing the stability and economy of microgrid operation. However, integrated uncertainties and heterogeneous demand-side resources complicate coordinated scheduling, resulting in suboptimal utilization efficiency. This paper proposes a novel curriculum reinforcement learning architecture for the collaborative scheduling of shared energy storage and flexible loads. Flexible loads are modeled as virtual energy storage to increase the regulation capability of the system. A semi-coupled asynchronous optimization method based on reinforcement learning is proposed to solve the complex scheduling problem. Furthermore, an adaptive curriculum learning method is designed to improve algorithm performance in uncertain environments by adaptively adjusting the difficulty of training tasks. Simulation results show that the proposed energy scheduling method significantly improves the microgrid operator’s revenue compared with baseline methods, improving convergence speed by 34.92% and economy by 6.71% over non-curriculum learning.
KW - curriculum learning
KW - demand-side resource
KW - reinforcement learning
KW - virtual energy storage
UR - https://www.scopus.com/pages/publications/105024683110
U2 - 10.1109/TSG.2025.3643481
DO - 10.1109/TSG.2025.3643481
M3 - Article
AN - SCOPUS:105024683110
SN - 1949-3053
JO - IEEE Transactions on Smart Grid
JF - IEEE Transactions on Smart Grid
ER -