U.S. Department of Energy
Office of Scientific and Technical Information

Restoring Distribution System Under Renewable Uncertainty Using Reinforcement Learning

Conference
Distributed energy resources (DERs) in distribution systems, including renewable generation, micro-turbines, and energy storage, can be used to restore critical loads after extreme events and thereby increase grid resiliency. However, properly coordinating multiple DERs through a multi-step restoration process under renewable uncertainty and limited fuel availability is a complicated sequential optimal control problem. Because of its ability to handle system nonlinearity and uncertainty, reinforcement learning (RL) stands out as a potentially powerful candidate for solving complex sequential control problems. Moreover, because RL is trained offline, its control actions are immediately available during online operation, making it well suited to problems such as load restoration, where timely, correct, and coordinated actions are needed. In this study, prioritized load restoration on a simplified single-bus distribution system is examined: given an imperfect renewable generation forecast, the performance of an RL controller is compared with that of a deterministic model predictive control (MPC) baseline. Our experimental results show that the RL controller is able to learn from experience, adapt to imperfect forecast information, and provide a more reliable restoration process than the baseline controller.
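To make the sequential decision problem concrete, the following is a minimal, hypothetical sketch of a single-bus load-restoration environment of the kind the abstract describes. All class names, parameter values, and the noise model are illustrative assumptions, not the paper's actual formulation: at each step a controller decides how much load to pick up, actual renewable output deviates from its forecast, energy storage covers any deficit, and unserved (shed) load is penalized.

```python
import random


class SingleBusRestorationEnv:
    """Toy single-bus restoration environment (illustrative assumption,
    not the model used in the paper).

    State: (time step, storage state of charge).
    Action: load (kW) the controller attempts to restore this step.
    Reward: restored load minus a heavy penalty on shed load.
    """

    def __init__(self, horizon=12, storage_kwh=50.0, forecast=None, seed=0):
        self.rng = random.Random(seed)
        self.horizon = horizon
        self.storage_kwh = storage_kwh
        # Renewable forecast in kW per step; actual output will deviate.
        self.forecast = forecast or [30.0] * horizon
        self.reset()

    def reset(self):
        self.t = 0
        self.soc = self.storage_kwh  # start with full storage
        return (self.t, self.soc)

    def step(self, load_kw):
        # Renewable uncertainty: actual output = forecast + bounded noise.
        actual = max(0.0, self.forecast[self.t] + self.rng.uniform(-10.0, 10.0))
        deficit = load_kw - actual
        if deficit > 0:
            # Cover the shortfall from storage; anything left over is shed.
            from_storage = min(deficit, self.soc)
            self.soc -= from_storage
            shed = deficit - from_storage
        else:
            # Surplus renewable energy charges the storage.
            self.soc = min(self.storage_kwh, self.soc - deficit)
            shed = 0.0
        reward = (load_kw - shed) - 5.0 * shed  # penalize unserved load
        self.t += 1
        done = self.t >= self.horizon
        return (self.t, self.soc), reward, done


# A deterministic, forecast-trusting policy (a stand-in for the MPC
# baseline's behavior): restore exactly the forecast amount each step.
env = SingleBusRestorationEnv(seed=1)
total_reward, done = 0.0, False
while not done:
    _, r, done = env.step(env.forecast[env.t])
    total_reward += r
```

An RL controller trained on many such episodes can learn to restore slightly less than the forecast when storage is low, trading a little restored load for fewer shedding events under forecast error; a deterministic controller that trusts the forecast exactly pays the shedding penalty whenever actual output falls short and storage is exhausted.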
Research Organization:
National Renewable Energy Laboratory (NREL), Golden, CO (United States)
Sponsoring Organization:
USDOE Office of Electricity, Advanced Grid Modeling Program
DOE Contract Number:
AC36-08GO28308
OSTI ID:
1766867
Report Number(s):
NREL/CP-2C00-79160
Country of Publication:
United States
Language:
English