DR-RQL: A Sustainable Demand Response-Based Learning System for Energy Scheduling and Battery Health Estimation

Journal Article (2025)
Author(s)

Kailian Deng (Donghua University, Ministry of Education)

Hongtao Zhang (Ministry of Education, Donghua University)

Zihao Cui (Ministry of Education, Donghua University)

Zhongyi Zha (Donghua University, Huazhong University of Science and Technology)

Shuyi Gao (TU Delft - Intelligent Electrical Power Grids)

Shuai Yan (Ministry of Education, Donghua University)

Yicun Hua (Ministry of Education, Donghua University)

Xiaojie Liu (Ministry of Education, Donghua University)

Shaoxuan Xu (Ministry of Education, Donghua University)


Research Group
Intelligent Electrical Power Grids
DOI related publication
https://doi.org/10.3390/su172410970
Publication Year
2025
Language
English
Journal title
Sustainability
Issue number
24
Volume number
17
Article number
10970
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Given the uncertainty from renewable production, local loads, and battery operating states in microgrids, it is vital to develop an efficient energy management scheme that improves system economics and enhances grid reliability. In this paper, we consider a renewable-integrated microgrid scenario including an energy storage system (ESS), bidirectional energy flow from/to the conventional power grid, and ESS health estimation. We develop a novel demand response-based scheme for microgrid energy management with a long short-term memory (LSTM) network and reinforcement learning (RL), aiming to improve the system operating profit from energy trading and reduce the battery health cost from energy scheduling. Specifically, to overcome future uncertainty, we utilize an LSTM to forecast the unknown demand and electricity price. To obtain the desired ESS control scheme, we apply RL to learn an optimal energy-scheduling strategy. To improve the critical performance of the RL paradigm, we propose a random greedy strategy to encourage exploration. Numerical results show that our proposed algorithm outperforms the baselines, improving the system operating profit by 8.27% and 17.31%, respectively, while ensuring ESS operating safety. By integrating energy efficiency with sustainable energy management practices, our scheme contributes to long-term environmental and economic resilience.
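The abstract's "random greedy strategy" for encouraging exploration can be illustrated with a minimal sketch. The paper's exact formulation is not given here, so the following assumes a standard epsilon-greedy-style rule over a discrete set of ESS actions (e.g., charge/idle/discharge); the function name, epsilon value, and Q-value representation are illustrative only.

```python
import random

def random_greedy_action(q_values, epsilon=0.1, rng=random):
    """Select an ESS control action from estimated action values.

    With probability `epsilon`, a uniformly random action is chosen
    (exploration); otherwise the action with the highest estimated
    value is chosen (exploitation). This is a generic sketch, not
    the paper's exact random greedy strategy.
    """
    if rng.random() < epsilon:
        return rng.randrange(len(q_values))          # explore
    return max(range(len(q_values)), key=lambda a: q_values[a])  # exploit

# Example: actions 0=charge, 1=idle, 2=discharge with hypothetical values.
action = random_greedy_action([0.2, -0.1, 0.7], epsilon=0.0)
```

With `epsilon=0.0` the rule is purely greedy and returns the highest-valued action; raising epsilon trades off exploitation for exploration during RL training.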