Title: Configuration of the Actor and Critic Network of the Deep Reinforcement Learning controller for Multi-Energy Storage System

Authors:
Páramo-Balsa, Paula (University of Seville)
Gonzalez-Longatt, Francisco (University of South-Eastern Norway)
Acosta, Martha N. (University of South-Eastern Norway)
Rueda, José L. (TU Delft Intelligent Electrical Power Grids)
Palensky, P. (TU Delft Intelligent Electrical Power Grids)
Sanchez, Francisco (Loughborough University)
Roldan-Fernandez, Juan Manuel (University of Seville)
Burgos-Payán, Manuel (University of Seville)

Date: 2022

Abstract: The computational burden and the time required to train a deep reinforcement learning (DRL) controller can be appreciable, especially for a DRL controller used for frequency control of a multi-electrical energy storage system (MEESS). This paper assesses four training configurations of the actor and critic networks to determine which configuration produces the lowest computational time for the specific case of MEESS frequency control. The training configurations are defined over two processing units, CPU and GPU, and are evaluated under both serial and parallel computing using the MATLAB® 2020b Parallel Computing Toolbox. The agent used for this assessment is the Deep Deterministic Policy Gradient (DDPG) agent. The environment represents the dynamic model that provides enhanced frequency response to the power system by controlling the state of charge of the energy storage systems. Simulation results demonstrate that the configuration with the lowest computational time is training both the actor and critic networks on the CPU using parallel computing.
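The four training configurations assessed in the abstract (actor and critic networks on CPU or GPU, trained serially or in parallel) can be sketched as a small selection harness. This is an illustrative sketch in Python, not the authors' code: the paper's experiments used MATLAB® 2020b and its Parallel Computing Toolbox, and all names and timing values below are hypothetical placeholders, with only the reported conclusion (CPU + parallel is fastest) taken from the abstract.

```python
from itertools import product

def training_configurations():
    """Enumerate the four assessed cases: both networks share one device,
    and training runs either serially or in parallel."""
    devices = ("cpu", "gpu")        # processing unit for actor and critic
    modes = ("serial", "parallel")  # computing mode for training
    return [{"actor_device": d, "critic_device": d, "mode": m}
            for d, m in product(devices, modes)]

def pick_fastest(timings):
    """Given measured wall-clock training time per (device, mode) case,
    return the configuration with the lowest computational time."""
    return min(timings, key=timings.get)

if __name__ == "__main__":
    configs = training_configurations()
    # Placeholder timings (minutes), invented for illustration only; the
    # paper reports CPU with parallel computing as the fastest configuration.
    timings = {("cpu", "serial"): 120, ("cpu", "parallel"): 45,
               ("gpu", "serial"): 90, ("gpu", "parallel"): 60}
    print(len(configs), pick_fastest(timings))
```

The harness simply treats the device/mode choice as a grid of four cases and selects the minimum measured training time, which mirrors the comparison the paper carries out empirically.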
Subject: actor network; critic network; deep reinforcement learning; energy storage systems; enhanced frequency response; parallel computing

To reference this document use: http://resolver.tudelft.nl/uuid:5ddd951e-8708-4961-b95e-12b0c5427d03
DOI: https://doi.org/10.1109/GPECOM55404.2022.9815793
Publisher: IEEE, Piscataway
Embargo date: 2023-01-11
ISBN: 978-1-6654-6926-5
Source: Proceedings of the 2022 4th Global Power, Energy and Communication Conference (GPECOM)
Event: 2022 4th Global Power, Energy and Communication Conference (GPECOM), 2022-06-14 → 2022-06-17, Nevsehir, Turkey
Bibliographical note: Green Open Access added to TU Delft Institutional Repository. 'You share, we take care!' - Taverne project, https://www.openaccess.nl/en/you-share-we-take-care. Otherwise as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.
Part of collection: Institutional Repository
Document type: conference paper
Rights: © 2022 Paula Páramo-Balsa, Francisco Gonzalez-Longatt, Martha N. Acosta, José L. Rueda, P. Palensky, Francisco Sanchez, Juan Manuel Roldan-Fernandez, Manuel Burgos-Payán
Files: Configuration_of_the_Acto ... System.pdf (1.09 MB)