Managing offshore wind turbines through Markov decision processes and dynamic Bayesian networks


Abstract

Efficient planning of inspection and maintenance (I&M) actions in civil and maritime environments is of paramount importance for balancing management costs against the failure risk induced by deterioration mechanisms. Determining I&M policies for such cases constitutes a complex sequential decision-making optimization problem under uncertainty. Addressing this complexity, Partially Observable Markov Decision Processes (POMDPs) provide a principled mathematical methodology for stochastic optimal control, in which the optimal actions are prescribed as a function of the entire, dynamically updated, state probability distribution. As shown in this paper, by integrating Dynamic Bayesian Networks (DBNs) with POMDPs, advanced algorithmic schemes of probabilistic inference and decision optimization under uncertainty can be uniquely combined into an efficient planning platform. To demonstrate the capabilities of the proposed approach, POMDP-based and heuristic-based I&M policies are compared, with emphasis on an offshore wind substructure subject to fatigue deterioration. Results verify that POMDP solutions offer substantially reduced costs compared to their heuristic counterparts, even in traditional problem settings.
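The core mechanism the abstract refers to, maintaining and updating a state probability distribution (belief) as inspection outcomes arrive, can be illustrated with a minimal sketch. All numbers below are hypothetical placeholders, not the paper's fatigue model: a 3-state deterioration chain (intact, damaged, failed) with an assumed yearly transition matrix and an imperfect-inspection likelihood, filtered with one DBN-style predict-correct step per year.

```python
import numpy as np

# Hypothetical 3-state deterioration model (intact, damaged, failed).
# T[i, j] = P(next state j | current state i); "failed" is absorbing.
T = np.array([[0.90, 0.09, 0.01],
              [0.00, 0.85, 0.15],
              [0.00, 0.00, 1.00]])

# O[i, z] = P(inspection outcome z | state i), z = 0 (no detection), 1 (detection).
O = np.array([[0.95, 0.05],
              [0.30, 0.70],
              [0.05, 0.95]])

def belief_update(b, obs):
    """One filtering step: predict through the transition model, then
    apply a Bayes correction with the inspection likelihood."""
    b_pred = b @ T                 # prediction (Chapman-Kolmogorov step)
    b_post = b_pred * O[:, obs]    # correction with the observed outcome
    return b_post / b_post.sum()   # renormalize to a probability vector

# Start in the intact state; two clean inspections, then a detection.
b = np.array([1.0, 0.0, 0.0])
for obs in (0, 0, 1):
    b = belief_update(b, obs)
print(b.round(3))
```

A POMDP policy then maps this belief vector (rather than a single assumed state) to an I&M action, which is what allows it to weigh repair costs against the dynamically updated failure risk.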