Demonstrating Bayesian Online Learning for Energy-Aware Resource Orchestration in vRANs
Jose A. Ayala-Romero (Trinity College Dublin)
Andres Garcia-Saavedra (NEC Laboratories Europe)
Xavier Costa-Perez (I2CAT Foundation, Catalan Institution for Research and Advanced Studies (ICREA))
George Iosifidis (TU Delft - Embedded Systems)
Abstract
Radio Access Network Virtualization (vRAN) will spearhead the quest towards supple radio stacks that adapt to heterogeneous infrastructure: from energy-constrained platforms deploying cells-on-wheels (e.g., drones) or battery-powered cells to green edge clouds. We demonstrate a novel machine learning approach to solve resource orchestration problems in energy-constrained vRANs. Specifically, we demonstrate two algorithms: (i) BP-vRAN, which uses Bayesian online learning to balance performance and energy consumption, and (ii) SBP-vRAN, which augments our Bayesian optimization approach with safe controls that maximize performance while respecting hard power constraints. We show that our approaches are data-efficient, converging an order of magnitude faster than other machine learning methods, and provide provable performance guarantees, which are paramount for carrier-grade vRANs. We demonstrate the advantages of our approach in a testbed comprising fully-fledged LTE stacks and a power meter, and by implementing our approach in O-RAN's non-real-time RAN Intelligent Controller (RIC).
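To make the underlying idea concrete, the following is a minimal, self-contained sketch (not the authors' BP-vRAN/SBP-vRAN implementation) of a Gaussian-process-based Bayesian online learning loop: a GP-UCB rule selects the next control configuration, and the "safe" variant restricts the search to configurations whose pessimistic power estimate stays under a hard budget. The control knob, the toy performance/power models, the kernel hyperparameters, and the power budget are all illustrative assumptions.

```python
# Minimal sketch of Bayesian online learning with a safe (power-constrained)
# acquisition rule. All models and parameters below are illustrative
# assumptions, not the demonstrated BP-vRAN/SBP-vRAN algorithms.
import numpy as np

def rbf_kernel(A, B, length=0.2, var=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / length**2)

def gp_posterior(X, y, Xs, noise=1e-3):
    """GP posterior mean/std at test points Xs given observations (X, y)."""
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(np.diag(rbf_kernel(Xs, Xs)) - (v * v).sum(0), 1e-12, None)
    return mu, np.sqrt(var)

# Hypothetical vRAN feedback: performance and power as noisy functions of a
# normalized compute/radio control knob x in [0, 1].
def measure_performance(x):   # e.g., decoded throughput (arbitrary units)
    return np.tanh(6 * x) + 0.05 * np.random.randn()

def measure_power(x):         # e.g., Watts drawn by the BBU platform
    return 10 + 25 * x**1.5 + 0.1 * np.random.randn()

POWER_BUDGET = 28.0           # hard constraint for the "safe" variant
candidates = np.linspace(0, 1, 101)[:, None]

rng = np.random.default_rng(0)
X = candidates[rng.choice(len(candidates), 3, replace=False)]
y_perf = np.array([measure_performance(x.item()) for x in X])
y_pow = np.array([measure_power(x.item()) for x in X])

for t in range(20):
    mu_f, sd_f = gp_posterior(X, y_perf, candidates)
    mu_p, sd_p = gp_posterior(X, y_pow, candidates)
    beta = 2.0
    ucb_perf = mu_f + beta * sd_f        # optimistic performance estimate
    ucb_power = mu_p + beta * sd_p       # pessimistic power estimate
    safe = ucb_power <= POWER_BUDGET     # keep only likely-feasible knobs
    scores = np.where(safe, ucb_perf, -np.inf)
    x_next = candidates[int(np.argmax(scores))]
    X = np.vstack([X, x_next])
    y_perf = np.append(y_perf, measure_performance(x_next.item()))
    y_pow = np.append(y_pow, measure_power(x_next.item()))

feasible = y_pow <= POWER_BUDGET
best = int(np.argmax(np.where(feasible, y_perf, -np.inf)))
print(f"best knob={X[best].item():.2f}, perf={y_perf[best]:.2f}, power={y_pow[best]:.1f} W")
```

The GP surrogate is what makes such a loop data-efficient: each new measurement updates the posterior over the whole configuration space, so relatively few trials are needed, and the pessimistic power bound is one simple way to keep exploration within a hard budget.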