Orchestrating Energy-Efficient vRANs

Bayesian Learning and Experimental Results

Journal Article (2022)
Author(s)

Jose A. Ayala-Romero (Trinity College Dublin)

Andres Garcia-Saavedra (NEC Laboratories Europe)

Xavier Costa-Perez (NEC Laboratories Europe)

George Iosifidis (TU Delft - Embedded Systems)

Research Group
Embedded Systems
Copyright
© 2022 Jose A. Ayala-Romero, Andres Garcia-Saavedra, Xavier Costa-Perez, George Iosifidis
DOI related publication
https://doi.org/10.1109/TMC.2021.3123794
Publication Year
2022
Language
English
Issue number
5
Volume number
22
Pages (from-to)
2910-2924
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Virtualized base stations (vBSs) can be implemented on diverse commodity platforms and are expected to bring unprecedented operational flexibility and cost efficiency to the next generation of cellular networks. However, their widespread adoption is hampered by their complex configuration options, which affect both their performance and their power consumption in non-traditional ways. Following an in-depth experimental analysis in a bespoke testbed, we characterize the vBS power consumption profile and reveal previously unknown couplings between their various control knobs. Motivated by these findings, we develop a Bayesian learning framework for the orchestration of vBSs and design two novel algorithms: (i) BP-vRAN, which employs online learning to balance vBS performance and energy consumption, and (ii) SBP-vRAN, which augments our optimization approach with safe controls that maximize performance while respecting hard power constraints. We show that our approaches are data-efficient, i.e., they converge an order of magnitude faster than state-of-the-art Deep Reinforcement Learning methods, and achieve optimal performance. We demonstrate the efficacy of these solutions in an experimental prototype using real traffic traces.
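To make the abstract's idea concrete, below is a minimal, illustrative sketch of Bayesian optimization for tuning a single vBS control knob; it is not the paper's BP-vRAN or SBP-vRAN algorithm. All names are hypothetical: `reward` stands in for a testbed measurement of throughput utility minus power cost, the knob `x` is a normalized configuration parameter (e.g., an airtime or transmit-power cap), and a Gaussian-process surrogate with an RBF kernel plus an upper-confidence-bound (UCB) acquisition rule selects the next configuration to try.

```python
import numpy as np

def rbf(a, b, ls=0.2):
    """RBF kernel matrix between 1-D point arrays a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def gp_posterior(x_seen, y_seen, x_grid, noise=1e-3):
    """GP posterior mean and std on x_grid given noisy observations."""
    K = rbf(x_seen, x_seen) + noise * np.eye(len(x_seen))
    Ks = rbf(x_grid, x_seen)
    mu = Ks @ np.linalg.solve(K, y_seen)
    v = np.linalg.solve(K, Ks.T)
    var = np.clip(1.0 - np.sum(Ks * v.T, axis=1), 1e-12, None)
    return mu, np.sqrt(var)

def reward(x):
    # Stand-in for a real measurement: throughput utility minus power
    # cost as a function of the knob. Peaks near x ~ 0.47.
    return np.sin(3 * x) - 0.5 * x

rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 201)
xs = list(rng.uniform(0.0, 1.0, 2))   # two random initial configurations
ys = [reward(x) for x in xs]

for _ in range(15):
    mu, sd = gp_posterior(np.array(xs), np.array(ys), grid)
    x_next = grid[np.argmax(mu + 2.0 * sd)]   # UCB acquisition
    xs.append(x_next)
    ys.append(reward(x_next))

best = xs[int(np.argmax(ys))]
print(f"best knob setting ~ {best:.2f}")
```

This data-efficiency (good configurations after a few dozen trials rather than thousands) is the property the abstract contrasts with Deep RL; the paper's SBP-vRAN additionally constrains exploration to configurations whose predicted power draw satisfies a hard budget, which this sketch omits.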

Files

Orchestrating_Energy_Efficient... (pdf)
(pdf | 2.98 Mb)
- Embargo expired in 01-05-2023