Demonstrating a Bayesian Online Learning for Energy-Aware Resource Orchestration in vRANs
Abstract
Radio Access Network Virtualization (vRAN) will spearhead the quest towards supple radio stacks that adapt to heterogeneous infrastructure: from energy-constrained platforms deploying cells-on-wheels (e.g., drones) or battery-powered cells to green edge clouds. We demonstrate a novel machine learning approach to solve resource orchestration problems in energy-constrained vRANs. Specifically, we demonstrate two algorithms: (i) BP-vRAN, which uses Bayesian online learning to balance performance and energy consumption, and (ii) SBP-vRAN, which augments our Bayesian optimization approach with safe controls that maximize performance while respecting hard power constraints. We show that our approaches are data-efficient (they converge an order of magnitude faster than other machine learning methods) and offer provable performance guarantees, which is paramount for carrier-grade vRANs. We demonstrate the advantages of our approach in a testbed comprising fully-fledged LTE stacks and a power meter, and by implementing our approach into O-RAN's non-real-time RAN Intelligent Controller (RIC).
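To make the idea of Bayesian online learning with safe controls concrete, the following is a minimal sketch of such a loop: a Gaussian-process surrogate models the reward (performance traded off against energy) and another models power draw, an optimistic acquisition picks the next control, and a pessimistic power estimate masks out controls that might violate the budget, in the spirit of SBP-vRAN. The control space, reward function, power model, and budget below are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a Bayesian online learning loop with a hard power constraint.
# All names (observe, POWER_BUDGET_W, the control grid) are hypothetical stand-ins.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Hypothetical 2-D control space: (CPU share, max transmit power in dBm).
grid = np.array([(c, p) for c in np.linspace(0.1, 1.0, 10)
                        for p in np.linspace(10, 30, 11)])

POWER_BUDGET_W = 15.0  # assumed hard power constraint for the "safe" variant


def observe(control):
    """Stand-in for measuring throughput and power draw on a real testbed."""
    cpu, tx = control
    throughput = 50 * cpu + 2 * (tx - 10) + rng.normal(0, 1)      # Mb/s (synthetic)
    power = 8 + 10 * cpu + 0.3 * (tx - 10) + rng.normal(0, 0.2)   # W (synthetic)
    return throughput - 0.5 * power, power  # reward trades performance vs. energy


X, y_reward, y_power = [], [], []
gp_reward = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp_power = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for t in range(30):
    if len(X) < 3:                       # bootstrap with a few random controls
        x = grid[rng.integers(len(grid))]
    else:
        mu_r, sd_r = gp_reward.predict(grid, return_std=True)
        mu_p, sd_p = gp_power.predict(grid, return_std=True)
        ucb = mu_r + 2.0 * sd_r                        # optimistic for reward
        safe = (mu_p + 2.0 * sd_p) <= POWER_BUDGET_W   # pessimistic for power
        scores = np.where(safe, ucb, -np.inf)          # mask unsafe controls
        x = grid[int(np.argmax(scores))]               # falls back to grid[0] if none safe
    r, p = observe(x)
    X.append(x); y_reward.append(r); y_power.append(p)
    gp_reward.fit(np.array(X), np.array(y_reward))     # online update of both surrogates
    gp_power.fit(np.array(X), np.array(y_power))
```

Dropping the safety mask (always treating every control as safe) reduces the loop to a plain performance/energy trade-off in the spirit of BP-vRAN; in the demonstrated system the observations would come from the LTE testbed and power meter rather than the synthetic model above.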