ITU Journal on Future and Evolving Technologies, Volume 2 (2021), Issue 4 – AI and machine learning solutions in 5G and future networks
ITU Y.3172 (https://www.itu.int/rec/T-REC-Y.3172/en, an ITU standard) and other mainstream
algorithms to solve the problem, whereas the latter builds a Long Short-term Memory (LSTM) model
for time series traffic forecasting, using NetworkX (a Python library) to dynamically optimize the
network topology, adding or deleting edges based on the traffic observed over nodes.
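The traffic-driven topology adjustment described above can be sketched as follows. This is an illustrative toy, not the authors' code: it uses a plain edge set rather than NetworkX, and the thresholds and traffic matrix are hypothetical.

```python
# Minimal sketch (illustrative, not the paper's implementation): adjust a
# topology by adding or removing edges based on observed per-pair traffic.
# ADD_THRESHOLD / DROP_THRESHOLD and the traffic values are hypothetical.

ADD_THRESHOLD = 80    # add a direct link when pairwise traffic exceeds this
DROP_THRESHOLD = 10   # drop a link when its traffic falls below this

def optimize_topology(edges, traffic):
    """edges: set of frozenset node pairs; traffic: {frozenset pair: load}."""
    new_edges = set(edges)
    for pair, load in traffic.items():
        if load >= ADD_THRESHOLD:
            new_edges.add(pair)       # heavy flow: create a direct link
        elif load <= DROP_THRESHOLD:
            new_edges.discard(pair)   # idle link: remove it if present
    return new_edges

edges = {frozenset({"a", "b"}), frozenset({"b", "c"})}
traffic = {frozenset({"a", "c"}): 95,   # heavy: should gain a link
           frozenset({"b", "c"}): 3}    # idle: should lose its link
edges = optimize_topology(edges, traffic)
print(sorted(sorted(p) for p in edges))  # [['a', 'b'], ['a', 'c']]
```

The same add/remove decisions would map directly onto `add_edge`/`remove_edge` calls on a NetworkX graph.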
The paper “Machine learning for performance prediction of channel bonding in next-generation
IEEE 802.11 WLANs” presents results gathered from the problem statement organized by Universitat Pompeu
Fabra (UPF), whose primary goal was predicting the performance of next-generation Wireless Local
Area Networks (WLANs) by applying Channel Bonding (CB) techniques. The paper presents an
overview of ML models proposed by participants (including Artificial Neural Networks, Graph Neural
Networks, Random Forest regression, and gradient boosting) and analyzes their performance on an open
dataset generated using the IEEE 802.11ax-oriented Komondor network simulator. The accuracy
achieved by the proposed methods demonstrates the suitability of ML for predicting the performance
of WLANs.
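At its core, this performance-prediction task is supervised regression from network features to throughput. As a minimal stand-in for the heavier models above (random forests, GNNs), the sketch below fits an ordinary least-squares line; the (channel width, throughput) pairs are made up, not taken from the Komondor dataset.

```python
# Illustrative sketch only: least-squares regression standing in for the ML
# models surveyed in the paper. All numbers below are hypothetical.

def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

widths = [20, 40, 80, 160]        # MHz of bonded spectrum (hypothetical)
throughputs = [45, 90, 180, 360]  # Mbps observed (hypothetical, linear here)
a, b = fit_line(widths, throughputs)
print(round(a * 320 + b))  # → 720, extrapolated for a 320 MHz channel
```

In practice the challenge models predict from far richer inputs (topology, interference, airtime), which is why graph-based learners performed well.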
Recent advancements in Deep Learning (DL) have revolutionized the way we can efficiently tackle
complex optimization problems. However, existing DL-based solutions are often considered as black
boxes with high inner complexity. In this context, explainability techniques have recently emerged to
unveil why DL models make each decision. The paper “NetXplain: Real-time explainability of graph
neural networks applied to networking” focuses on the explainability of Graph Neural Networks (GNN)
applied to networking. GNNs are a novel DL family with unique properties for generalizing over graphs.
As a result, they have shown unprecedented performance in solving complex network optimization
problems. NetXplain is a novel real-time explainability solution that uses a GNN to interpret the output
produced by another GNN. In evaluation, the proposed explainability method is applied to RouteNet, a
GNN model that predicts end-to-end QoS metrics in networks.
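To make the explainability goal concrete, the toy below scores edge importance by perturbation: remove one edge at a time and measure how a black-box prediction changes. Note this is only a conceptual baseline, not NetXplain's method (NetXplain trains a second GNN to produce explanations in real time), and the "delay model" is hypothetical.

```python
# Conceptual sketch, NOT NetXplain itself: perturbation-based edge scoring
# against a hypothetical black-box delay predictor.

def predict_delay(edges):
    """Stand-in black box: delay falls as more links exist (hypothetical)."""
    return 100.0 / max(len(edges), 1)

def edge_importance(edges):
    """Score each edge by the prediction shift caused by deleting it."""
    base = predict_delay(edges)
    return {e: abs(predict_delay([x for x in edges if x != e]) - base)
            for e in edges}

edges = [("a", "b"), ("b", "c"), ("a", "c")]
scores = edge_importance(edges)  # all edges equally important in this toy
```

Perturbation methods like this need one model evaluation per edge, which is exactly the cost NetXplain avoids by learning the explanation function once.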
In the paper “Graph-neural-network-based delay estimation for communication networks with
heterogeneous scheduling policies,” the authors propose a solution that supports multiple scheduling
policies (Strict Priority, Deficit Round Robin, Weighted Fair Queuing) and handles mixed scheduling
policies in a single communication network, as opposed to RouteNet, which is based on simplified
assumptions (such as the restriction to a single scheduling policy). The solution proposed by the authors
achieved a mean absolute percentage error under 1% on the evaluation data set from the Challenge.
This takes neural-network-based delay estimation one step closer to practical use.
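For reference, the metric quoted above can be computed as below; the delay values are made up to illustrate a sub-1% error.

```python
# Quick sketch of the MAPE metric cited above; all values are hypothetical.

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - p) / a
                       for a, p in zip(actual, predicted)) / len(actual)

true_delays = [10.0, 20.0, 40.0]    # ms, hypothetical ground truth
est_delays  = [10.05, 19.9, 40.2]   # ms, hypothetical predictions
print(round(mape(true_delays, est_delays), 3))  # → 0.5 (i.e. 0.5% error)
```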
The paper titled “Site-specific millimeter-wave compressive channel estimation algorithms with hybrid
MIMO architectures” presents and compares three novel model-cum-data driven channel estimation
procedures in a millimeter-wave multi-input multi-output (MIMO) orthogonal frequency division
multiplexing (OFDM) wireless communication system. The techniques are adapted from a wide range
of signal processing methods, such as detection and estimation theories, compressed sensing, and
Bayesian inference, to learn the unknown virtual beamspace domain dictionary, as well as the delay-
and-beamspace sparse channel. The model-based algorithms were trained with a site-specific training
dataset generated using a realistic ray-tracing-based wireless channel simulation tool. Benchmarking
shows that the model-based approaches, combined with data-driven customization, consistently
outperform state-of-the-art techniques by a large margin.
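The compressed-sensing idea behind such estimators can be illustrated with greedy matching pursuit: recover a sparse coefficient vector over a known dictionary from an observation. The 2-dimensional, 3-atom dictionary and 1-sparse "channel" below are toy values, not the site-specific beamspace dictionary learned in the paper.

```python
# Hedged sketch: matching pursuit over a toy unit-norm dictionary, standing
# in for sparse beamspace channel recovery. All values are hypothetical.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(y, atoms, iters=1):
    """Greedily pick the atom most correlated with the residual."""
    residual = list(y)
    coeffs = [0.0] * len(atoms)
    for _ in range(iters):
        # atoms are unit-norm, so correlation = projection coefficient
        corrs = [dot(residual, a) for a in atoms]
        k = max(range(len(atoms)), key=lambda i: abs(corrs[i]))
        coeffs[k] += corrs[k]
        residual = [r - corrs[k] * a for r, a in zip(residual, atoms[k])]
    return coeffs

atoms = [(1.0, 0.0), (0.0, 1.0), (0.6, 0.8)]  # unit-norm dictionary (toy)
y = (1.2, 1.6)                                 # observation = 2.0 * atoms[2]
print(matching_pursuit(y, atoms))              # recovers weight 2.0 on atom 2
```

Real mmWave estimators work with complex-valued, much larger dictionaries and noisy measurements, which is where the detection-theoretic and Bayesian refinements in the paper come in.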
Beamforming is an essential technology in 5G massive multiple-input multiple-output (massive MIMO)
communications, which are subject to many impairments arising from the nature of the wireless
transmission channel. Inter-cell interference (ICI) is one of the main obstacles faced by 5G
communications due to frequency-reuse technologies, yet finding the optimal beamforming parameters
to minimize ICI requires prior network or channel information that is infeasible to obtain in
practice. The paper “A dynamic Q-learning
beamforming method for inter-cell interference mitigation in 5G massive MIMO networks” proposes a
dynamic Q-learning beamforming method for ICI mitigation in the 5G downlink that does not require
prior network or channel knowledge. Compared with a traditional beamforming method and other
industrial Reinforcement Learning (RL) methods, the proposed method has lower computational