ITU Journal on Future and Evolving Technologies, Volume 2 (2021), Issue 4 – AI and machine learning solutions in 5G and future networks


NetXplain: Real-time explainability of graph neural networks applied to networking

Pages 57–66
David Pujol-Perich, José Suárez-Varela, Shihan Xiao, Bo Wu, Albert Cabellos-Aparicio, Pere Barlet-Ros

Recent advances in Deep Learning (DL) have revolutionized the way we tackle complex optimization problems. However, existing DL-based solutions are often regarded as black boxes with high inner complexity, and as a result there is still some skepticism in the networking industry about their practical viability for operating data networks. In this context, explainability techniques have recently emerged to reveal why DL models make each decision. This paper focuses on the explainability of Graph Neural Networks (GNNs) applied to networking. GNNs are a novel DL family with unique properties for generalizing over graphs, and they have shown unprecedented performance on complex network optimization problems. This paper presents NetXplain, a novel real-time explainability solution that uses a GNN to interpret the output produced by another GNN. In the evaluation, we apply the proposed explainability method to RouteNet, a GNN model that predicts end-to-end QoS metrics in networks. We show that NetXplain runs more than three orders of magnitude faster than state-of-the-art explainability solutions on networks of up to 24 nodes, which makes it compatible with real-time applications, while also generalizing well to network scenarios not seen during training.
View Article
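To make the idea concrete, the sketch below (in PyTorch) shows the kind of explainer GNN the abstract describes: a model that takes a network graph, together with features that may include a target GNN's predictions, and returns an importance score per edge in a single forward pass. The layer sizes, feature layout, and the choice of a slower explainer (e.g. GNNExplainer) as the source of training masks are assumptions for illustration; this is not the authors' NetXplain architecture.

```python
# Minimal, hypothetical sketch of a GNN that explains another GNN
# (e.g. a RouteNet-like delay model) by scoring the importance of each edge.
# Trained offline to imitate masks from a slower explainability method,
# it can then produce explanations in one forward pass at inference time.
import torch
import torch.nn as nn

class ExplainerGNN(nn.Module):
    def __init__(self, node_dim: int, hidden: int = 32, steps: int = 3):
        super().__init__()
        self.encode = nn.Linear(node_dim, hidden)
        self.message = nn.Linear(2 * hidden, hidden)   # combines the two endpoints of an edge
        self.update = nn.GRUCell(hidden, hidden)       # node-state update
        self.edge_score = nn.Sequential(nn.Linear(2 * hidden, hidden),
                                        nn.ReLU(),
                                        nn.Linear(hidden, 1))
        self.steps = steps

    def forward(self, x, edge_index):
        # x: (num_nodes, node_dim) node features (may include the target GNN's outputs)
        # edge_index: (2, num_edges) source/destination node indices
        h = torch.relu(self.encode(x))
        src, dst = edge_index
        for _ in range(self.steps):
            m = torch.relu(self.message(torch.cat([h[src], h[dst]], dim=-1)))
            agg = torch.zeros_like(h).index_add_(0, dst, m)   # sum incoming messages per node
            h = self.update(agg, h)
        # one importance score in (0, 1) per edge
        return torch.sigmoid(self.edge_score(torch.cat([h[src], h[dst]], dim=-1))).squeeze(-1)

# Toy usage on a 4-node ring; in a NetXplain-style setup the training targets
# would be the masks produced by a slower explainer such as GNNExplainer.
x = torch.randn(4, 8)
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 0]])
scores = ExplainerGNN(node_dim=8)(x, edge_index)
print(scores)  # tensor of 4 edge-importance scores
```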

Machine learning for performance prediction of channel bonding in next-generation IEEE 802.11 WLANs

Pages 67–79
Francesc Wilhelmi, David Góez, Paola Soto, Ramon Vallés, Mohammad Alfaifi, Abdulrahman Algunayah, Jorge Martín-Pérez, Luigi Girletti, Rajasekar Mohan, K Venkat Ramnan, Boris Bellalta

With the advent of Artificial Intelligence (AI)-empowered communications, industry, academia, and standardization organizations are advancing the definition of mechanisms and procedures to address the increasing complexity of future 5G and beyond communications. In this context, the International Telecommunication Union (ITU) organized the First AI for 5G Challenge to bring industry and academia together to introduce and solve representative problems related to the application of Machine Learning (ML) to networks. In this paper, we present the results gathered from Problem Statement 13 (PS-013), organized by Universitat Pompeu Fabra (UPF), whose primary goal was to predict the performance of next-generation Wireless Local Area Networks (WLANs) that apply Channel Bonding (CB) techniques. In particular, we provide an overview of the ML models proposed by participants (including artificial neural networks, graph neural networks, random forest regression, and gradient boosting) and analyze their performance on an open data set generated using the IEEE 802.11ax-oriented Komondor network simulator. The accuracy achieved by the proposed methods demonstrates the suitability of ML for predicting the performance of WLANs. Moreover, we discuss the importance of abstracting WLAN interactions to achieve better results, and we argue that there is still room for improvement in throughput prediction through ML.
View Article
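As an illustration of one of the model families listed above, the sketch below fits a random forest regressor to predict per-deployment throughput from a few hand-picked features. The feature names and the synthetic data are hypothetical stand-ins chosen for this example; in the challenge, models were trained on traces generated with the IEEE 802.11ax-oriented Komondor simulator.

```python
# Minimal sketch: random forest regression for WLAN throughput prediction.
# Features and targets below are synthetic and illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 2000
# Hypothetical per-BSS features: bonded channel width (MHz), mean RSSI (dBm),
# number of contending BSSs, offered load (Mbps).
width = rng.choice([20, 40, 80, 160], size=n)
rssi = rng.uniform(-85, -40, size=n)
contenders = rng.integers(0, 8, size=n)
load = rng.uniform(5, 300, size=n)
X = np.column_stack([width, rssi, contenders, load])
# Synthetic throughput target with a simple saturating relationship.
capacity = width * 2.5 * (1 + (rssi + 85) / 45) / (1 + contenders)
y = np.minimum(load, capacity) + rng.normal(0, 5, size=n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("MAE (Mbps):", round(mean_absolute_error(y_te, model.predict(X_te)), 2))
```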










