Challenge Organizers’ Editorial
Artificial Intelligence (AI) / Machine Learning (ML) is impacting every aspect of business and society.
AI will also shape how communication networks, a lifeline of our society, will evolve.
Applying AI/ML in communication networks poses an entirely different set of challenges from those in domains such as image recognition or natural language processing. Time scales in a communication network span many orders of magnitude; some parameters change on an annual basis (e.g. your subscription to a telecom provider), while others may vary on a millisecond timescale (e.g. resource block allocation in the radio access network). In addition, network environments are dynamic and noisy, and limited computing resources in the network add to these challenges. Thus, while telecom operators have seen early AI applications to predict customer churn, detect fraud and identify customers for promotions, the industry has been slower to apply AI in use cases within network domains such as the core network, the radio access network and network management.
ITU has been at the forefront of exploring how best to apply AI/ML in future networks, including 5G networks. To advance the use of AI/ML in the telecom industry, the ITU AI/ML in 5G Challenge was born
(https://aiforgood.itu.int/ai-ml-in-5g-challenge-2020/). It rallied like-minded students and professionals
from around the globe to study the practical application of AI/ML in emerging and future networks.
The first edition of the Challenge was conducted in 2020 with over 1300 students and professionals
from 62 countries, competing for global recognition and a shared prize fund totalling 33 000 CHF.
Through the Challenge, ITU encourages and supports the growing community driving the integration
of AI/ML in networks and, at the same time, strengthens the community driving standardization work for
AI/ML, creating new opportunities for industry and academia to influence the evolution of ITU
standards. Tools, data resources and problem statements were contributed by industry and academia in
Brazil, China, India, Ireland, Japan, Russia, Spain, Turkey and the United States. The Challenge offered
participants an opportunity to showcase their talent, test their concepts on real data and real-world
problems, and compete for global recognition. The solutions can be accessed in several repositories on
the Challenge GitHub: https://github.com/ITU-AI-ML-in-5G-Challenge.
Many solutions submitted to the Challenge were innovative and, in some cases, improved upon the baselines. To share these solutions with the larger community, ITU issued a call for papers
for a special issue on AI and machine learning solutions in 5G and future networks of the ITU Journal
on Future and Evolving Technologies (ITU J-FET). In this special issue, hosts (i.e., the originators of
the problem statements) and participants of the ITU Challenge submitted their solutions and learnings
for publication. This special issue is dedicated to the exploration of Artificial Intelligence and Machine Learning in 5G and future networks, as well as enabling technologies and tools in networks. After rigorous review by the reviewers in conjunction with the guest editors, 10 papers were accepted for publication.
The ability to automatically and rapidly detect network and device failures is an essential feature for
network operators to provide reliable service in future networks and 5G. In the paper “Analysis on route
information failure in IP core networks by NFV-based test environment,” the authors propose a method
that extracts features from large-scale unstructured data to differentiate between normal and abnormal states. The proposed method reduces computation without degrading performance and achieves a
prediction accuracy of 94%.
Existing methods of network topology planning do not account for increasing network traffic and
uneven link capacity utilization, resulting in sub-optimal resource utilization and unnecessary
investments in network construction. In this special issue, two papers “Applying machine learning in
network topology optimization” and “AI-based network topology optimization system” consider the
problem of topology optimization. The former proposes a solution by considering an ML pipeline in