ITU Journal on Future and Evolving Technologies, Volume 2 (2021), Issue 4 – AI and machine learning solutions in 5G and future networks
Fig. 1 – Message-passing phase: (left) message, (mid) aggregation and (right) update.

Fig. 2 – Transformation from the physical network scenario to the graph representation of RouteNet.

Such a readout function can also be implemented as a neural network, typically a feed-forward NN, and can be used either to produce node-level predictions by individually processing each node's hidden state, or to make global predictions over the graph by combining all the hidden states. In this latter case, hidden states are typically aggregated (e.g., element-wise sum) before they are introduced into the readout function.

This technology has proven to generalize successfully over graphs of different sizes and structures, which was not possible with traditional neural network architectures (e.g., feed-forward NN, convolutional NN, recurrent NN).

2.2 Graph neural networks applied to networking

The strong generalization capabilities of GNNs over graphs make these models interesting for applications in the networking field, since the most natural way to formalize many network control and management problems involves the use of graphs (e.g., topology, routing, inter-flow dependencies) [3]. Recently, several GNN-based solutions have been proposed to tackle different use cases in the field of computer networks (e.g., network modeling [12, 16], automatic routing protocols [13]). In this section, for illustrative purposes, we focus only on RouteNet [12], as it is quite representative of how GNN-based solutions represent and process network-related data to solve complex problems.

RouteNet targets the problem of modeling the per-path QoS metrics (e.g., delay, jitter) of a computer network. For this purpose, a network snapshot is provided as input: a network topology, a routing configuration, and a traffic matrix. To this end, the model transforms the physical network scenario into a more refined graph representation in which physical and logical elements are explicitly represented – paths and links in this case. More specifically, every link of the physical network topology is transformed into a node in the input graph of the GNN. Likewise, each source-destination path is also converted into a node. Finally, edges connect links with paths according to the routing configuration: each path is connected to those links that it traverses given the input routing scheme. This process is illustrated in Fig. 2, where we can observe how a physical network scenario with two paths and three links is transformed into the input graph of RouteNet. This graph representation enables us to model the complex relationships between the state of paths and links, and how they relate to the output per-path performance metrics (e.g., delay).

In this regard, applying explainability over this model would enable us to identify the most critical edges of its internal graph (i.e., path-link relations). We refer to critical edges as the set of path-link pairs that best explain the QoS metrics obtained by the model. Thus, with this solution, we can extract relevant knowledge about the processing made by the GNN given a network scenario, which can have many diverse applications, as later discussed in Section 7.

3. RELATED WORK

Recent years have seen increasing interest in producing explainability solutions for neural network models (e.g., Convolutional Neural Networks [5]). Despite this, explainability techniques for GNNs have been scarcely explored so far. In this context, GNNExplainer [17] is, to the best of our knowledge, the first proposal approaching this problem.

GNNExplainer is given as input a target GNN model and a sample graph G = (V, E), with input features X. GNNExplainer then outputs a subset containing the connections E′ ⊂ E and the node features X′ ⊂ X that most critically affect the output of the target GNN (see Fig. 3). This is done by computing a set of weights W, formally defined in Eq. (5), that represents how critical the pair-wise connections of the input graph are to the prediction accuracy of the target GNN.

W = {w_{i,j} | (i, j) ∈ E}   (5)

Particularly, the most relevant connections are those that have more impact on the loss function used to train the model (e.g., mean squared error for regression tasks). The number of relevant connections produced by the algorithm can be tuned by setting a threshold on the resulting weights w_{i,j} ∈ W.

Overall, GNNExplainer is a generic solution proposed by the ML community that targets only producing explainability representations of GNNs used for global graph classification, node-level classification, or link prediction. However, this solution does not support GNN-based models used for regression. In this context, a later solution proposed by the networking community, Metis [3], presents a similar approach adapted to GNN models trained for regression problems, particularly showcasing its use in several networking applications.
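To make the two ideas above concrete, the following is a minimal sketch (not code from the paper): it builds a RouteNet-style path-link graph from an assumed routing configuration, in the spirit of Fig. 2, and then applies a GNNExplainer-style threshold over hypothetical edge weights w_{i,j}, as in Eq. (5). The function names, the example routing, and the weight values are illustrative assumptions.

```python
# Hypothetical sketch: transform a routing configuration into RouteNet's
# path-link graph. Each physical link and each source-destination path
# becomes a node; an edge connects a path to every link it traverses.

def build_path_link_graph(paths):
    """paths: dict mapping path id -> ordered list of link ids it traverses.
    Returns (link_nodes, path_nodes, edges), where edges are (path, link) pairs."""
    link_nodes = sorted({link for links in paths.values() for link in links})
    path_nodes = sorted(paths)
    edges = [(p, l) for p in path_nodes for l in paths[p]]
    return link_nodes, path_nodes, edges

def select_critical_edges(weights, threshold):
    """weights: dict mapping a (path, link) edge -> importance weight w_{i,j}.
    Keeps only the edges whose weight exceeds the threshold, mirroring the
    post-processing of Eq. (5)."""
    return {edge: w for edge, w in weights.items() if w > threshold}

# Illustrative scenario with two paths and three links (assumed routing):
routing = {
    "p1": ["l1", "l2"],   # path 1 traverses links l1 and l2
    "p2": ["l2", "l3"],   # path 2 traverses links l2 and l3
}
links, path_ids, edges = build_path_link_graph(routing)
print(edges)  # [('p1', 'l1'), ('p1', 'l2'), ('p2', 'l2'), ('p2', 'l3')]

# Hypothetical explainer weights over the path-link edges:
w = {("p1", "l1"): 0.9, ("p1", "l2"): 0.2, ("p2", "l2"): 0.7, ("p2", "l3"): 0.1}
print(select_critical_edges(w, threshold=0.5))
# {('p1', 'l1'): 0.9, ('p2', 'l2'): 0.7}
```

The bipartite structure produced here is exactly what lets a message-passing GNN exchange state between each path and the links it traverses, independently of the topology size.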
© International Telecommunication Union, 2021 59