2. LSTM MODEL FOR TRAFFIC FORECASTING

Network traffic forecasting is performed for individual network elements and is affected by factors such as the location of the network element, the weather, and the traffic on different base stations. Forecasting is mainly implemented with Gompertz models or with time series forecasting models such as LSTM and Prophet [8,9,10]. Traffic forecasts based on traditional Extended Gompertz Models (EGMs) cannot reflect the traffic differences between working days and holidays, whereas traffic predicted with deep-learning time series forecasting models is more accurate, as historical data can be learned and exploited during forecasting [11].
2.1 Traffic forecasting model with deep learning

The short-term traffic forecast predicts the data of the next few days. Since daily traffic shows strong periodicity, the traffic curves of days with similar attributes (for example, the same day of the week) are almost identical. In view of this, this paper adopts an LSTM model [10,11], a recurrent neural network architecture that exploits the correlation of traffic at different times, for traffic forecasting. The model is trained with time-stamped traffic (for example, traffic of working days or of holidays) and outputs more accurate predictions through iterative forecasting [12].
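One way to attach such day-type attributes to hourly traffic is sketched below; the use of pandas, the column names, and the rule that treats weekends as holidays are illustrative assumptions rather than the authors' implementation.

```python
import numpy as np
import pandas as pd

# Hypothetical hourly traffic of one network element over 20 days.
idx = pd.date_range("2021-01-01", periods=20 * 24, freq="H")
traffic = pd.DataFrame({"traffic_gb": np.random.rand(len(idx)) * 20}, index=idx)

# Tag each sample with its timestamp attributes so that days with similar
# attributes (working days vs. holidays) can be learned together.
traffic["day_of_week"] = traffic.index.dayofweek
traffic["is_working_day"] = traffic["day_of_week"] < 5   # weekends treated as holidays here
print(traffic.head())
```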
2.2 Establishment of the LSTM model

In the study, the LSTM model was built with TensorFlow 2.0, and the traffic per hour was taken as one sample. The traffic samples of the last 20 days were processed first, and the traffic of each network element was arranged as a time series [13,14]. The following four steps describe how the LSTM model was built and how traffic forecasting was conducted.
1) Make data sets: The traffic samples of the first 15 days were used for training and testing, and those of the remaining 5 days were reserved for result verification. The training data set contained 75% of the traffic samples of the first 15 days, and the testing data set contained the other 25%. The initial default time sliding window was 24.

2) Establish the LSTM model: Add two LSTM layers to the model for traffic forecasting.

3) Train the model: Input the training and testing data into the LSTM model, and set epochs and batch_size to 50 and 32 respectively.

4) Conduct traffic forecasting: Forecast the traffic of the next hour iteratively based on the time sliding window, and improve forecasting efficiency with multiprocessing and concurrent processing. A code sketch of these four steps is given after this list.
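The four steps above can be sketched in TensorFlow 2.0 roughly as follows. This is a minimal illustration rather than the authors' code: the input array hourly_traffic, the layer widths (64 and 32 units), the Adam optimizer with MSE loss, and the synthetic data in the __main__ block are assumptions; only the 75%/25% split of the first 15 days, the 5 verification days, the sliding window of 24, the two LSTM layers, and epochs=50 with batch_size=32 come from the text.

```python
import numpy as np
import tensorflow as tf

SLIDE_WINDOW = 24          # initial default time sliding window (Step 1)

def make_datasets(hourly_traffic, window=SLIDE_WINDOW):
    """Step 1: first 15 days for training/testing (75%/25% split),
    last 5 days reserved for result verification."""
    series = np.asarray(hourly_traffic, dtype="float32")
    train_test, verification = series[:15 * 24], series[15 * 24:]

    def to_supervised(seq):
        x, y = [], []
        for i in range(len(seq) - window):
            x.append(seq[i:i + window])
            y.append(seq[i + window])
        return np.array(x)[..., np.newaxis], np.array(y)

    x, y = to_supervised(train_test)
    split = int(len(x) * 0.75)
    return (x[:split], y[:split]), (x[split:], y[split:]), verification

def build_model(window=SLIDE_WINDOW):
    """Step 2: two stacked LSTM layers followed by a dense output."""
    model = tf.keras.Sequential([
        tf.keras.layers.LSTM(64, return_sequences=True, input_shape=(window, 1)),
        tf.keras.layers.LSTM(32),
        tf.keras.layers.Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")
    return model

def forecast_next_hours(model, history, hours, window=SLIDE_WINDOW):
    """Step 4: forecast hour by hour, sliding the window over each new prediction."""
    buf = list(history[-window:])
    preds = []
    for _ in range(hours):
        x = np.array(buf[-window:], dtype="float32").reshape(1, window, 1)
        nxt = float(model.predict(x, verbose=0)[0, 0])
        preds.append(nxt)
        buf.append(nxt)
    return preds

if __name__ == "__main__":
    # Synthetic stand-in for 20 days of per-hour traffic of one network element.
    t = np.arange(20 * 24)
    hourly_traffic = 10 + 5 * np.sin(2 * np.pi * t / 24) + np.random.rand(len(t))

    (x_tr, y_tr), (x_te, y_te), verification = make_datasets(hourly_traffic)
    model = build_model()
    # Step 3: epochs=50 and batch_size=32, as in the paper's initial setup.
    model.fit(x_tr, y_tr, epochs=50, batch_size=32,
              validation_data=(x_te, y_te), verbose=0)
    # Step 4: forecast the next 24 hours from the end of the training period.
    print(forecast_next_hours(model, hourly_traffic[:15 * 24], hours=24)[:3])
```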
2.3 Optimization of the LSTM model

At the beginning of the study, the traffic forecast with the default parameters of the model differed considerably from the data (the traffic of the last five days) that had been reserved for result verification. After repeated comparison and analysis, the most suitable values of epochs, batch_size, and slide_window were determined. The following describes how those parameters were determined in the study.

First, batch_size and slide_window were fixed, and epochs was set to 50, 100, 200, and 400 in turn. Then the impact of each epochs value on the forecasting error was evaluated in terms of the Root Mean Square Error (RMSE) and the running time. The results indicate that epochs=200 is the optimal choice.
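Under the same assumptions as the previous sketch (and reusing its hypothetical hourly_traffic array and make_datasets, build_model, and forecast_next_hours helpers), the epoch sweep and its RMSE and running-time measurements might look like the following; computing the predicted traffic volume as the sum of the hourly forecasts is also an assumption.

```python
import time
import numpy as np

# Hypothetical epoch sweep with batch_size and slide_window held fixed, reusing
# the helpers defined in the previous sketch (run this after that block).
(x_tr, y_tr), (x_te, y_te), verification = make_datasets(hourly_traffic)

for epochs in (50, 100, 200, 400):
    model = build_model()
    start = time.time()
    model.fit(x_tr, y_tr, epochs=epochs, batch_size=32, verbose=0)
    elapsed = time.time() - start

    # Forecast the 5 verification days (120 hours) and score them with RMSE.
    preds = np.array(forecast_next_hours(model, hourly_traffic[:15 * 24],
                                         hours=len(verification)))
    rmse = float(np.sqrt(np.mean((preds - verification) ** 2)))
    print(f"epochs={epochs:4d}  RMSE={rmse:.3f}  "
          f"predicted_volume={preds.sum():.1f} GB  time={elapsed:.0f}s")
```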
After the model was trained for 50, 100, 200, and 400 epochs respectively, the resulting RMSE, the predicted traffic volume, and the training time were recorded in Table 1.

Table 1 – Optimization of the model's epochs

Epochs   RMSE     Predicted Traffic Volume (GB)   Running Time (s)
50       16.246   251.3                           55
100      3.714    279.9                           85
200      1.9      292.4                           148
400      2.3      293                             305

As Table 1 shows, training the model for 200 epochs yielded the smallest RMSE at an acceptable running time. With 50 or 100 epochs the RMSE was too large, and with 400 epochs the running time roughly doubled while the RMSE did not improve. Therefore, the model was finally trained for 200 epochs to ensure both high efficiency and accuracy.