Abstract [eng]
Mobile communication is advancing relentlessly: new mobile technologies emerge and the amount of generated data grows every year. Increased data volumes and numbers of users have forced network operators to find more advanced solutions for optimizing their networks. The combination of big data and machine learning has become a promising area for better assessing network performance indicators and end-user experience and for identifying problematic areas in the network. The main goal of this thesis is to use statistics from a modeled LTE network, together with additional data sources, to create big data analysis models capable of predicting network performance. We used an LSTM recurrent neural network model, with ARIMA and fbProphet as baseline models. During the work we investigated how NRMSE, NMSE, and correlation metrics depend on the forecast horizon in the range of 12-300 hours. We introduced a new cross-correlation error metric that reduced the training-phase NRMSE by 0.011 compared to an LSTM model using only the MSE metric, while the testing-phase NRMSE remained roughly the same; the statistical significance of the obtained results is confirmed by Student's t-test. With the addition of the cross-correlation error metric, the model trained faster for forecasts of up to 48 hours and by the third epoch reached an NRMSE that was on average 0.112 lower than that of the model using only MSE as the loss metric. Compared with the baseline ARIMA and fbProphet models, the developed LSTM network model achieves on average a 0.261 lower NRMSE and a 0.083 higher correlation, with the significance substantiated by Student's t-test. The obtained results are comparable with those of three published articles, but lag behind those of four others.
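
The abstract does not give implementation details; the following is a minimal sketch, in PyTorch, of how an LSTM forecaster could combine the standard MSE loss with an additional cross-correlation error term of the kind described above. The layer sizes, the forecast horizon, the exact form of the correlation term, and the weighting factor alpha are illustrative assumptions, not values taken from the thesis.

    # Minimal sketch (not the thesis code): LSTM forecaster trained with
    # MSE plus a cross-correlation error term. All sizes and the weight
    # `alpha` are illustrative assumptions.
    import torch
    import torch.nn as nn

    class LSTMForecaster(nn.Module):
        def __init__(self, n_features=1, hidden=64, horizon=48):
            super().__init__()
            self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
            self.head = nn.Linear(hidden, horizon)

        def forward(self, x):                      # x: (batch, time, features)
            out, _ = self.lstm(x)
            return self.head(out[:, -1, :])        # forecast the next `horizon` steps

    def cross_correlation_error(pred, target, eps=1e-8):
        # 1 - Pearson correlation between forecast and target, averaged over
        # the batch; one plausible form of the "cross-correlation error" term.
        p = pred - pred.mean(dim=1, keepdim=True)
        t = target - target.mean(dim=1, keepdim=True)
        corr = (p * t).sum(dim=1) / (p.norm(dim=1) * t.norm(dim=1) + eps)
        return (1.0 - corr).mean()

    def combined_loss(pred, target, alpha=0.1):
        # MSE plus the weighted cross-correlation term (alpha is an assumption).
        return nn.functional.mse_loss(pred, target) + alpha * cross_correlation_error(pred, target)

In a sketch of this kind, weighting the correlation term lets training trade pointwise accuracy against preserving the temporal shape of the forecast, which is consistent with the faster early-epoch convergence reported in the abstract.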