Backpropagation Neural Network with Combination of Activation Functions for Inbound Traffic Prediction

Main Authors: Purnawansyah; Haviluddin; Herdianti Darwis; Huzain Azis; Yulita Salim
Other Authors: Universitas Muslim Indonesia, Faculty of Computer Science
Format: Article info application/pdf
Language: eng
Published: Universitas Negeri Malang, 2021
Online Access: http://journal2.um.ac.id/index.php/keds/article/view/20008
http://journal2.um.ac.id/index.php/keds/article/view/20008/8468
http://journal2.um.ac.id/index.php/keds/article/downloadSuppFile/20008/5259
Table of Contents:
  • Predicting network traffic is crucial for preventing congestion and maintaining a high quality of network service. This research uses backpropagation to predict the inbound traffic level in order to understand and manage internet usage. The architecture consists of one input layer, two hidden layers, and one output layer. The study compares three activation functions: sigmoid, rectified linear unit (ReLU), and hyperbolic tangent (tanh), and three learning rates: 0.1, 0.5, and 0.9, representing low, moderate, and high rates, respectively. Based on the results, among single activation functions, sigmoid yields the lowest RMSE and MSE values, but ReLU is superior at learning high-traffic patterns with a learning rate of 0.9. In combinations, ReLU is also more effective when placed in the first hidden layer. Hence, combining a high learning rate with pure ReLU, ReLU-sigmoid, or ReLU-tanh is more suitable and recommended for predicting upper traffic utilization.
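The architecture described in the abstract — one input layer, two hidden layers with interchangeable activations, and one output layer trained by backpropagation — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the layer widths, the synthetic "traffic" series, the epoch count, and the demo learning rate of 0.1 (chosen for stability of plain gradient descent; the paper also tests 0.5 and 0.9) are all assumptions.

```python
import numpy as np

# Activation functions compared in the paper, with derivatives expressed
# in terms of the activation output (convenient for backpropagation).
def sigmoid(x):   return 1.0 / (1.0 + np.exp(-x))
def d_sigmoid(y): return y * (1.0 - y)
def relu(x):      return np.maximum(0.0, x)
def d_relu(y):    return (y > 0).astype(float)
def tanh(x):      return np.tanh(x)
def d_tanh(y):    return 1.0 - y ** 2

ACT = {"sigmoid": (sigmoid, d_sigmoid),
       "relu":    (relu, d_relu),
       "tanh":    (tanh, d_tanh)}

def train(X, y, acts=("relu", "sigmoid"), lr=0.1, hidden=8,
          epochs=500, seed=0):
    """Train a 2-hidden-layer MLP by backpropagation; return final RMSE.

    `acts` picks the activation for each hidden layer, allowing the
    single and combined forms (e.g. ReLU-sigmoid) from the abstract.
    """
    rng = np.random.default_rng(seed)
    sizes = [X.shape[1], hidden, hidden, 1]
    W = [rng.normal(0, 0.5, (a, b)) for a, b in zip(sizes[:-1], sizes[1:])]
    b = [np.zeros((1, s)) for s in sizes[1:]]
    f1, df1 = ACT[acts[0]]
    f2, df2 = ACT[acts[1]]
    n = len(X)
    for _ in range(epochs):
        # Forward pass: two hidden layers, linear output for regression.
        h1 = f1(X @ W[0] + b[0])
        h2 = f2(h1 @ W[1] + b[1])
        out = h2 @ W[2] + b[2]
        # Backward pass: gradients of mean-squared-error loss.
        d_out = 2.0 * (out - y) / n
        d_h2 = (d_out @ W[2].T) * df2(h2)
        d_h1 = (d_h2 @ W[1].T) * df1(h1)
        grads_W = [X.T @ d_h1, h1.T @ d_h2, h2.T @ d_out]
        grads_b = [d_h1.sum(0, keepdims=True),
                   d_h2.sum(0, keepdims=True),
                   d_out.sum(0, keepdims=True)]
        for i in range(3):  # plain gradient-descent update
            W[i] -= lr * grads_W[i]
            b[i] -= lr * grads_b[i]
    pred = f2(f1(X @ W[0] + b[0]) @ W[1] + b[1]) @ W[2] + b[2]
    return float(np.sqrt(np.mean((pred - y) ** 2)))

# Synthetic inbound-traffic series scaled to [0, 1] (an assumption;
# the paper uses real inbound traffic measurements).
rng = np.random.default_rng(42)
t = np.linspace(0, 4 * np.pi, 200).reshape(-1, 1)
traffic = 0.5 + 0.4 * np.sin(t) + rng.normal(0, 0.02, t.shape)
X = (t - t.min()) / (t.max() - t.min())

for combo in [("sigmoid", "sigmoid"), ("relu", "relu"), ("relu", "tanh")]:
    print(combo, round(train(X, traffic, acts=combo), 4))
```

Swapping the `acts` tuple reproduces the comparisons in the abstract (pure sigmoid, pure ReLU, ReLU-sigmoid, ReLU-tanh), and raising `lr` toward 0.9 mimics the high-learning-rate regime the authors recommend, though at that rate plain gradient descent may need smaller initial weights to remain stable.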