
Article Information

  • Title: Backpropagation Neural Network with Combination of Activation Functions for Inbound Traffic Prediction
  • Authors: Purnawansyah Purnawansyah ; Haviluddin Haviluddin ; Herdianti Darwis
  • Journal: Knowledge Engineering and Data Science
  • Print ISSN: 2597-4602
  • Electronic ISSN: 2597-4637
  • Year: 2021
  • Volume: 4
  • Issue: 1
  • Pages: 14-28
  • DOI: 10.17977/um018v4i12021p14-28
  • Language: English
  • Publisher: Universitas Negeri Malang
  • Abstract: Predicting network traffic is crucial for preventing congestion and achieving superior quality of network services. This research uses backpropagation to predict the inbound traffic level in order to understand and manage internet usage. The architecture consists of one input layer, two hidden layers, and one output layer. The study compares three activation functions, sigmoid, rectified linear unit (ReLU), and hyperbolic tangent (tanh), and three learning rates, 0.1, 0.5, and 0.9, representing low, moderate, and high rates, respectively. Based on the results, among the single activation functions, although sigmoid yields the lowest RMSE and MSE values, ReLU is superior in learning high traffic patterns with a learning rate of 0.9. In addition, ReLU is more powerful when used as the first function in a combination. Hence, combining a high learning rate with pure ReLU, ReLU-sigmoid, or ReLU-tanh is more suitable and recommended for predicting upper traffic utilization.
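
The abstract describes a plain backpropagation network (one input layer, two hidden layers, one output layer) whose two hidden-layer activations can be chosen independently, e.g. ReLU followed by sigmoid, and trained at a learning rate of 0.1, 0.5, or 0.9. The following is a minimal NumPy sketch of that setup, not the authors' code: the hidden-layer sizes, the sliding-window length, and the synthetic traffic series are illustrative assumptions, not details taken from the paper.

    import numpy as np

    # Activation functions and their derivatives (w.r.t. the pre-activation).
    def relu(x):      return np.maximum(0.0, x)
    def relu_d(x):    return (x > 0.0).astype(float)
    def sigmoid(x):   return 1.0 / (1.0 + np.exp(-x))
    def sigmoid_d(x): s = sigmoid(x); return s * (1.0 - s)
    def tanh_d(x):    return 1.0 - np.tanh(x) ** 2

    ACTS = {"relu": (relu, relu_d), "sigmoid": (sigmoid, sigmoid_d), "tanh": (np.tanh, tanh_d)}

    class TrafficMLP:
        """Input -> hidden1 -> hidden2 -> linear output, trained with plain backpropagation."""
        def __init__(self, n_in, n_h1, n_h2, acts=("relu", "sigmoid"), lr=0.9, seed=0):
            rng = np.random.default_rng(seed)
            self.lr = lr
            self.f1, self.df1 = ACTS[acts[0]]   # activation of the first hidden layer
            self.f2, self.df2 = ACTS[acts[1]]   # activation of the second hidden layer
            self.W1 = rng.normal(0, 0.5, (n_in, n_h1)); self.b1 = np.zeros(n_h1)
            self.W2 = rng.normal(0, 0.5, (n_h1, n_h2)); self.b2 = np.zeros(n_h2)
            self.W3 = rng.normal(0, 0.5, (n_h2, 1));    self.b3 = np.zeros(1)

        def forward(self, X):
            self.z1 = X @ self.W1 + self.b1;       self.a1 = self.f1(self.z1)
            self.z2 = self.a1 @ self.W2 + self.b2; self.a2 = self.f2(self.z2)
            return self.a2 @ self.W3 + self.b3     # linear output for regression

        def train_step(self, X, y):
            y_hat = self.forward(X)
            n = X.shape[0]
            d3 = (y_hat - y) / n                    # gradient of MSE at the output (up to a factor of 2)
            d2 = (d3 @ self.W3.T) * self.df2(self.z2)
            d1 = (d2 @ self.W2.T) * self.df1(self.z1)
            self.W3 -= self.lr * self.a2.T @ d3; self.b3 -= self.lr * d3.sum(0)
            self.W2 -= self.lr * self.a1.T @ d2; self.b2 -= self.lr * d2.sum(0)
            self.W1 -= self.lr * X.T @ d1;       self.b1 -= self.lr * d1.sum(0)
            return float(np.mean((y_hat - y) ** 2)) # MSE for monitoring

    # Usage: predict the next traffic value from a sliding window of 4 past values
    # over a toy, normalized "inbound traffic" series (assumed data, for illustration only).
    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        series = np.sin(np.linspace(0, 20, 400)) * 0.4 + 0.5 + rng.normal(0, 0.02, 400)
        window = 4
        X = np.stack([series[i:i + window] for i in range(len(series) - window)])
        y = series[window:].reshape(-1, 1)
        net = TrafficMLP(n_in=window, n_h1=8, n_h2=8, acts=("relu", "sigmoid"), lr=0.9)
        for epoch in range(500):
            mse = net.train_step(X, y)
        print(f"final MSE: {mse:.5f}, RMSE: {mse ** 0.5:.5f}")

The output layer is kept linear so the network performs regression on normalized traffic values; passing acts=("relu", "relu") or acts=("relu", "tanh") reproduces the other combinations the abstract recommends for a high learning rate.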