
Basic Article Information

  • Title: Back Propagation Algorithm: The Best Algorithm Among the Multi-layer Perceptron Algorithm
  • Authors: Mutasem Khalil Sari Alsmadi; Khairuddin Bin Omar; Shahrul Azman Noah
  • Journal: International Journal of Computer Science and Network Security
  • Print ISSN: 1738-7906
  • Year of publication: 2009
  • Volume: 9
  • Issue: 4
  • Pages: 378-383
  • Publisher: International Journal of Computer Science and Network Security
  • Abstract: A multilayer perceptron is a feedforward artificial neural network model that maps sets of input data onto a set of appropriate outputs. It is a modification of the standard linear perceptron in that it uses three or more layers of neurons (nodes) with nonlinear activation functions, and it is more powerful than the perceptron in that it can distinguish data that are not linearly separable, i.e., not separable by a hyperplane. MLP networks are general-purpose, flexible, nonlinear models consisting of a number of units organized into multiple layers. The complexity of an MLP network can be changed by varying the number of layers and the number of units in each layer. Given enough hidden units and enough data, it has been shown that MLPs can approximate virtually any function to any desired accuracy. This study presents a performance comparison among multi-layer perceptron training algorithms (back propagation, delta rule, and perceptron). The perceptron is a steepest-descent-type algorithm that normally has a slow convergence rate, and its search for the global minimum often becomes trapped at poor local minima. The current study investigates the performance of three algorithms to train MLP networks. It was found that the back propagation algorithm performs much better than the other algorithms. (An illustrative back propagation sketch follows this record.)
  • Keywords: Back propagation; perceptron; delta rule learning; classification
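
To make the abstract's description concrete, the sketch below trains a one-hidden-layer MLP with back propagation on the XOR problem, a classic case that is not linearly separable and thus beyond a single perceptron. This is a minimal illustration, not the authors' code: the 2-4-1 network size, sigmoid activations, learning rate, epoch count, and XOR data are all assumptions made for the example.

```python
# Minimal sketch of back propagation for a one-hidden-layer MLP (assumed toy setup).
import numpy as np

rng = np.random.default_rng(0)

# XOR data: inputs and targets (illustrative, not from the paper).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Network shape: 2 inputs -> 4 hidden units -> 1 output (hyperparameters assumed).
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros(1)
lr = 0.5

for epoch in range(10000):
    # Forward pass through the nonlinear hidden layer and the output layer.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the output error back through the layers
    # (gradient of squared error with sigmoid activations).
    err_out = (out - y) * out * (1 - out)
    err_h = (err_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates.
    W2 -= lr * h.T @ err_out
    b2 -= lr * err_out.sum(axis=0)
    W1 -= lr * X.T @ err_h
    b1 -= lr * err_h.sum(axis=0)

print(np.round(out, 2))  # outputs should approach [0, 1, 1, 0]
```

The hidden layer with a nonlinear activation is what lets this network separate the XOR classes; removing it reduces the model to a plain perceptron, which cannot fit this data.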