
Article Information

  • Title: Optimal Feed Forward MLP Architecture for Off-Line Cursive Numeral Recognition
  • Authors: Amit Choudhary; Rahul Rishi; Savita Ahlawat
  • Journal: International Journal on Computer Science and Engineering
  • Print ISSN: 2229-5631
  • Online ISSN: 0975-3397
  • Year: 2010
  • Volume: 2
  • Issue: 1 Supplementary
  • Pages: 1-7
  • Publisher: Engg Journals Publications
  • Abstract: The purpose of this work is to analyze the performance of the back-propagation feed-forward algorithm using different activation functions for the neurons of the hidden and output layers while varying the number of neurons in the hidden layer. For sample creation, 250 numerals were gathered from 35 people of different ages, both male and female. After binarization, these numerals were combined to form training patterns for the neural network. The network was trained by adjusting the connection strengths at every iteration, and the conjugate gradient descent for each presented training pattern was calculated to identify the minimum on the error surface. Experiments were performed by selecting different combinations of two activation functions, drawn from logsig, tansig and purelin, for the neurons of the hidden and output layers. The results revealed that as the number of hidden neurons increases, the network trains in fewer epochs and the recognition accuracy rises up to a certain level; beyond that level, accuracy begins to decrease. (An illustrative sketch of this experimental setup appears after the keyword list below.)
  • Keywords: Numeral Recognition; MLP; Hidden Layers; Backpropagation; Conjugate Gradient Descent; Activation Functions.
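
The abstract describes a one-hidden-layer MLP whose hidden-layer size and activation-function pair (logsig, tansig, purelin) are varied experimentally. The sketch below is a minimal reconstruction of that kind of setup, not the authors' code: plain batch gradient descent stands in for their conjugate-gradient training, random binary arrays stand in for the 250 binarized numeral samples, and all names and parameter values are illustrative assumptions.

```python
# Minimal sketch of the experiment in the abstract: a one-hidden-layer MLP
# swept over hidden-layer sizes and activation-function pairs.
import numpy as np

rng = np.random.default_rng(0)

# Activation functions, named as in MATLAB, with derivatives expressed in
# terms of the activation output.
ACTIVATIONS = {
    "logsig":  (lambda x: 1.0 / (1.0 + np.exp(-x)), lambda y: y * (1.0 - y)),
    "tansig":  (np.tanh,                            lambda y: 1.0 - y ** 2),
    "purelin": (lambda x: x,                        lambda y: np.ones_like(y)),
}

def train_mlp(X, T, n_hidden, hid_act, out_act, lr=0.1, epochs=500):
    """Train a 1-hidden-layer MLP with backpropagation (batch gradient descent)."""
    f_h, df_h = ACTIVATIONS[hid_act]
    f_o, df_o = ACTIVATIONS[out_act]
    n_in, n_out = X.shape[1], T.shape[1]
    W1 = rng.normal(0, 0.1, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.1, (n_hidden, n_out)); b2 = np.zeros(n_out)
    for _ in range(epochs):
        H = f_h(X @ W1 + b1)          # hidden-layer outputs
        Y = f_o(H @ W2 + b2)          # network outputs
        # Backpropagate the squared error through both layers.
        d_out = (Y - T) * df_o(Y)
        d_hid = (d_out @ W2.T) * df_h(H)
        W2 -= lr * H.T @ d_out / len(X); b2 -= lr * d_out.mean(0)
        W1 -= lr * X.T @ d_hid / len(X); b1 -= lr * d_hid.mean(0)
    return W1, b1, W2, b2

def accuracy(params, X, labels, hid_act, out_act):
    W1, b1, W2, b2 = params
    f_h, _ = ACTIVATIONS[hid_act]
    f_o, _ = ACTIVATIONS[out_act]
    Y = f_o(f_h(X @ W1 + b1) @ W2 + b2)
    return (Y.argmax(1) == labels).mean()

# Placeholder data: random binary "images" in 10 classes, standing in for the
# binarized handwritten numerals used in the paper.
X = rng.integers(0, 2, (250, 64)).astype(float)
labels = rng.integers(0, 10, 250)
T = np.eye(10)[labels]

# Sweep hidden-layer sizes and hidden/output activation pairs.
for n_hidden in (5, 10, 20, 40):
    for hid_act, out_act in (("tansig", "purelin"),
                             ("logsig", "purelin"),
                             ("tansig", "logsig")):
        params = train_mlp(X, T, n_hidden, hid_act, out_act)
        acc = accuracy(params, X, labels, hid_act, out_act)
        print(f"hidden={n_hidden:3d}  {hid_act}/{out_act}:  train acc = {acc:.2f}")
```

With real numeral data, the printed sweep would reproduce the trend reported in the abstract: accuracy improves as hidden neurons are added, then degrades once the hidden layer grows past a certain size.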