Journal: International Journal on Computer Science and Engineering
Print ISSN: 2229-5631
Online ISSN: 0975-3397
Year: 2010
Volume: 2
Issue: 1 Supplementary
Pages: 8-13
Publisher: Engg Journals Publications
Abstract: The purpose of this work is to analyze the performance of the back-propagation feed-forward algorithm using different activation functions for the neurons of the hidden and output layers while varying the number of neurons in the hidden layer. For sample creation, 250 numerals were gathered from 35 people of different ages, both male and female. After binarization, these numerals were combined to form training patterns for the neural network. The network was trained to learn its behavior by adjusting the connection strengths at every iteration. The conjugate gradient descent of each presented training pattern was calculated to identify the minima on the error surface for that pattern. Experiments were performed by selecting different combinations of two activation functions out of the three functions logsig, tansig, and purelin for the neurons of the hidden and output layers. The results revealed that as the number of neurons in the hidden layer is increased, the network trains in fewer epochs, and the percentage recognition accuracy of the neural network increases up to a certain level and then starts decreasing once the number of hidden neurons exceeds that level.
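The setup the abstract describes — a single hidden layer, a choice of logsig/tansig/purelin activations (the MATLAB Neural Network Toolbox names), and back-propagation of the output error — can be sketched minimally as below. This is an illustrative NumPy implementation under assumptions of my own: it uses plain batch gradient descent with a squared-error loss rather than the paper's conjugate gradient method, and the learning rate, epoch count, and XOR demo data are arbitrary choices, not the paper's numeral dataset.

```python
import numpy as np

# Activation functions, named after MATLAB's Neural Network Toolbox
# as the paper's terminology (logsig, tansig, purelin) suggests.
def logsig(x):  return 1.0 / (1.0 + np.exp(-x))   # logistic sigmoid
def tansig(x):  return np.tanh(x)                 # hyperbolic tangent
def purelin(x): return x                          # linear (identity)

# Derivatives, written in terms of the activation's own output a = f(x).
DERIV = {
    logsig:  lambda a: a * (1.0 - a),
    tansig:  lambda a: 1.0 - a ** 2,
    purelin: lambda a: np.ones_like(a),
}

def train_mlp(X, T, n_hidden, f_hidden, f_out, lr=0.5, epochs=5000, seed=0):
    """Back-propagation for one hidden layer (plain gradient descent;
    the paper itself used conjugate gradient descent)."""
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0.0, 0.5, (X.shape[1], n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.5, (n_hidden, T.shape[1]))
    b2 = np.zeros(T.shape[1])
    for _ in range(epochs):
        # Forward pass through hidden and output layers.
        h = f_hidden(X @ W1 + b1)
        y = f_out(h @ W2 + b2)
        # Backward pass: propagate the squared-error gradient.
        d_out = (y - T) * DERIV[f_out](y)
        d_hid = (d_out @ W2.T) * DERIV[f_hidden](h)
        # Adjust connection strengths at every iteration.
        W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
        W1 -= lr * (X.T @ d_hid); b1 -= lr * d_hid.sum(axis=0)
    return lambda Z: f_out(f_hidden(Z @ W1 + b1) @ W2 + b2)

# Toy usage: learn XOR with a tansig hidden layer and a logsig output,
# standing in for one of the paper's activation-function combinations.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
net = train_mlp(X, T, n_hidden=4, f_hidden=tansig, f_out=logsig)
```

Varying `n_hidden` and the `(f_hidden, f_out)` pair reproduces, in miniature, the experimental axes the abstract describes.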