Journal: Journal of Intelligent Learning Systems and Applications
Print ISSN: 2150-8402
Online ISSN: 2150-8410
Year: 2011
Volume: 3
Issue: 4
Pages: 242-248
DOI: 10.4236/jilsa.2011.34027
Publisher: Scientific Research Publishing
Abstract: In this paper, an efficient weight initialization method based on sensitivity analysis is proposed using Cauchy's inequality to improve the convergence speed of single hidden layer feedforward neural networks. The proposed method ensures that the outputs of the hidden neurons lie in the active region, which increases the rate of convergence. The weights are then learned by minimizing the sum of squared errors and are obtained by solving a linear system of equations. The proposed method is simulated on various problems. In all of these problems, the number of epochs and the time required by the proposed method are found to be the lowest among the compared weight initialization methods.
Keywords: Weight Initialization; Backpropagation; Feedforward Neural Network; Cauchy's Inequality; Linear System of Equations
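The abstract describes two steps: bounding each hidden neuron's net input via Cauchy's inequality so that its output stays in the activation function's active region, and obtaining weights from a least-squares linear system. The sketch below illustrates this idea in Python with NumPy; the active-region limit of 4 for the logistic sigmoid, the uniform random initialization, the per-neuron scaling rule, and all function names are assumptions chosen for illustration, not the paper's exact formulation.

```python
# Illustrative sketch only: the exact bound derived in the paper is not stated in
# the abstract, so the active-region limit (|net| <= 4 for the logistic sigmoid)
# and the scaling rule below are assumptions for demonstration.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def init_hidden_weights(X, n_hidden, net_limit=4.0, seed=None):
    """Scale random input-to-hidden weights so that, by the Cauchy-Schwarz
    inequality |w.x + b| <= ||(w, b)|| * ||(x, 1)||, every hidden net input
    stays within the active region [-net_limit, net_limit]."""
    rng = np.random.default_rng(seed)
    n_in = X.shape[1]
    W = rng.uniform(-1.0, 1.0, size=(n_in, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    x_max = np.max(np.sqrt(np.sum(X ** 2, axis=1) + 1.0))   # largest augmented input norm
    w_norm = np.sqrt(np.linalg.norm(W, axis=0) ** 2 + b ** 2)
    scale = net_limit / (w_norm * x_max)                      # per-neuron scaling factor
    return W * scale, b * scale

def fit_output_weights(H, T):
    """Hidden-to-output weights from the least-squares solution of [H, 1] V = T."""
    H_aug = np.hstack([H, np.ones((H.shape[0], 1))])
    V, *_ = np.linalg.lstsq(H_aug, T, rcond=None)
    return V

# Usage on a toy regression problem.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
T = np.sin(X[:, :1]) + 0.5 * X[:, 1:]
W, b = init_hidden_weights(X, n_hidden=10, seed=0)
H = sigmoid(X @ W + b)             # hidden outputs, kept inside the active region
V = fit_output_weights(H, T)
Y = np.hstack([H, np.ones((len(H), 1))]) @ V
print("training MSE:", np.mean((Y - T) ** 2))
```

Keeping the net inputs inside the active region avoids the flat saturation zones of the sigmoid, where gradients vanish, which is the mechanism the abstract credits for the faster convergence.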