
Article Information

  • Title: Impact of Variances of Random Weights and Biases on Extreme Learning Machine
  • Authors: Xiao Tao; Xu Zhou; Yu Lin He
  • Journal: Journal of Software
  • Print ISSN: 1796-217X
  • Year: 2016
  • Volume: 11
  • Issue: 5
  • Pages: 440-454
  • DOI: 10.17706/jsw.11.5.440-454
  • Publisher: Academy Publisher
  • Abstract: Although the uniform convergence of the extreme learning machine (ELM) has been proved for any continuous probability distribution, the variances of the random numbers used to initialize the input-layer weights and hidden-layer biases do have an obvious impact on the generalization performance of ELM. In this paper, we validate this effect by testing the classification accuracies of ELMs initialized with random numbers of different variances. We select three commonly used probability distributions (i.e., Uniform, Gamma, and Normal) and 30 UCI data sets to conduct our comparative study. The experimental results yield several important and valuable observations: (1) Uniform and Gamma distributions with smaller variances usually lead ELMs to higher training and testing accuracies; (2) in comparison with the Normal distribution, the variances of the Uniform and Gamma distributions have a significant impact on the classification performance of ELMs; (3) Uniform and Gamma distributions with larger variances can seriously degrade the classification capability of ELMs; (4) ELMs initialized by Uniform and Gamma distributions with larger variances generally need more hidden-layer nodes to achieve classification accuracies equivalent to those initialized with smaller variances; and (5) the Normal distribution more easily leads to over-fitting of ELMs.
  • Keywords: Extreme learning machine, random initialization, probability distribution, variance.
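The abstract's experimental setup can be illustrated with a minimal ELM: random input weights and hidden biases are drawn once (never trained), and only the output weights are solved in closed form via the Moore-Penrose pseudo-inverse. The sketch below, assuming NumPy and a toy two-class data set (the function name, data, and variance values are illustrative, not taken from the paper), varies the scale of a Normal initializer to show how initialization variance enters the model:

```python
import numpy as np

def elm_fit_predict(X_train, y_train, X_test, n_hidden, rng, scale):
    """Basic ELM: fixed random hidden layer + least-squares output weights."""
    d = X_train.shape[1]
    # Random input weights and biases; `scale` controls their variance
    # (the quantity studied in the paper).
    W = rng.normal(0.0, scale, size=(d, n_hidden))
    b = rng.normal(0.0, scale, size=n_hidden)
    H_train = np.tanh(X_train @ W + b)        # hidden-layer output matrix
    beta = np.linalg.pinv(H_train) @ y_train  # output weights via pseudo-inverse
    H_test = np.tanh(X_test @ W + b)
    return H_test @ beta

# Toy two-class problem: the label is the sign of the first feature.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] > 0).astype(float)
Xtr, ytr, Xte, yte = X[:150], y[:150], X[150:], y[150:]

for scale in (0.1, 1.0, 10.0):  # small vs. large initialization variance
    pred = elm_fit_predict(Xtr, ytr, Xte, 50, np.random.default_rng(1), scale)
    acc = np.mean((pred > 0.5) == (yte > 0.5))
    print(f"scale={scale}: test accuracy {acc:.2f}")
```

With a large `scale`, the tanh units saturate and the hidden representation coarsens, which is one mechanism behind the degradation the paper reports for large-variance Uniform and Gamma initializations; the paper's actual study compares the three distributions across 30 UCI data sets.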