
Article Information

  • Title: Bayesian Statistics as an Alternative to Gradient Descent in Sequence Learning
  • Alternative title: Bayesian Statistics as an Alternative to Gradient Descent in Sequence Learning
  • Author: Rainer Spiegel
  • Journal: International Journal of Emerging Technologies in Learning (iJET)
  • Print ISSN: 1863-0383
  • Year: 2007
  • Volume: 2
  • Issue: 3
  • Language: English
  • Publisher: Kassel University Press
  • Abstract: Recurrent neural networks are frequently used to simulate sequence-learning applications such as language processing and sensory-motor learning. For this purpose, they often apply a truncated gradient-descent (error-correcting) learning algorithm. Converging to a solution congruent with a target set of sequences typically requires many iterations of sequence presentation and weight adjustment. Moreover, there is no guarantee of finding the global minimum of error in the multidimensional error landscape that results from the discrepancy between the target values and the network's predictions. This paper presents a new approach that infers the global error minimum right from the start and then applies this information to reverse-engineer the weights. As a consequence, learning is sped up tremendously, and computationally expensive iterative training trials can be skipped. Technology applications in established and emerging industries are discussed.
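The contrast the abstract draws can be illustrated with a toy problem. The sketch below is not the paper's Bayesian derivation; it only shows, for a hypothetical linear readout (hidden states `H`, targets `T`), the difference between many iterations of gradient descent and directly solving for the weights at the global minimum of the squared error, which is the kind of "reverse-engineering" shortcut the abstract describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear readout: predict targets T from hidden-state activations H.
# (Illustrative stand-in for an RNN output layer; H, T, w are assumptions.)
H = rng.standard_normal((200, 10))   # 200 time steps, 10 hidden units
w_true = rng.standard_normal(10)
T = H @ w_true                        # target outputs

# 1) Iterative gradient descent: repeated weight adjustments.
w_gd = np.zeros(10)
lr = 0.01
for _ in range(500):
    grad = H.T @ (H @ w_gd - T) / len(T)  # gradient of mean squared error
    w_gd -= lr * grad

# 2) Direct solve: obtain the global error minimum in one step by
# solving the least-squares problem, skipping iterative training.
w_direct, *_ = np.linalg.lstsq(H, T, rcond=None)

print(np.allclose(w_direct, w_true))  # direct solve recovers the weights
```

The direct solve lands exactly on the global minimum of this convex toy problem in a single linear-algebra step, whereas gradient descent only approaches it over many iterations; for nonconvex RNN error landscapes, gradient descent additionally risks local minima, which is the motivation the abstract gives.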