Article Information

  • Title: Developing a Novel Recurrent Neural Network Architecture with Fewer Parameters and Good Learning Performance
  • Authors: Kazunori D YAMADA; Fangzhou LIN; Tsukasa NAKAMURA
  • Journal: Interdisciplinary Information Sciences
  • Print ISSN: 1340-9050
  • Online ISSN: 1347-6157
  • Year: 2021
  • Volume: 27
  • Issue: 1
  • Pages: 25-40
  • DOI: 10.4036/iis.2020.R.01
  • Publisher: The Editorial Committee of the Interdisciplinary Information Sciences
  • Abstract: Recurrent neural networks (RNNs) are among the most promising of the many artificial intelligence techniques now under development, showing great potential for memory, interaction, and linguistic understanding. Among the more sophisticated RNNs are long short-term memory (LSTM) and gated recurrent units (GRUs), which emulate animal brain behavior; these methods achieve superior memory and learning speed because of the well-designed core structure of their architectures. In this study, we attempted to improve further on that core structure and to develop a novel, compact architecture with a high learning speed. We stochastically generated 30000 RNN architectures, evaluated their performance, and selected the one most capable of memorizing long contexts with relatively few parameters. This RNN, YamRNN, had at most two-thirds as many parameters as LSTM and GRU (see the illustrative parameter-count sketch after this list), and on a sequence classification task it reduced the time required to reach the same learning performance as LSTM and GRU by up to 80%. This novel RNN architecture is expected to be useful for problems such as prediction and analysis of contextual data, and it also suggests that there is room for the development of better architectures.
  • Keywords: recurrent neural network; compact architecture; memory power; learning speed
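
The abstract's parameter comparison follows directly from how many weight blocks a gated cell stacks. The sketch below is a minimal illustration of that accounting, not the paper's actual YamRNN equations: it assumes each block maps the concatenated input and hidden state to the hidden dimension with a bias, giving LSTM four blocks and GRU three, so a hypothetical two-block compact cell has two-thirds of GRU's parameters and half of LSTM's, consistent with the "two-thirds or better" figure. The dimensions used are arbitrary placeholders, not values from the paper.

```python
def gated_cell_params(num_blocks: int, input_dim: int, hidden_dim: int) -> int:
    """Parameters of a gated RNN cell built from `num_blocks` weight blocks,
    each mapping the concatenated [input; hidden] vector to `hidden_dim`
    units plus a bias term."""
    return num_blocks * (hidden_dim * (input_dim + hidden_dim) + hidden_dim)

x_dim, h_dim = 128, 256  # illustrative sizes, not taken from the paper

lstm = gated_cell_params(4, x_dim, h_dim)     # LSTM: 3 gates + candidate state
gru = gated_cell_params(3, x_dim, h_dim)      # GRU: 2 gates + candidate state
compact = gated_cell_params(2, x_dim, h_dim)  # hypothetical two-block cell

print(f"LSTM:    {lstm:,}")
print(f"GRU:     {gru:,}")
print(f"compact: {compact:,}  "
      f"({compact / lstm:.2f}x LSTM, {compact / gru:.2f}x GRU)")
```

Running this prints ratios of 0.50x LSTM and 0.67x GRU; the paper's architecture search over 30000 stochastically generated candidates selected a cell with this kind of reduced block count while preserving the ability to memorize long contexts.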