Article Information

  • Title: A Randomized Hyperparameter Tuning of Adaptive Moment Estimation Optimizer of Binary Tree-Structured LSTM
  • Authors: Ruo Ando ; Yoshiyasu Takefuji
  • Journal: International Journal of Advanced Computer Science and Applications (IJACSA)
  • Print ISSN: 2158-107X
  • Electronic ISSN: 2156-5570
  • Year: 2021
  • Volume: 12
  • Issue: 7
  • DOI: 10.14569/IJACSA.2021.0120771
  • Language: English
  • Publisher: Science and Information Society (SAI)
  • Abstract: Adam (Adaptive Moment Estimation) is one of the most promising techniques for parameter optimization in deep learning, because it is an adaptive learning rate method that is easier to use than plain gradient descent. In this paper, we propose a novel randomized search method for Adam that randomizes the parameters beta1 and beta2. Random noise drawn from a normal distribution is added to beta1 and beta2 each time the update function is called. In our experiment, we implemented a binary tree-structured LSTM and an Adam optimizer function. It turned out that, in the best case, randomized hyperparameter tuning with beta1 ranging from 0.88 to 0.92 and beta2 ranging from 0.9980 to 0.9999 is 3.81 times faster than the fixed parameters beta1 = 0.9 and beta2 = 0.999. Our method is independent of the optimization algorithm and therefore also performs well with other algorithms such as NAG, AdaGrad, and RMSProp.
  • Keywords: Adaptive moment estimation; gradient descent; tree-structured LSTM; hyperparameter tuning
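
The core idea in the abstract can be sketched as a single Adam update step in which beta1 and beta2 are resampled on every call. This is a minimal illustration, not the paper's implementation: the sampling scheme (normal noise centered on the midpoint of the quoted ranges, clipped to stay inside them) and the standard deviations are assumptions; only the beta ranges come from the abstract.

```python
import numpy as np

def randomized_adam_step(param, grad, m, v, t, rng,
                         lr=0.001, eps=1e-8,
                         beta1_range=(0.88, 0.92),
                         beta2_range=(0.9980, 0.9999)):
    """One Adam update with beta1/beta2 resampled each call.

    Normal noise is added around the midpoint of each range and the
    result is clipped into the range (a hypothetical choice; the paper
    only states that normal noise perturbs beta1 and beta2 each step).
    """
    beta1 = float(np.clip(rng.normal(np.mean(beta1_range), 0.01),
                          *beta1_range))
    beta2 = float(np.clip(rng.normal(np.mean(beta2_range), 0.0005),
                          *beta2_range))
    m = beta1 * m + (1 - beta1) * grad           # first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second-moment estimate
    m_hat = m / (1 - beta1 ** t)                 # bias correction
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

Because the randomization lives entirely inside the per-step beta sampling, the same wrapper pattern could be applied to other moment-based update rules, which is consistent with the abstract's claim of algorithm independence.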