
Article Information

  • Title: An Improved the Performance of GRU Model based on Batch Normalization for Sentence Classification
  • Authors: Muhammad Zulqarnain; Rozaida Ghazali; Shihab Hamad Khaleefah; Ayesha Rehan
  • Journal: International Journal of Computer Science and Network Security
  • Print ISSN: 1738-7906
  • Year: 2019
  • Volume: 19
  • Issue: 9
  • Pages: 176-186
  • Publisher: International Journal of Computer Science and Network Security
  • Abstract: Sentiment classification is a popular task for identifying user opinions and is widely applied in Natural Language Processing (NLP). The Gated Recurrent Unit (GRU) has been successfully applied to NLP tasks with outstanding results. GRU networks perform well on sequential learning tasks and mitigate the vanishing and exploding gradient problems of standard recurrent neural networks (RNNs). In this paper, we improve the efficiency of the GRU framework by applying batch normalization and replacing the traditional tanh activation function with Leaky ReLU (LReLU). Empirically, we show that our model, with modest hyperparameter tuning, obtains excellent results on benchmark datasets for sentiment classification. The proposed BN-GRU model performs well compared to various existing approaches in terms of accuracy and loss. The experimental results show that the proposed model outperforms several state-of-the-art approaches on two benchmark datasets: 82.4% accuracy on the IMDB dataset, and 88.1% binary classification accuracy and 49.9% fine-grained accuracy on the SSTb dataset. These results indicate that the proposed model minimizes the loss function and captures long-term dependencies with a compact architecture, achieving superior performance with significantly fewer parameters.
  • Keywords: RNN; GRU; Batch Normalization; Long-term dependencies; Sentence Classification
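The abstract describes a GRU cell modified in two ways: batch normalization applied to the input projections, and Leaky ReLU replacing tanh for the candidate state. The following is a minimal NumPy sketch of one such recurrent step; the paper does not specify exactly where batch normalization is inserted, so the placement (on the input-to-hidden projections only), the parameter names, and the `alpha` value are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def leaky_relu(x, alpha=0.01):
    # LReLU: identity for positive inputs, small slope for negative ones
    return np.where(x > 0, x, alpha * x)

def batch_norm(x, eps=1e-5):
    # Normalize each feature over the batch dimension (no learned scale/shift,
    # for simplicity of the sketch)
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def bn_gru_step(x, h_prev, params):
    """One step of a GRU cell with batch-normalized input projections and
    LeakyReLU in place of tanh for the candidate hidden state.

    x:      (batch, input_dim) input at the current time step
    h_prev: (batch, hidden_dim) previous hidden state
    params: dict of weight matrices Wz, Uz, Wr, Ur, Wh, Uh
    """
    Wz, Uz = params["Wz"], params["Uz"]
    Wr, Ur = params["Wr"], params["Ur"]
    Wh, Uh = params["Wh"], params["Uh"]

    z = sigmoid(batch_norm(x @ Wz) + h_prev @ Uz)        # update gate
    r = sigmoid(batch_norm(x @ Wr) + h_prev @ Ur)        # reset gate
    h_tilde = leaky_relu(batch_norm(x @ Wh) + (r * h_prev) @ Uh)  # candidate
    return (1.0 - z) * h_prev + z * h_tilde              # interpolated state
```

In a standard GRU the candidate state would be `tanh(...)`; swapping in LeakyReLU removes the saturation of tanh, which is the change the abstract credits (together with batch normalization) for the reported gains.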
© National Center for Philosophy and Social Sciences Documentation. All rights reserved.