
Article Information

  • Title: Transformer based Contextual Model for Sentiment Analysis of Customer Reviews: A Fine-tuned BERT
  • Authors: Ashok Kumar Durairaj; Anandan Chinnalagu
  • Journal: International Journal of Advanced Computer Science and Applications (IJACSA)
  • Print ISSN: 2158-107X
  • Electronic ISSN: 2156-5570
  • Year: 2021
  • Volume: 12
  • Issue: 11
  • DOI: 10.14569/IJACSA.2021.0121153
  • Language: English
  • Publisher: Science and Information Society (SAI)
  • Abstract: The Bidirectional Encoder Representations from Transformers (BERT) is a state-of-the-art language model used for multiple natural language processing tasks and sequential modeling applications. Accurate context-based sentiment prediction and analysis of customer review data from various social media platforms are challenging and time-consuming tasks due to the high volume of unstructured data. In recent years, more research has been conducted based on recurrent neural network algorithms such as Long Short-Term Memory (LSTM) and Bidirectional LSTM (BiLSTM), as well as hybrid, neural, and traditional text classification algorithms. This paper presents our experimental research work to overcome the known challenges of sentiment analysis models, such as their performance, accuracy, and context-based predictions. We propose a fine-tuned BERT model to predict customer sentiment using customer reviews from Twitter, IMDB Movie Reviews, Yelp, and Amazon. In addition, we compare the results of the proposed model with our custom Linear Support Vector Machine (LSVM), fastText, BiLSTM, and hybrid fastText-BiLSTM models, and present a comparative analysis dashboard report. The experimental results show that the proposed model performs better than the other models with respect to various performance measures.
  • Keywords: Transformers model; BERT; sequential model; deep learning; RNN; LSVM; LSTM; BiLSTM; fastText