Article Information

  • Title: Attention-Based Transformer-BiGRU for Question Classification
  • Authors: Dongfang Han; Turdi Tohti; Askar Hamdulla
  • Journal: Information
  • e-ISSN: 2078-2489
  • Year: 2022
  • Volume: 13
  • Issue: 5
  • Pages: 214
  • DOI: 10.3390/info13050214
  • Language: English
  • Publisher: MDPI
  • Abstract: A question answering (QA) system is a research direction in the field of artificial intelligence and natural language processing (NLP) that has attracted much attention and has broad development prospects. As one of the main components of a QA system, the accuracy of question classification plays a key role in the entire QA task. Therefore, both traditional machine learning methods and today's deep learning methods are widely used and deeply studied in question classification tasks. This paper presents our work on two aspects of Chinese question classification. The first is an answer-driven method for building a richer Chinese question classification dataset, addressing the small scale of the existing experimental dataset; this has reference value for dataset expansion, especially for the construction of low-resource language datasets. The second is a deep learning model for question classification with a Transformer + Bi-GRU + Attention structure. The Transformer has strong learning and encoding ability, but it adopts a fixed encoding length: long text is divided into multiple segments, each segment is encoded separately, and no interaction occurs between segments. We achieve information interaction between segments through the Bi-GRU so as to improve the encoding of long sentences. The Attention mechanism is added to highlight the key semantics in questions that contain answers. The experimental results show that the proposed model significantly improves the accuracy of question classification.
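The abstract describes the architecture only at a high level. The PyTorch sketch below illustrates one plausible reading of the Transformer + Bi-GRU + Attention pipeline: segments are encoded independently by a Transformer encoder, a Bi-GRU then carries information across segment boundaries, and additive attention pooling weights the tokens before classification. This is a minimal illustration, not the authors' implementation; the class name and all hyperparameters (d_model, seg_len, vocab_size, the number of classes, the attention formulation) are assumptions for demonstration.

```python
# Minimal sketch of a Transformer + Bi-GRU + Attention question classifier,
# following the structure described in the abstract. All hyperparameters are
# illustrative assumptions, not the paper's settings.
import torch
import torch.nn as nn

class TransformerBiGRUAttention(nn.Module):
    def __init__(self, vocab_size, num_classes, d_model=128, nhead=4,
                 num_layers=2, seg_len=16, gru_hidden=128):
        super().__init__()
        self.seg_len = seg_len
        self.embed = nn.Embedding(vocab_size, d_model, padding_idx=0)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead,
                                               dim_feedforward=256,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        # Bi-GRU runs over the full token sequence so information can flow
        # across the segment boundaries that the Transformer encoded separately.
        self.bigru = nn.GRU(d_model, gru_hidden, batch_first=True,
                            bidirectional=True)
        # Additive attention pooling to highlight key tokens (assumed form).
        self.attn = nn.Linear(2 * gru_hidden, 1)
        self.fc = nn.Linear(2 * gru_hidden, num_classes)

    def forward(self, x):                        # x: (batch, seq_len) token ids
        h = self.embed(x)                        # (batch, seq_len, d_model)
        # Encode each fixed-length segment independently, as the abstract
        # describes for the Transformer stage (no cross-segment interaction).
        segments = h.split(self.seg_len, dim=1)
        h = torch.cat([self.encoder(s) for s in segments], dim=1)
        h, _ = self.bigru(h)                     # cross-segment interaction
        w = torch.softmax(self.attn(h), dim=1)   # (batch, seq_len, 1) weights
        pooled = (w * h).sum(dim=1)              # attention-weighted sum
        return self.fc(pooled)

model = TransformerBiGRUAttention(vocab_size=5000, num_classes=7)
logits = model(torch.randint(1, 5000, (2, 32)))  # two questions, 32 tokens each
print(logits.shape)                              # torch.Size([2, 7])
```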