
Article information

  • Title: Zero‐anaphora resolution in Korean based on deep language representation model: BERT
  • Authors: Youngtae Kim ; Dongyul Ra ; Soojong Lim
  • Journal: ETRI Journal
  • Print ISSN: 1225-6463
  • Electronic ISSN: 2233-7326
  • Publication year: 2020
  • Volume: 43
  • Issue: 2
  • Pages: 299-312
  • DOI: 10.4218/etrij.2019-0441
  • Language: English
  • Publisher: Electronics and Telecommunications Research Institute
  • Abstract: High performance in zero anaphora resolution (ZAR) is necessary to fully understand texts in Korean, Japanese, Chinese, and various other languages. Owing to the success of deep learning in recent years, deep‐learning‐based models are being employed for building ZAR systems. However, the objective of building a high‐quality ZAR system is far from being achieved even with these models. To enhance current ZAR techniques, we fine‐tuned a pre‐trained BERT (bidirectional encoder representations from transformers) model. Notably, BERT is a general language representation model that enables systems to utilize deep bidirectional contextual information in a natural language text. It extensively exploits the attention mechanism based upon the sequence‐transduction model Transformer. In our model, classification is performed simultaneously for all the words in the input word sequence to decide whether each word can be an antecedent. We seek end‐to‐end learning by disallowing any use of hand‐crafted or dependency‐parsing features. Experimental results show that, compared with other models, our approach can significantly improve the performance of ZAR.
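
The abstract describes fine-tuning a pre-trained BERT encoder so that every token in the input sequence is simultaneously classified as a possible antecedent of a zero anaphor, with no hand-crafted or dependency-parsing features. The following is a minimal sketch of that token-level classification idea, not the authors' released code: the multilingual checkpoint name, the single linear head, and the example sentence are illustrative assumptions.

```python
# Sketch: BERT encoder + per-token binary classifier ("can this token be an antecedent?").
# Requires: torch, transformers.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast


class ZarAntecedentScorer(nn.Module):
    def __init__(self, pretrained_name: str = "bert-base-multilingual-cased"):
        super().__init__()
        # Pre-trained BERT supplies deep bidirectional contextual representations.
        self.encoder = BertModel.from_pretrained(pretrained_name)
        # One logit per token; trained end-to-end together with the encoder.
        self.classifier = nn.Linear(self.encoder.config.hidden_size, 1)

    def forward(self, input_ids, attention_mask):
        hidden = self.encoder(input_ids=input_ids,
                              attention_mask=attention_mask).last_hidden_state
        # Classify all tokens simultaneously; no hand-crafted or parsing features.
        return self.classifier(hidden).squeeze(-1)  # (batch, seq_len) logits


if __name__ == "__main__":
    tokenizer = BertTokenizerFast.from_pretrained("bert-base-multilingual-cased")
    model = ZarAntecedentScorer()
    # Hypothetical Korean example: the subject of the second clause is a zero anaphor.
    batch = tokenizer(["철수는 빵을 샀다. 그리고 먹었다."],
                      return_tensors="pt", padding=True)
    logits = model(batch["input_ids"], batch["attention_mask"])
    # Per-token antecedent probabilities; training would use a token-level BCE loss.
    print(torch.sigmoid(logits))
```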