
Article Information

  • Title: Injecting Event Knowledge into Pre-Trained Language Models for Event Extraction
  • Authors: Zining Yang; Siyu Zhan; Mengshu Hou
  • Journal: Computer Science & Information Technology
  • E-ISSN: 2231-5403
  • Year: 2020
  • Volume: 10
  • Issue: 14
  • Pages: 41-50
  • DOI: 10.5121/csit.2020.101404
  • Publisher: Academy & Industry Research Collaboration Center (AIRCC)
  • Abstract: Pre-trained language models have recently achieved great success on many NLP tasks. In this paper, we propose an event extraction system based on the pre-trained language model BERT that extracts both event triggers and arguments. As with any deep-learning-based method, the size of the training dataset has a crucial impact on performance. To address the lack of training data for event extraction, we further train the pre-trained language model on a carefully constructed in-domain corpus, injecting event knowledge into our event extraction system with minimal effort. Empirical evaluation on the ACE2005 dataset shows that injecting event knowledge significantly improves event extraction performance.
  • Keywords: Natural Language Processing; Event Extraction; BERT; Lacking Training Data Problem
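
The "further training" the abstract describes is continued masked-language-model pre-training on in-domain text. The paper does not give implementation details, so the following is only a minimal, self-contained sketch of the BERT-style token-masking step at the heart of that procedure (the `mask_tokens` helper, the toy sentence, and the 80/10/10 split parameters are illustrative assumptions, not the authors' code):

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, rng=None):
    """BERT-style dynamic masking: select ~mask_prob of positions as
    prediction targets; of those, replace 80% with [MASK], 10% with a
    random vocabulary token, and leave 10% unchanged."""
    rng = rng or random.Random(0)
    vocab = sorted(set(tokens))  # stand-in for a real tokenizer vocabulary
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # the model must recover the original token
            r = rng.random()
            if r < 0.8:
                masked.append(mask_token)
            elif r < 0.9:
                masked.append(rng.choice(vocab))
            else:
                masked.append(tok)
        else:
            labels.append(None)  # position is not a prediction target
            masked.append(tok)
    return masked, labels

# Hypothetical in-domain (event-rich) sentence for illustration.
tokens = "an earthquake triggered a tsunami near the coast".split()
masked, labels = mask_tokens(tokens, rng=random.Random(42))
```

Running this masking over an event-rich corpus and training the model to recover the masked tokens is what "injects" event knowledge before the trigger/argument extraction heads are fine-tuned on ACE2005.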