
Article Information

  • Title: Extractive Summarization with Very Deep Pretrained Language Model
  • Authors: Yang Gu; Yanke Hu
  • Journal: International Journal of Artificial Intelligence & Applications (IJAIA)
  • Print ISSN: 0976-2191
  • Online ISSN: 0975-900X
  • Year: 2019
  • Volume: 10
  • Issue: 2
  • Pages: 1-6
  • DOI: 10.5121/ijaia.2019.10203
  • Publisher: Academy & Industry Research Collaboration Center (AIRCC)
  • Abstract: The recent development of generative pretrained language models has proven very successful on a wide range of NLP tasks, such as text classification, question answering, textual entailment and so on. In this work, we present a two-phase encoder-decoder architecture based on Bidirectional Encoder Representations from Transformers (BERT) for the extractive summarization task. We evaluated our model with both automatic metrics and human annotators, and demonstrated that the architecture achieves results comparable to the state of the art on the large-scale CNN/Daily Mail corpus. To the best of our knowledge, this is the first work that applies a BERT-based architecture to a text summarization task and achieves results comparable to the state of the art. (An illustrative sketch of BERT-based sentence extraction follows this record.)
  • Keywords: BERT; AI; Deep Learning; Summarization
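
The abstract describes a two-phase BERT-based encoder-decoder for extractive summarization, but the record does not reproduce the paper's architecture. The sketch below is a minimal, hypothetical illustration of the general idea of BERT-based sentence extraction only, assuming the Hugging Face transformers library and PyTorch are available; the similarity-to-document scoring heuristic is a stand-in for illustration, not the authors' method.

    # Hypothetical sketch of BERT-based extractive summarization
    # (illustrative only; not the paper's two-phase architecture).
    import torch
    import torch.nn.functional as F
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")
    model.eval()

    def embed(texts):
        # Encode texts with BERT and mean-pool the final hidden states
        # into one fixed-size vector per input text.
        batch = tokenizer(texts, padding=True, truncation=True,
                          return_tensors="pt")
        with torch.no_grad():
            hidden = model(**batch).last_hidden_state   # (B, T, H)
        mask = batch["attention_mask"].unsqueeze(-1)    # (B, T, 1)
        return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

    def extract_summary(sentences, k=3):
        # Score each sentence by cosine similarity to an embedding of
        # the whole document, then keep the top-k in original order.
        sent_vecs = embed(sentences)                    # (N, H)
        doc_vec = embed([" ".join(sentences)])          # (1, H)
        scores = F.cosine_similarity(sent_vecs, doc_vec)
        top = scores.topk(min(k, len(sentences))).indices.sort().values
        return [sentences[int(i)] for i in top]

For example, extract_summary(list_of_sentences, k=3) returns the three sentences judged most representative of the document, preserving their original order, which is the typical output format for an extractive summarizer.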