
Article Information

  • Title: Skeleton to Abstraction: An Attentive Information Extraction Schema for Enhancing the Saliency of Text Summarization
  • Authors: Xiujuan Xiang; Guangluan Xu; Xingyu Fu
  • Journal: Information
  • Electronic ISSN: 2078-2489
  • Year: 2018
  • Volume: 9
  • Issue: 9
  • Pages: 217-235
  • DOI: 10.3390/info9090217
  • Publisher: MDPI Publishing
  • Abstract: Current popular abstractive summarization is based on an attentional encoder-decoder framework. In this architecture, the decoder generates the summary conditioned on the full source text, so it is often distracted by irrelevant information, which causes the generated summaries to suffer from low saliency. Moreover, observing how people write summaries, we find that they work from the necessary information rather than the full text. Therefore, to enhance the saliency of abstractive summarization, we propose an attentive information extraction model. It consists of a multi-layer perceptron (MLP) gated unit that focuses attention on the important information in the source text, and a similarity module that encourages high similarity between the reference summary and that important information. Before the summary decoder runs, the MLP gate and the similarity module work together to extract the important information, yielding a skeleton of the source text for the decoder. This effectively reduces the interference of irrelevant information, thereby improving the saliency of the summary. Our proposed model was tested on the CNN/Daily Mail and DUC-2004 datasets, achieving a 42.01 ROUGE-1 F-score and a 33.94 ROUGE-1 recall, respectively, outperforming state-of-the-art abstractive models on the same datasets. In addition, a subjective human evaluation showed that the saliency of the generated summaries was further enhanced. (An illustrative sketch of the gating and similarity components follows the keyword list below.)
  • Keywords: recurrent neural network (RNN); abstractive text summarization; information extraction; attention mechanism; semantic relevance; saliency of summarization
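
The abstract describes two components: an MLP gated unit that weights encoder states by their importance, and a similarity module that pulls the extracted information toward the reference summary. The following is a minimal PyTorch sketch of that idea, not the authors' implementation; the names `MLPGate` and `similarity_loss`, the sigmoid gating, the mean pooling, and the use of cosine similarity as the similarity measure are all assumptions made for illustration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MLPGate(nn.Module):
    """Hypothetical MLP gated unit: scores each encoder state and scales it
    by its importance, approximating the paper's attentive information
    extraction step that produces the 'skeleton' of the source text."""
    def __init__(self, hidden_dim: int):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(hidden_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, enc_states: torch.Tensor) -> torch.Tensor:
        # enc_states: (batch, src_len, hidden_dim)
        gate = torch.sigmoid(self.mlp(enc_states))  # importance in (0, 1)
        return gate * enc_states                    # gated skeleton states

def similarity_loss(skeleton: torch.Tensor, ref_summary: torch.Tensor) -> torch.Tensor:
    """Assumed similarity module: encourage high cosine similarity between
    the pooled extracted information and the pooled reference summary."""
    s = skeleton.mean(dim=1)      # (batch, hidden_dim)
    r = ref_summary.mean(dim=1)   # (batch, hidden_dim)
    return (1.0 - F.cosine_similarity(s, r, dim=-1)).mean()

# Usage with random tensors standing in for RNN encoder outputs.
enc = torch.randn(2, 30, 256)     # encoded source text
ref = torch.randn(2, 10, 256)     # encoded reference summary
gate = MLPGate(256)
skeleton = gate(enc)              # would be fed to the attentional decoder
loss = similarity_loss(skeleton, ref)
```

In this reading, the gate filters the encoder states before the decoder attends to them, and the similarity term is added to the training loss so the retained states resemble the reference summary; both choices are interpretations of the abstract rather than details confirmed by it.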