
Article Information

  • Title: MTL-DAS: Automatic Text Summarization for Domain Adaptation
  • Authors: Jiang Zhong; Zhiying Wang
  • Journal: Computational Intelligence and Neuroscience
  • Print ISSN: 1687-5265
  • Electronic ISSN: 1687-5273
  • Publication year: 2022
  • Volume: 2022
  • DOI: 10.1155/2022/4851828
  • Language: English
  • Publisher: Hindawi Publishing Corporation
  • Abstract: Domain adaptation for text summarization is challenging because of the lack of annotated data in the target domain. Previous methods focused on introducing target-domain knowledge and shifting the model toward the target domain. However, they mostly studied adaptation to a single low-resource domain, which limits their practicality. In this paper, we propose MTL-DAS, a unified model for multidomain adaptive text summarization, which stands for Multitask Learning for Multidomain Adaptation Summarization. Building on BART, we investigate a multitask learning method to enhance generalization ability across multiple domains. We transfer the ability to detect summary-worthy content from the source domain and acquire the knowledge and generation style of the target domains through a text reconstruction task and a text classification task. We evaluate domain adaptation ability on the AdaptSum dataset, which covers six domains in low-resource scenarios. The experiments show that the unified model not only outperforms separately trained models but also takes less time and requires fewer computational resources.
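  • Note: the abstract only outlines the multitask setup (a shared BART backbone trained jointly on summarization, text reconstruction, and text classification across domains). The snippet below is a minimal sketch of that idea using a HuggingFace-style BART model; the head design, loss weighting, and domain count are illustrative assumptions, not the authors' released code.

    # Sketch of a unified multitask model: shared BART for seq2seq tasks
    # (summarization / reconstruction) plus an assumed domain-classification head.
    import torch
    import torch.nn as nn
    from transformers import BartForConditionalGeneration, BartTokenizer

    class MultitaskBart(nn.Module):
        def __init__(self, model_name="facebook/bart-base", num_domains=6):
            super().__init__()
            # Shared backbone handles both summarization and text reconstruction.
            self.bart = BartForConditionalGeneration.from_pretrained(model_name)
            hidden = self.bart.config.d_model
            # Domain-classification head over encoder states (assumed design).
            self.domain_head = nn.Linear(hidden, num_domains)

        def forward(self, input_ids, attention_mask, labels=None, domain_labels=None):
            # Seq2seq loss: summarization or reconstruction, depending on what
            # `labels` holds for this batch.
            out = self.bart(input_ids=input_ids,
                            attention_mask=attention_mask,
                            labels=labels)
            loss = out.loss if labels is not None else 0.0
            if domain_labels is not None:
                # Mean-pool encoder states and classify the domain.
                enc = out.encoder_last_hidden_state.mean(dim=1)
                cls_loss = nn.functional.cross_entropy(self.domain_head(enc), domain_labels)
                loss = loss + cls_loss  # equal task weighting is an assumption
            return loss

    # Usage: batches from different tasks/domains are interleaved under one optimizer.
    tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
    model = MultitaskBart()
    batch = tokenizer(["an example document ..."], return_tensors="pt", padding=True)
    targets = tokenizer(["a short summary ..."], return_tensors="pt", padding=True)
    loss = model(batch.input_ids, batch.attention_mask,
                 labels=targets.input_ids,
                 domain_labels=torch.tensor([0]))
    loss.backward()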