Basic Article Information

  • Title: Alternating Recurrent Dialog Model with Large-scale Pre-trained Language Models
  • Authors: Qingyang Wu ; Yichi Zhang ; Yu Li
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 1292-1301
  • DOI: 10.18653/v1/2021.eacl-main.110
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Existing dialog system models require extensive human annotations and are difficult to generalize to different tasks. The recent success of large pre-trained language models such as BERT and GPT-2 (Devlin et al., 2019; Radford et al., 2019) has suggested the effectiveness of incorporating language priors in downstream NLP tasks. However, how much pre-trained language models can help dialog response generation is still under exploration. In this paper, we propose a simple, general, and effective framework: Alternating Recurrent Dialog Model (ARDM). ARDM models each speaker separately and takes advantage of large pre-trained language models. It requires no supervision from human annotations such as belief states or dialog acts to achieve effective conversations. ARDM outperforms or is on par with state-of-the-art methods on two popular task-oriented dialog datasets: CamRest676 and MultiWOZ. Moreover, we can generalize ARDM to more challenging, non-collaborative tasks such as persuasion. In persuasion tasks, ARDM is capable of generating human-like responses to persuade people to donate to a charity.
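The abstract describes modeling each speaker with its own pre-trained language model and alternating between them across turns. Below is a minimal sketch of that alternating-speaker idea using Hugging Face Transformers; the checkpoint names, prompt format, and decoding settings are illustrative assumptions, not the paper's exact configuration (in practice each model would be fine-tuned on that speaker's turns).

```python
# Sketch: alternate two GPT-2 models, one per speaker, conditioning each
# turn on the full dialog history. Hypothetical prompt format and settings.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")

# One language model per conversational role (assumed fine-tuned separately).
user_model = GPT2LMHeadModel.from_pretrained("gpt2")
system_model = GPT2LMHeadModel.from_pretrained("gpt2")


def generate_turn(model, history, max_new_tokens=40):
    """Generate the next turn conditioned on the dialog history so far."""
    inputs = tokenizer(history, return_tensors="pt")
    output = model.generate(
        **inputs,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        top_p=0.9,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Return only the newly generated tokens, not the echoed history.
    return tokenizer.decode(
        output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


history = "User: I need a cheap restaurant in the city centre.\n"
for _ in range(2):  # alternate the two speaker models for a few turns
    history += "System: " + generate_turn(system_model, history) + "\n"
    history += "User: " + generate_turn(user_model, history) + "\n"
print(history)
```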