
Article Information

  • Title: Pre-Reordering for Neural Machine Translation: Helpful or Harmful?
  • Authors: Jinhua Du; Andy Way
  • Journal: The Prague Bulletin of Mathematical Linguistics
  • Print ISSN: 0032-6585
  • Electronic ISSN: 1804-0462
  • Year: 2017
  • Volume: 108
  • Issue: 1
  • Pages: 171-182
  • DOI: 10.1515/pralin-2017-0018
  • Language: English
  • Publisher: Walter de Gruyter GmbH
  • Abstract: Pre-reordering, a preprocessing step that makes source-side word order closer to that of the target side, has proven very helpful for statistical machine translation (SMT) in improving translation quality. However, is this also the case for neural machine translation (NMT)? In this paper, we first investigate the impact of pre-reordered source-side data on NMT, and then propose to incorporate the features of the SMT pre-reordering model as input factors into NMT (factored NMT). The features, namely part-of-speech (POS) tags, word classes and reordered indices, are encoded as feature vectors and concatenated to the word embeddings to provide extra knowledge for NMT. Pre-reordering experiments conducted on Japanese↔English and Chinese↔English show that pre-reordering the source-side data for NMT is redundant and that NMT models trained on pre-reordered data degrade translation performance. However, factored NMT using SMT-based pre-reordering features on Japanese→English and Chinese→English is beneficial and improves translation quality by 4.48 and 5.89 relative BLEU points, respectively, compared to the baseline NMT system.
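The factored-input mechanism the abstract describes, where each source token carries extra factors (POS tag, word class, reordered index) whose embeddings are concatenated to the word embedding before entering the encoder, can be illustrated with a minimal sketch. This is not the authors' implementation; all vocabulary sizes, embedding dimensions, and class/module names below are illustrative assumptions, written in PyTorch.

```python
# Minimal sketch (not the paper's code) of factored input embeddings:
# each factor is embedded separately and concatenated to the word embedding.
import torch
import torch.nn as nn

class FactoredInputEmbedding(nn.Module):
    def __init__(self, vocab_size=32000, pos_size=50, class_size=512,
                 max_reorder_index=200, word_dim=512, factor_dim=16):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, word_dim)
        self.pos_emb = nn.Embedding(pos_size, factor_dim)
        self.class_emb = nn.Embedding(class_size, factor_dim)
        self.reorder_emb = nn.Embedding(max_reorder_index, factor_dim)
        # Encoder input dimension after concatenation
        self.output_dim = word_dim + 3 * factor_dim

    def forward(self, words, pos_tags, word_classes, reorder_idx):
        # Each argument: LongTensor of shape (batch, src_len)
        vecs = [self.word_emb(words),
                self.pos_emb(pos_tags),
                self.class_emb(word_classes),
                self.reorder_emb(reorder_idx)]
        # Concatenate along the feature axis -> (batch, src_len, output_dim)
        return torch.cat(vecs, dim=-1)

# Example: a batch of 2 source sentences, 5 tokens each
emb = FactoredInputEmbedding()
B, T = 2, 5
x = emb(torch.randint(0, 32000, (B, T)),
        torch.randint(0, 50, (B, T)),
        torch.randint(0, 512, (B, T)),
        torch.randint(0, 200, (B, T)))
print(x.shape)  # torch.Size([2, 5, 560])
```

The resulting vectors would replace the plain word embeddings at the encoder input, leaving the rest of the NMT architecture unchanged.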