
Basic Article Information

  • Title: Applying the Transformer to Character-level Transduction
  • Authors: Shijie Wu; Ryan Cotterell; Mans Hulden
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 1901-1907
  • DOI: 10.18653/v1/2021.eacl-main.163
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: The transformer has been shown to outperform recurrent neural network-based sequence-to-sequence models in various word-level NLP tasks. Yet for character-level transduction tasks, e.g., morphological inflection generation and historical text normalization, few transformer-based approaches outperform recurrent models. In an empirical study, we uncover that, in contrast to recurrent sequence-to-sequence models, the batch size plays a crucial role in the performance of the transformer on character-level tasks, and we show that with a large enough batch size the transformer does indeed outperform recurrent models. We also introduce a simple technique for handling feature-guided character-level transduction that further improves performance. With these insights, we achieve state-of-the-art performance on morphological inflection and historical text normalization. We also show that the transformer outperforms a strong baseline on two other character-level transduction tasks: grapheme-to-phoneme conversion and transliteration.