
Article Information

  • Title: First Align, then Predict: Understanding the Cross-Lingual Ability of Multilingual BERT
  • Authors: Benjamin Muller; Yanai Elazar; Benoît Sagot
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 2214-2231
  • DOI: 10.18653/v1/2021.eacl-main.189
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Multilingual pretrained language models have demonstrated remarkable zero-shot cross-lingual transfer capabilities. Such transfer emerges by fine-tuning on a task of interest in one language and evaluating on a distinct language not seen during fine-tuning. Despite promising results, we still lack a proper understanding of the source of this transfer. Using a novel layer ablation technique and analyses of the model's internal representations, we show that multilingual BERT, a popular multilingual language model, can be viewed as the stacking of two sub-networks: a multilingual encoder followed by a task-specific, language-agnostic predictor. While the encoder is crucial for cross-lingual transfer and remains mostly unchanged during fine-tuning, the task predictor has little influence on the transfer and can be reinitialized during fine-tuning. We present extensive experiments with three distinct tasks, seventeen typologically diverse languages, and multiple domains to support our hypothesis.
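The layer-ablation idea described in the abstract — replacing the weights of selected layers with a fresh random initialization while leaving the rest of the model intact, then re-evaluating — can be illustrated with a minimal toy sketch. This is not the authors' code: the model here is just a list of flat weight vectors, and all names (`init_layer`, `ablate_layers`) are invented for illustration.

```python
import random

def init_layer(size, seed=None):
    """Create a toy 'layer': a flat vector of small random weights."""
    rng = random.Random(seed)
    return [rng.gauss(0.0, 0.02) for _ in range(size)]

def ablate_layers(model_layers, layers_to_reset, seed=0):
    """Return a copy of the model in which the selected layers'
    weights are replaced by fresh random initialization, leaving
    all other layers untouched (the layer-ablation idea)."""
    rng = random.Random(seed)
    ablated = []
    for i, layer in enumerate(model_layers):
        if i in layers_to_reset:
            # reinitialize this layer's weights from scratch
            ablated.append([rng.gauss(0.0, 0.02) for _ in layer])
        else:
            # keep the pretrained/fine-tuned weights as-is
            ablated.append(list(layer))
    return ablated

# toy 12-layer "model" (mBERT-base also has 12 transformer layers)
model = [init_layer(8, seed=i) for i in range(12)]

# reinitialize the top four layers -- roughly the part the paper
# characterizes as the task-specific, language-agnostic predictor
ablated = ablate_layers(model, layers_to_reset={8, 9, 10, 11})
```

In the paper's setting, one would fine-tune and evaluate the ablated model to measure how much each layer matters for cross-lingual transfer; the sketch only shows the weight-reset step itself.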