
Article information

  • Title: On the evolution of syntactic information encoded by BERT's contextualized representations
  • Authors: Laura Pérez-Mayos; Roberto Carlini; Miguel Ballesteros
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 2243-2258
  • DOI:10.18653/v1/2021.eacl-main.191
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: The adaptation of pretrained language models to solve supervised tasks has become a baseline in NLP, and many recent works have focused on studying how linguistic information is encoded in pretrained sentence representations. Among other findings, it has been shown that entire syntax trees are implicitly embedded in the geometry of such models. As these models are often fine-tuned, it becomes increasingly important to understand how the encoded knowledge evolves during fine-tuning. In this paper, we analyze the evolution of the embedded syntax trees over the course of fine-tuning BERT for six different tasks, covering all levels of linguistic structure. Experimental results show that, depending on the task, the encoded syntactic information is forgotten (PoS tagging), reinforced (dependency and constituency parsing), or preserved (semantics-related tasks) in different ways during fine-tuning.
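For context, the "syntax trees implicitly embedded in the geometry" mentioned in the abstract refers to structural probing in the style of Hewitt & Manning (2019), where a learned linear map turns distances between contextualized token vectors into approximate parse-tree distances. Below is a minimal, hypothetical sketch of such a distance probe in Python/PyTorch; the class name, dimensions, and loss normalization are illustrative assumptions, not the paper's exact implementation.

    import torch
    import torch.nn as nn

    class StructuralDistanceProbe(nn.Module):
        # Learns a low-rank linear map B so that ||B h_i - B h_j||^2
        # approximates the distance between words i and j in the parse tree.
        def __init__(self, model_dim: int = 768, probe_rank: int = 128):
            super().__init__()
            self.B = nn.Parameter(torch.randn(model_dim, probe_rank) * 0.01)

        def forward(self, h: torch.Tensor) -> torch.Tensor:
            # h: (batch, seq_len, model_dim) contextualized token vectors
            t = h @ self.B                          # (batch, seq_len, rank)
            diff = t.unsqueeze(2) - t.unsqueeze(1)  # all pairwise differences
            return diff.pow(2).sum(-1)              # predicted squared tree distances

    def probe_loss(pred: torch.Tensor, gold: torch.Tensor,
                   lengths: torch.Tensor) -> torch.Tensor:
        # L1 loss between predicted and gold tree distances, normalized by
        # squared sentence length (an assumed convention from the probe literature).
        per_sentence = (pred - gold).abs().sum(dim=(1, 2)) / lengths.float() ** 2
        return per_sentence.mean()

Training such a probe on representations taken from successive fine-tuning checkpoints, and comparing how well gold tree distances are recovered at each checkpoint, is the kind of measurement the abstract summarizes as syntactic information being forgotten, reinforced, or preserved.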