
Basic Article Information

  • Title: What Do Recurrent Neural Network Grammars Learn About Syntax?
  • Local full text: Download
  • Authors: Adhiguna Kuncoro ; Miguel Ballesteros ; Lingpeng Kong
  • Journal name: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year of publication: 2017
  • Volume: 2017
  • Pages: 1249-1258
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Recurrent neural network grammars (RNNG) are a recently proposed probabilistic generative modeling family for natural language. They show state-of-the-art language modeling and parsing performance. We investigate what information they learn, from a linguistic perspective, through various ablations to the model and the data, and by augmenting the model with an attention mechanism (GA-RNNG) to enable closer inspection. We find that explicit modeling of composition is crucial for achieving the best performance. Through the attention mechanism, we find that headedness plays a central role in phrasal representation (with the model's latent attention largely agreeing with predictions made by hand-crafted head rules, albeit with some important differences). By training grammars without nonterminal labels, we find that phrasal representations depend minimally on nonterminals, providing support for the endocentricity hypothesis.
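
The abstract's claims about headedness rest on inspecting the attention weights inside the GA-RNNG composition function, which builds a phrase vector from its children and the nonterminal label. Below is a minimal numpy sketch of that idea: attention over child vectors keyed on the nonterminal embedding, followed by a gated blend of the two. All parameter names, shapes, and the exact gating form are illustrative assumptions for exposition, not the authors' released implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

d = 8  # embedding size; illustrative only

# Hypothetical parameters (learned in the real model, random here):
V = rng.normal(size=(d, d))    # bilinear form scoring each child against the nonterminal
W_t = rng.normal(size=(d, d))  # gate projection for the nonterminal embedding
W_m = rng.normal(size=(d, d))  # gate projection for the attention summary

def ga_compose(nt, children):
    """Compose child vectors into a phrase vector via attention keyed on the
    nonterminal embedding, then gate between the two (GA-RNNG-style sketch)."""
    scores = np.array([c @ V @ nt for c in children])  # one scalar score per child
    a = softmax(scores)                                # attention over children
    m = sum(w * c for w, c in zip(a, children))        # weighted child summary
    g = sigmoid(W_t @ nt + W_m @ m)                    # elementwise gate
    return g * nt + (1.0 - g) * m                      # composed phrase vector

# Example: an NP with three children. In the paper's analysis, the weights `a`
# are what get compared against hand-crafted head rules; a child that draws
# most of the attention mass behaves like the phrase's head.
nt = rng.normal(size=d)
children = [rng.normal(size=d) for _ in range(3)]
print(ga_compose(nt, children).shape)  # -> (8,)
```

Because the composed vector is dominated by whichever child attracts the attention mass, peaking attention on one child is what makes the learned representations look endocentric, which is the connection the abstract draws to training without nonterminal labels.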