
Article Information

  • Title: Neural Language Models vs. Wordnet-based Semantically Enriched Representation in CST Relation Recognition
  • Authors: Arkadiusz Janz; Maciej Piasecki; Piotr Wątorski
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 223-233
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Neural language models, including transformer-based models pre-trained on very large corpora, have become a common way to represent text in various tasks, including the recognition of textual semantic relations, e.g. those of Cross-document Structure Theory (CST). Pre-trained models are usually fine-tuned to downstream tasks, and the resulting vectors are used as input to deep neural classifiers; no linguistic knowledge obtained from resources and tools is utilised. In this paper we compare such universal approaches with a combination of a rich, graph-based, linguistically motivated sentence representation and a typical neural network classifier, applied to the task of CST relation recognition in Polish. The representation describes selected levels of sentence structure, including a description of lexical meanings based on wordnet (plWordNet) synsets and connected SUMO concepts. The obtained results show that, for difficult relations and a medium-sized training corpus, the semantically enriched text representation leads to significantly better results.
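
To make the "universal" baseline concrete, below is a minimal sketch (not the authors' code) of the approach the abstract contrasts against: a pre-trained transformer fine-tuned as a sentence-pair classifier over CST relations. The model name (allegro/herbert-base-cased, a Polish BERT-style encoder) and the CST label subset are illustrative assumptions; the paper's record here does not specify them.

```python
# Sketch of the transformer baseline for CST relation recognition as
# sentence-pair classification. Model name and label set are assumptions,
# not taken from the paper.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "allegro/herbert-base-cased"  # assumed Polish encoder; any BERT-like model fits
CST_LABELS = ["identity", "subsumption", "overlap", "no_relation"]  # hypothetical subset

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# The classification head is randomly initialised; in practice the whole
# model is fine-tuned on annotated CST sentence pairs before use.
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME, num_labels=len(CST_LABELS)
)

def classify_pair(sentence_a: str, sentence_b: str) -> str:
    """Encode two sentences jointly and predict the CST relation between them."""
    inputs = tokenizer(sentence_a, sentence_b, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return CST_LABELS[int(logits.argmax(dim=-1))]
```

The semantically enriched alternative studied in the paper instead feeds a graph-based sentence representation, built from plWordNet synsets and linked SUMO concepts, into a conventional neural classifier rather than relying on the transformer's vectors alone.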