Article Information

  • Title: Ask2Transformers: Zero-Shot Domain Labelling with Pretrained Language Models
  • Authors: Oscar Sainz; German Rigau
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 44-52
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: In this paper we present a system that exploits different pre-trained Language Models for assigning domain labels to WordNet synsets without any kind of supervision. Furthermore, the system is not restricted to a particular set of domain labels. We exploit the knowledge encoded within different off-the-shelf pre-trained Language Models and task formulations to infer the domain label of a particular WordNet definition. The proposed zero-shot system achieves a new state of the art on the English dataset used in the evaluation.
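The abstract describes a zero-shot setup in which an off-the-shelf pretrained Language Model scores candidate domain labels against a WordNet definition, with no training on the target label set. The sketch below illustrates one such formulation (NLI-style entailment) using the Hugging Face transformers zero-shot-classification pipeline; the model name, candidate domain labels, and hypothesis template are illustrative assumptions, not the exact configuration reported in the paper.

```python
# A minimal sketch of zero-shot domain labelling for a WordNet-style definition,
# using the Hugging Face zero-shot-classification pipeline (NLI-based).
# The model, candidate labels, and hypothesis template below are assumptions
# for illustration, not the paper's exact setup.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="roberta-large-mnli",  # assumed off-the-shelf NLI model
)

# Example WordNet-style gloss (definition) to label.
definition = "the branch of medicine dealing with the heart and its diseases"

# Illustrative domain label set; the abstract stresses that the system is not
# tied to any particular label inventory.
domain_labels = ["medicine", "sport", "economy", "politics", "art"]

result = classifier(
    definition,
    candidate_labels=domain_labels,
    hypothesis_template="The domain of the sentence is about {}.",
)

# Labels are returned sorted by score; the top one is the predicted domain.
print(result["labels"][0], result["scores"][0])
```

Because the candidate labels are supplied only at inference time, swapping in a different domain inventory requires no retraining, which is the property the abstract highlights.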