
Article Information

  • Title: Language Models as Knowledge Bases: On Entity Representations, Storage Capacity, and Paraphrased Queries
  • Authors: Benjamin Heinzerling; Kentaro Inui
  • Venue: Conference of the European Chapter of the Association for Computational Linguistics (EACL)
  • Year: 2021
  • Volume: 2021
  • Pages: 1772–1791
  • DOI: 10.18653/v1/2021.eacl-main.153
  • Language: English
  • Publisher: ACL Anthology
  • Abstract: Pretrained language models have been suggested as a possible alternative or complement to structured knowledge bases. However, this emerging LM-as-KB paradigm has so far only been considered in a very limited setting, which allows handling only 21k entities whose names are found in common LM vocabularies. Furthermore, a major benefit of this paradigm, i.e., querying the KB using natural language paraphrases, is underexplored. Here we formulate two basic requirements for treating LMs as KBs: (i) the ability to store a large number of facts involving a large number of entities and (ii) the ability to query stored facts. We explore three entity representations that allow LMs to handle millions of entities and present a detailed case study on paraphrased querying of facts stored in LMs, thereby providing a proof of concept that language models can indeed serve as knowledge bases.