
Article Information

  • Title: Understanding Negative Sampling in Knowledge Graph Embedding
  • Authors: Jing Qian; Gangmin Li; Katie Atkinson
  • Journal: International Journal of Artificial Intelligence & Applications (IJAIA)
  • Print ISSN: 0976-2191
  • Electronic ISSN: 0975-900X
  • Year: 2021
  • Volume: 12
  • Issue: 1
  • Pages: 71-81
  • DOI: 10.5121/ijaia.2021.12105
  • Publisher: Academy & Industry Research Collaboration Center (AIRCC)
  • Abstract: Knowledge graph embedding (KGE) projects the entities and relations of a knowledge graph (KG) into a low-dimensional vector space, and has made steady progress in recent years. Conventional KGE methods, especially translational distance-based models, are trained by discriminating positive samples from negative ones. Since most KGs store only positive samples for space efficiency, negative sampling plays a crucial role in encoding the triples of a KG. The quality of the generated negative samples directly affects the performance of the learnt knowledge representation in a myriad of downstream tasks, such as recommendation, link prediction and node classification. We summarize current negative sampling approaches in KGE into three categories: static distribution-based, dynamic distribution-based and custom cluster-based. Based on this categorization, we discuss the most prevalent existing approaches and their characteristics. We hope that this review can provide guidelines for new thinking about negative sampling in KGE.
  • Keywords: Negative Sampling; Knowledge Graph Embedding; Generative Adversarial Network
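
The abstract describes training translational distance-based KGE models by discriminating positive triples from negative ones, with negatives generated by corrupting stored positive triples. Below is a minimal sketch of the simplest, static distribution-based strategy (uniform corruption of the head or tail entity); the example entities and the corrupt_triple helper are illustrative assumptions, not code from the paper.

    import random

    def corrupt_triple(triple, entities, known_triples):
        """Uniformly replace the head or tail entity to build a negative sample."""
        h, r, t = triple
        while True:
            e = random.choice(entities)
            # Corrupt the head or the tail with equal probability (static uniform distribution).
            candidate = (e, r, t) if random.random() < 0.5 else (h, r, e)
            # Discard candidates that are actually stored positive triples.
            if candidate not in known_triples:
                return candidate

    # Hypothetical KG fragment: only positive triples are stored.
    positives = {("Paris", "capital_of", "France"), ("Berlin", "capital_of", "Germany")}
    entities = ["Paris", "France", "Berlin", "Germany", "Madrid"]

    print(corrupt_triple(("Paris", "capital_of", "France"), entities, positives))

Dynamic distribution-based and custom cluster-based approaches surveyed in the paper replace the uniform choice of the corrupting entity with adaptive or cluster-restricted sampling, but follow the same corrupt-and-filter pattern.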