NLP

[KG-BERT Paper Review] KG-BERT: BERT for Knowledge Graph Completion

코딩무민 · 2022-05-18 16:00

1. Key Summary

  • Applies pre-trained LMs to knowledge graph completion (KGC) tasks
  • Treats the triples in a knowledge graph as textual sequences → proposes a new framework, KG-BERT, to model these triples
    • Input: the entity and relation descriptions of a triple → computes a scoring function for the triple (see the sketch after this list)
    • Result: achieves SOTA on the triple classification, link prediction, and relation prediction tasks
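
To make the input/scoring step concrete, here is a minimal sketch of KG-BERT-style triple scoring written against the Hugging Face transformers API, not the authors' released code. The `score_triple` helper and the example triple are hypothetical, and the paper additionally alternates segment (token type) embeddings between entity and relation tokens, which this sketch omits.

```python
# Minimal sketch of KG-BERT-style triple scoring (illustrative, not the paper's code).
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
# Binary head: label 1 = plausible triple, label 0 = corrupted triple.
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)
model.eval()

def score_triple(head_desc: str, relation_desc: str, tail_desc: str) -> float:
    """Pack a triple as [CLS] head [SEP] relation [SEP] tail [SEP] and
    return the plausibility probability from the classification head."""
    # Writing "[SEP]" in the raw text makes the tokenizer emit the special
    # token; the tokenizer itself adds [CLS] and the final [SEP].
    text = f"{head_desc} [SEP] {relation_desc} [SEP] {tail_desc}"
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=128)
    with torch.no_grad():
        logits = model(**inputs).logits  # shape (1, 2)
    return torch.softmax(logits, dim=-1)[0, 1].item()

# Hypothetical triple; before fine-tuning on positive and corrupted
# triples, the score from the freshly initialized head is meaningless.
print(score_triple("Steve Jobs", "founded", "Apple Inc."))
```

In the paper, this scoring setup is fine-tuned with a cross-entropy loss over positive triples and negatives built by corrupting the head or tail entity.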

 

2. Paper Link

https://arxiv.org/abs/1909.03193

 


 

3. Paper Explanation Link

https://coding-moomin.notion.site/KG-BERT-BERT-for-Knowledge-Graph-Completion-e2df6f2936904ec2a460dc801a5de564

 


 
