Message passing for hyper-relational knowledge graphs

Research output: Contributions to collected editions/works - Article in conference proceedings - Research - peer-review

Authors

  • Mikhail Galkin
  • Priyansh Trivedi
  • Gaurav Maheshwari
  • Ricardo Usbeck
  • Jens Lehmann

Hyper-relational knowledge graphs (KGs), such as Wikidata, allow additional key-value pairs (qualifiers) to be attached to a main triple in order to disambiguate a fact or restrict its validity. In this work, we propose STARE, a message-passing-based graph encoder capable of modeling such hyper-relational KGs. Unlike existing approaches, STARE can encode an arbitrary number of qualifiers along with the main triple while keeping the semantic roles of qualifiers and triples intact. We also demonstrate that existing benchmarks for evaluating link prediction (LP) performance on hyper-relational KGs suffer from fundamental flaws, and therefore develop a new Wikidata-based dataset, WD50K. Our experiments show that a STARE-based LP model outperforms existing approaches across multiple benchmarks. We also confirm that leveraging qualifiers is vital for link prediction, with gains of up to 25 MRR points over triple-based representations.
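
To make the setting concrete, below is a minimal, illustrative Python sketch (not the authors' implementation) of a hyper-relational statement, i.e. a main triple plus qualifier key-value pairs, and of one way qualifier embeddings could be folded into the relation representation before a message is passed from head to tail. The entity and relation names, the embedding dimension, the gamma mixing weight, and the sum and element-wise compositions are all assumptions chosen for illustration.

# Illustrative sketch only: a toy, numpy-based picture of a hyper-relational
# fact (main triple + qualifier key-value pairs) and of folding qualifier
# information into the relation representation before message passing.
# Names, dimensions, gamma, and the aggregation are assumptions, not the
# authors' implementation.
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Hypothetical embedding tables for entities and for relations/qualifier keys.
entities = {name: rng.normal(size=DIM) for name in
            ["Einstein", "University_of_Zurich", "doctorate", "Alfred_Kleiner"]}
relations = {name: rng.normal(size=DIM) for name in
             ["educated_at", "academic_degree", "doctoral_advisor"]}

# A hyper-relational statement: a main triple plus qualifier (key, value) pairs.
statement = {
    "head": "Einstein",
    "relation": "educated_at",
    "tail": "University_of_Zurich",
    "qualifiers": [("academic_degree", "doctorate"),
                   ("doctoral_advisor", "Alfred_Kleiner")],
}

def qualifier_aware_relation(stmt, gamma=0.8):
    """Mix the main relation vector with an aggregate of its qualifiers,
    keeping the triple's relation as the dominant component."""
    r = relations[stmt["relation"]]
    if not stmt["qualifiers"]:
        return r
    q = np.sum([relations[k] * entities[v]      # toy key-value composition
                for k, v in stmt["qualifiers"]], axis=0)
    return gamma * r + (1.0 - gamma) * q        # weighted combination

def message(stmt):
    """One toy message from head to tail: compose the head entity with the
    qualifier-aware relation (here by element-wise product)."""
    return entities[stmt["head"]] * qualifier_aware_relation(stmt)

print(message(statement).round(3))

The weighted combination keeps the main relation as the dominant component while still letting qualifiers influence the message, which loosely mirrors the abstract's point that the semantic roles of triples and qualifiers are kept distinct.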

Original language: English
Title of host publication: EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
Editors: Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Number of pages: 14
Publisher: Association for Computational Linguistics (ACL)
Publication date: 01.01.2020
Pages: 7346-7359
ISBN (electronic): 9781952148606
Publication status: Published - 01.01.2020
Externally published: Yes
Event: 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020 - Virtual, Online
Duration: 16.11.2020 - 20.11.2020
https://2020.emnlp.org

Bibliographical note

Publisher Copyright:
© 2020 Association for Computational Linguistics.