Message passing for hyper-relational knowledge graphs

Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review

Standard

Message passing for hyper-relational knowledge graphs. / Galkin, Mikhail; Trivedi, Priyansh; Maheshwari, Gaurav et al.

EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference. ed. / Bonnie Webber; Trevor Cohn; Yulan He; Yang Liu. Association for Computational Linguistics (ACL), 2020. p. 7346-7359 (EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference).

Harvard

Galkin, M, Trivedi, P, Maheshwari, G, Usbeck, R & Lehmann, J 2020, Message passing for hyper-relational knowledge graphs. in B Webber, T Cohn, Y He & Y Liu (eds), EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference. EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference, Association for Computational Linguistics (ACL), pp. 7346-7359, 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020, Virtual, Online, 16.11.20. https://doi.org/10.48550/arXiv.2009.10847, https://doi.org/10.18653/v1/2020.emnlp-main.596

APA

Galkin, M., Trivedi, P., Maheshwari, G., Usbeck, R., & Lehmann, J. (2020). Message passing for hyper-relational knowledge graphs. In B. Webber, T. Cohn, Y. He, & Y. Liu (Eds.), EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference (pp. 7346-7359). (EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference). Association for Computational Linguistics (ACL). https://doi.org/10.48550/arXiv.2009.10847, https://doi.org/10.18653/v1/2020.emnlp-main.596

Vancouver

Galkin M, Trivedi P, Maheshwari G, Usbeck R, Lehmann J. Message passing for hyper-relational knowledge graphs. In Webber B, Cohn T, He Y, Liu Y, editors, EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference. Association for Computational Linguistics (ACL). 2020. p. 7346-7359. (EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference). doi: 10.48550/arXiv.2009.10847, 10.18653/v1/2020.emnlp-main.596

Bibtex

@inproceedings{f7459fa44c2b4d81b74314aae2d669b4,
title = "Message passing for hyper-relational knowledge graphs",
abstract = "Hyper-relational knowledge graphs (KGs) (e.g., Wikidata) enable associating additional key-value pairs along with the main triple to disambiguate, or restrict the validity of, a fact. In this work, we propose a message passing based graph encoder - StarE - capable of modeling such hyper-relational KGs. Unlike existing approaches, StarE can encode an arbitrary number of additional pieces of information (qualifiers) along with the main triple while keeping the semantic roles of qualifiers and triples intact. We also demonstrate that existing benchmarks for evaluating link prediction (LP) performance on hyper-relational KGs suffer from fundamental flaws and thus develop a new Wikidata-based dataset - WD50K. Our experiments demonstrate that a StarE-based LP model outperforms existing approaches across multiple benchmarks. We also confirm that leveraging qualifiers is vital for link prediction, with gains of up to 25 MRR points compared to triple-based representations.",
keywords = "Informatics, Business informatics",
author = "Mikhail Galkin and Priyansh Trivedi and Gaurav Maheshwari and Ricardo Usbeck and Jens Lehmann",
note = "Funding Information: We thank the Center for Information Services and High Performance Computing (ZIH) at TU Dresden for generous allocations of computer time. We acknowledge the support of the following projects: SPEAKER (FKZ 01MK20011A), JOSEPH (Fraunhofer Zukunftsstiftung), H2020 Cleopatra (GA 812997), ML2R (FKZ 01IS18038 A/B/C), ML-win (01IS18050 D/F), ScaDS (01IS18026A), TAILOR (GA 952215). Publisher Copyright: {\textcopyright} 2020 Association for Computational Linguistics.; 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020; Conference date: 16-11-2020 through 20-11-2020",
year = "2020",
month = jan,
day = "1",
doi = "10.48550/arXiv.2009.10847",
language = "English",
series = "EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference",
publisher = "Association for Computational Linguistics (ACL)",
pages = "7346--7359",
editor = "Bonnie Webber and Trevor Cohn and Yulan He and Yang Liu",
booktitle = "EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference",
address = "United States",
url = "https://2020.emnlp.org",

}

RIS

TY - CONF

T1 - Message passing for hyper-relational knowledge graphs

AU - Galkin, Mikhail

AU - Trivedi, Priyansh

AU - Maheshwari, Gaurav

AU - Usbeck, Ricardo

AU - Lehmann, Jens

N1 - Funding Information: We thank the Center for Information Services and High Performance Computing (ZIH) at TU Dresden for generous allocations of computer time. We acknowledge the support of the following projects: SPEAKER (FKZ 01MK20011A), JOSEPH (Fraunhofer Zukunftsstiftung), H2020 Cleopatra (GA 812997), ML2R (FKZ 01IS18038 A/B/C), ML-win (01IS18050 D/F), ScaDS (01IS18026A), TAILOR (GA 952215). Publisher Copyright: © 2020 Association for Computational Linguistics.

PY - 2020/1/1

Y1 - 2020/1/1

N2 - Hyper-relational knowledge graphs (KGs) (e.g., Wikidata) enable associating additional key-value pairs along with the main triple to disambiguate, or restrict the validity of, a fact. In this work, we propose a message passing based graph encoder - StarE - capable of modeling such hyper-relational KGs. Unlike existing approaches, StarE can encode an arbitrary number of additional pieces of information (qualifiers) along with the main triple while keeping the semantic roles of qualifiers and triples intact. We also demonstrate that existing benchmarks for evaluating link prediction (LP) performance on hyper-relational KGs suffer from fundamental flaws and thus develop a new Wikidata-based dataset - WD50K. Our experiments demonstrate that a StarE-based LP model outperforms existing approaches across multiple benchmarks. We also confirm that leveraging qualifiers is vital for link prediction, with gains of up to 25 MRR points compared to triple-based representations.

AB - Hyper-relational knowledge graphs (KGs) (e.g., Wikidata) enable associating additional key-value pairs along with the main triple to disambiguate, or restrict the validity of, a fact. In this work, we propose a message passing based graph encoder - StarE - capable of modeling such hyper-relational KGs. Unlike existing approaches, StarE can encode an arbitrary number of additional pieces of information (qualifiers) along with the main triple while keeping the semantic roles of qualifiers and triples intact. We also demonstrate that existing benchmarks for evaluating link prediction (LP) performance on hyper-relational KGs suffer from fundamental flaws and thus develop a new Wikidata-based dataset - WD50K. Our experiments demonstrate that a StarE-based LP model outperforms existing approaches across multiple benchmarks. We also confirm that leveraging qualifiers is vital for link prediction, with gains of up to 25 MRR points compared to triple-based representations.

KW - Informatics

KW - Business informatics

UR - http://www.scopus.com/inward/record.url?scp=85106090678&partnerID=8YFLogxK

UR - https://www.mendeley.com/catalogue/bfb53193-4630-3af2-97d6-f3834fa3d874/

U2 - 10.48550/arXiv.2009.10847

DO - 10.48550/arXiv.2009.10847

M3 - Article in conference proceedings

AN - SCOPUS:85106090678

T3 - EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference

SP - 7346

EP - 7359

BT - EMNLP 2020 - 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference

A2 - Webber, Bonnie

A2 - Cohn, Trevor

A2 - He, Yulan

A2 - Liu, Yang

PB - Association for Computational Linguistics (ACL)

T2 - 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020

Y2 - 16 November 2020 through 20 November 2020

ER -