GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering

Publication: Contributions to collected editions › Articles in conference proceedings › Research › peer-reviewed

Standard

GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering. / Banerjee, Debayan; Nair, Pranav Ajit; Usbeck, Ricardo et al.
The Semantic Web - 20th International Conference, ESWC 2023, Proceedings. Ed. / Catia Pesquita; Daniel Faria; Ernesto Jimenez-Ruiz; Jamie McCusker; Mauro Dragoni; Anastasia Dimou; Raphael Troncy; Sven Hertling. Springer Science and Business Media Deutschland GmbH, 2023. pp. 279-297 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13870 LNCS).


Harvard

Banerjee, D, Nair, PA, Usbeck, R & Biemann, C 2023, GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering. in C Pesquita, D Faria, E Jimenez-Ruiz, J McCusker, M Dragoni, A Dimou, R Troncy & S Hertling (eds), The Semantic Web - 20th International Conference, ESWC 2023, Proceedings. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13870 LNCS, Springer Science and Business Media Deutschland GmbH, pp. 279-297, 20th International Conference on The Semantic Web - ESWC 2023, Hersonissos, Greece, 28/05/23. https://doi.org/10.48550/arXiv.2303.13284, https://doi.org/10.1007/978-3-031-33455-9_17

APA

Banerjee, D., Nair, P. A., Usbeck, R., & Biemann, C. (2023). GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering. In C. Pesquita, D. Faria, E. Jimenez-Ruiz, J. McCusker, M. Dragoni, A. Dimou, R. Troncy, & S. Hertling (Eds.), The Semantic Web - 20th International Conference, ESWC 2023, Proceedings (pp. 279-297). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13870 LNCS). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.48550/arXiv.2303.13284, https://doi.org/10.1007/978-3-031-33455-9_17

Vancouver

Banerjee D, Nair PA, Usbeck R, Biemann C. GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering. In Pesquita C, Faria D, Jimenez-Ruiz E, McCusker J, Dragoni M, Dimou A, Troncy R, Hertling S, editors, The Semantic Web - 20th International Conference, ESWC 2023, Proceedings. Springer Science and Business Media Deutschland GmbH. 2023. p. 279-297. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)). doi: 10.48550/arXiv.2303.13284, 10.1007/978-3-031-33455-9_17

Bibtex

@inproceedings{e7fcccd69b804f559c1b79e243b23eb0,
title = "GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering",
abstract = "In this work, we present an end-to-end Knowledge Graph Question Answering (KGQA) system named GETT-QA. GETT-QA uses T5, a popular text-to-text pre-trained language model. The model takes a question in natural language as input and produces a simpler form of the intended SPARQL query. In the simpler form, the model does not directly produce entity and relation IDs. Instead, it produces corresponding entity and relation labels. The labels are grounded to KG entity and relation IDs in a subsequent step. To further improve the results, we instruct the model to produce a truncated version of the KG embedding for each entity. The truncated KG embedding enables a finer search for disambiguation purposes. We find that T5 is able to learn the truncated KG embeddings without any change of loss function, improving KGQA performance. As a result, we report strong results for LC-QuAD 2.0 and SimpleQuestions-Wikidata datasets on end-to-end KGQA over Wikidata.",
keywords = "Informatics, Business informatics",
author = "Debayan Banerjee and Nair, {Pranav Ajit} and Ricardo Usbeck and Chris Biemann",
note = "Publisher Copyright: {\textcopyright} 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.; 20th International Conference on The Semantic Web - ESWC 2023 : The Extended Semantic Web Conference, ESWC 2023 ; Conference date: 28-05-2023 Through 01-06-2023",
year = "2023",
month = may,
day = "23",
doi = "10.48550/arXiv.2303.13284",
language = "English",
isbn = "978-3-031-33455-9",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer Science and Business Media Deutschland GmbH",
pages = "279--297",
editor = "Catia Pesquita and Daniel Faria and Ernesto Jimenez-Ruiz and Jamie McCusker and Mauro Dragoni and Anastasia Dimou and Raphael Troncy and Sven Hertling",
booktitle = "The Semantic Web - 20th International Conference, ESWC 2023, Proceedings",
address = "Germany",
url = "https://2023.eswc-conferences.org/",
}
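
The abstract above describes a two-stage grounding step: T5 decodes entity labels plus a truncated KG embedding, and the truncated vector is then used for a finer disambiguation among candidate entities. The sketch below is a hypothetical illustration of that re-ranking, assuming candidates were already retrieved by a label lookup; the function name, the candidate record format, and plain cosine similarity over the embedding prefix are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def rerank_candidates(truncated_emb, candidates):
    """Order candidate KG entities by how well the prefix of their full
    KG embedding matches the truncated embedding decoded by the model.

    truncated_emb: short vector emitted by the text-to-text model
    candidates:    dicts with 'id' and 'embedding' (full KG vector),
                   e.g. returned by a prior label-index lookup
    Returns candidate IDs, best match first.
    """
    k = len(truncated_emb)
    t = np.asarray(truncated_emb, dtype=float)

    def cosine(a, b):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom else 0.0

    # Compare only the first k dimensions of each candidate's embedding,
    # since the model produces a truncated vector.
    scored = [(cosine(t, np.asarray(c["embedding"][:k], dtype=float)), c["id"])
              for c in candidates]
    scored.sort(key=lambda s: s[0], reverse=True)
    return [cid for _, cid in scored]
```

For a question mentioning an ambiguous label (e.g. several Wikidata entities sharing the name), the label lookup alone cannot pick a winner; the truncated-embedding comparison breaks the tie.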

RIS

TY - CHAP

T1 - GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering

T2 - 20th International Conference on The Semantic Web - ESWC 2023

AU - Banerjee, Debayan

AU - Nair, Pranav Ajit

AU - Usbeck, Ricardo

AU - Biemann, Chris

N1 - Conference code: 20

PY - 2023/5/23

Y1 - 2023/5/23

N2 - In this work, we present an end-to-end Knowledge Graph Question Answering (KGQA) system named GETT-QA. GETT-QA uses T5, a popular text-to-text pre-trained language model. The model takes a question in natural language as input and produces a simpler form of the intended SPARQL query. In the simpler form, the model does not directly produce entity and relation IDs. Instead, it produces corresponding entity and relation labels. The labels are grounded to KG entity and relation IDs in a subsequent step. To further improve the results, we instruct the model to produce a truncated version of the KG embedding for each entity. The truncated KG embedding enables a finer search for disambiguation purposes. We find that T5 is able to learn the truncated KG embeddings without any change of loss function, improving KGQA performance. As a result, we report strong results for LC-QuAD 2.0 and SimpleQuestions-Wikidata datasets on end-to-end KGQA over Wikidata.

AB - In this work, we present an end-to-end Knowledge Graph Question Answering (KGQA) system named GETT-QA. GETT-QA uses T5, a popular text-to-text pre-trained language model. The model takes a question in natural language as input and produces a simpler form of the intended SPARQL query. In the simpler form, the model does not directly produce entity and relation IDs. Instead, it produces corresponding entity and relation labels. The labels are grounded to KG entity and relation IDs in a subsequent step. To further improve the results, we instruct the model to produce a truncated version of the KG embedding for each entity. The truncated KG embedding enables a finer search for disambiguation purposes. We find that T5 is able to learn the truncated KG embeddings without any change of loss function, improving KGQA performance. As a result, we report strong results for LC-QuAD 2.0 and SimpleQuestions-Wikidata datasets on end-to-end KGQA over Wikidata.

KW - Informatics

KW - Business informatics

UR - http://www.scopus.com/inward/record.url?scp=85163305271&partnerID=8YFLogxK

U2 - 10.48550/arXiv.2303.13284

DO - 10.48550/arXiv.2303.13284

M3 - Article in conference proceedings

AN - SCOPUS:85163305271

SN - 978-3-031-33455-9

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 279

EP - 297

BT - The Semantic Web - 20th International Conference, ESWC 2023, Proceedings

A2 - Pesquita, Catia

A2 - Faria, Daniel

A2 - Jimenez-Ruiz, Ernesto

A2 - McCusker, Jamie

A2 - Dragoni, Mauro

A2 - Dimou, Anastasia

A2 - Troncy, Raphael

A2 - Hertling, Sven

PB - Springer Science and Business Media Deutschland GmbH

Y2 - 28 May 2023 through 1 June 2023

ER -
