Standard
GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering. / Banerjee, Debayan; Nair, Pranav Ajit; Usbeck, Ricardo et al.
The Semantic Web - 20th International Conference, ESWC 2023, Proceedings. ed. / Catia Pesquita; Daniel Faria; Ernesto Jimenez-Ruiz; Jamie McCusker; Mauro Dragoni; Anastasia Dimou; Raphael Troncy; Sven Hertling. Springer Science and Business Media Deutschland GmbH, 2023. p. 279-297 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13870 LNCS).
Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review
Harvard
Banerjee, D, Nair, PA, Usbeck, R & Biemann, C 2023, GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering. in C Pesquita, D Faria, E Jimenez-Ruiz, J McCusker, M Dragoni, A Dimou, R Troncy & S Hertling (eds), The Semantic Web - 20th International Conference, ESWC 2023, Proceedings. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 13870 LNCS, Springer Science and Business Media Deutschland GmbH, pp. 279-297, 20th International Conference on The Semantic Web - ESWC 2023, Hersonissos, Greece, 28.05.23. https://doi.org/10.48550/arXiv.2303.13284, https://doi.org/10.1007/978-3-031-33455-9_17
APA
Banerjee, D., Nair, P. A., Usbeck, R., & Biemann, C. (2023). GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering. In C. Pesquita, D. Faria, E. Jimenez-Ruiz, J. McCusker, M. Dragoni, A. Dimou, R. Troncy, & S. Hertling (Eds.), The Semantic Web - 20th International Conference, ESWC 2023, Proceedings (pp. 279-297). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13870 LNCS). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.48550/arXiv.2303.13284, https://doi.org/10.1007/978-3-031-33455-9_17
Vancouver
Banerjee D, Nair PA, Usbeck R, Biemann C. GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering. In Pesquita C, Faria D, Jimenez-Ruiz E, McCusker J, Dragoni M, Dimou A, Troncy R, Hertling S, editors, The Semantic Web - 20th International Conference, ESWC 2023, Proceedings. Springer Science and Business Media Deutschland GmbH. 2023. p. 279-297. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 13870 LNCS). doi: 10.48550/arXiv.2303.13284, 10.1007/978-3-031-33455-9_17
Bibtex
@inproceedings{e7fcccd69b804f559c1b79e243b23eb0,
title = "GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering",
abstract = "In this work, we present an end-to-end Knowledge Graph Question Answering (KGQA) system named GETT-QA. GETT-QA uses T5, a popular text-to-text pre-trained language model. The model takes a question in natural language as input and produces a simpler form of the intended SPARQL query. In the simpler form, the model does not directly produce entity and relation IDs. Instead, it produces corresponding entity and relation labels. The labels are grounded to KG entity and relation IDs in a subsequent step. To further improve the results, we instruct the model to produce a truncated version of the KG embedding for each entity. The truncated KG embedding enables a finer search for disambiguation purposes. We find that T5 is able to learn the truncated KG embeddings without any change of loss function, improving KGQA performance. As a result, we report strong results for LC-QuAD 2.0 and SimpleQuestions-Wikidata datasets on end-to-end KGQA over Wikidata.",
keywords = "Informatics, Business informatics",
author = "Debayan Banerjee and Nair, {Pranav Ajit} and Ricardo Usbeck and Chris Biemann",
note = "Publisher Copyright: {\textcopyright} 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.; 20th International Conference on The Semantic Web - ESWC 2023 : The Extended Semantic Web Conference, ESWC 2023 ; Conference date: 28-05-2023 Through 01-06-2023",
year = "2023",
month = may,
day = "23",
doi = "10.48550/arXiv.2303.13284",
language = "English",
isbn = "978-3-031-33455-9",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
volume = "13870 LNCS",
publisher = "Springer Science and Business Media Deutschland GmbH",
pages = "279--297",
editor = "Catia Pesquita and Daniel Faria and Ernesto Jimenez-Ruiz and Jamie McCusker and Mauro Dragoni and Anastasia Dimou and Raphael Troncy and Sven Hertling",
booktitle = "The Semantic Web - 20th International Conference, ESWC 2023, Proceedings",
address = "Germany",
url = "https://2023.eswc-conferences.org/",
}
RIS
TY - CHAP
T1 - GETT-QA: Graph Embedding Based T2T Transformer for Knowledge Graph Question Answering
T2 - 20th International Conference on The Semantic Web - ESWC 2023
AU - Banerjee, Debayan
AU - Nair, Pranav Ajit
AU - Usbeck, Ricardo
AU - Biemann, Chris
N1 - Conference code: 20
PY - 2023/5/23
Y1 - 2023/5/23
N2 - In this work, we present an end-to-end Knowledge Graph Question Answering (KGQA) system named GETT-QA. GETT-QA uses T5, a popular text-to-text pre-trained language model. The model takes a question in natural language as input and produces a simpler form of the intended SPARQL query. In the simpler form, the model does not directly produce entity and relation IDs. Instead, it produces corresponding entity and relation labels. The labels are grounded to KG entity and relation IDs in a subsequent step. To further improve the results, we instruct the model to produce a truncated version of the KG embedding for each entity. The truncated KG embedding enables a finer search for disambiguation purposes. We find that T5 is able to learn the truncated KG embeddings without any change of loss function, improving KGQA performance. As a result, we report strong results for LC-QuAD 2.0 and SimpleQuestions-Wikidata datasets on end-to-end KGQA over Wikidata.
AB - In this work, we present an end-to-end Knowledge Graph Question Answering (KGQA) system named GETT-QA. GETT-QA uses T5, a popular text-to-text pre-trained language model. The model takes a question in natural language as input and produces a simpler form of the intended SPARQL query. In the simpler form, the model does not directly produce entity and relation IDs. Instead, it produces corresponding entity and relation labels. The labels are grounded to KG entity and relation IDs in a subsequent step. To further improve the results, we instruct the model to produce a truncated version of the KG embedding for each entity. The truncated KG embedding enables a finer search for disambiguation purposes. We find that T5 is able to learn the truncated KG embeddings without any change of loss function, improving KGQA performance. As a result, we report strong results for LC-QuAD 2.0 and SimpleQuestions-Wikidata datasets on end-to-end KGQA over Wikidata.
KW - Informatics
KW - Business informatics
UR - http://www.scopus.com/inward/record.url?scp=85163305271&partnerID=8YFLogxK
U2 - 10.48550/arXiv.2303.13284
DO - 10.48550/arXiv.2303.13284
M3 - Article in conference proceedings
AN - SCOPUS:85163305271
SN - 978-3-031-33455-9
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
VL - 13870 LNCS
SP - 279
EP - 297
BT - The Semantic Web - 20th International Conference, ESWC 2023, Proceedings
A2 - Pesquita, Catia
A2 - Faria, Daniel
A2 - Jimenez-Ruiz, Ernesto
A2 - McCusker, Jamie
A2 - Dragoni, Mauro
A2 - Dimou, Anastasia
A2 - Troncy, Raphael
A2 - Hertling, Sven
PB - Springer Science and Business Media Deutschland GmbH
Y2 - 28 May 2023 through 1 June 2023
ER -