Ontology-Guided, Hybrid Prompt Learning for Generalization in Knowledge Graph Question Answering

Publication: Other scientific contributions › Other › Research

Standard

Ontology-Guided, Hybrid Prompt Learning for Generalization in Knowledge Graph Question Answering. / Jiang, Longquan; Huang, Junbo; Möller, Cedric et al.
2025.



Bibtex

@misc{0f899aebf9484268bed84536ff5c9d0a,
title = "Ontology-Guided, Hybrid Prompt Learning for Generalization in Knowledge Graph Question Answering",
abstract = "Most existing Knowledge Graph Question Answering (KGQA) approaches are designed for a specific KG, such as Wikidata, DBpedia or Freebase. Due to the heterogeneity of the underlying graph schema, topology and assertions, most KGQA systems cannot be transferred to unseen Knowledge Graphs (KGs) without resource-intensive training data. We present OntoSCPrompt, a novel Large Language Model (LLM)-based KGQA approach with a two-stage architecture that separates semantic parsing from KG-dependent interactions. OntoSCPrompt first generates a SPARQL query structure (including SPARQL keywords such as SELECT, ASK, WHERE and placeholders for missing tokens) and then fills in the placeholders with KG-specific information. To enhance the understanding of the underlying KG, we present an ontology-guided, hybrid prompt learning strategy that integrates the KG ontology into the learning process of hybrid prompts (e.g., discrete and continuous vectors). We also present several task-specific decoding strategies to ensure the correctness and executability of generated SPARQL queries in both stages. Experimental results demonstrate that OntoSCPrompt performs as well as SOTA approaches without retraining on a number of KGQA datasets such as CWQ, WebQSP and LC-QuAD 1.0 in a resource-efficient manner and can generalize well to unseen domain-specific KGs like DBLP-QuAD and the CoyPu KG.",
keywords = "cs.CL, cs.AI",
author = "Longquan Jiang and Junbo Huang and Cedric M{\"o}ller and Ricardo Usbeck",
note = "Accepted By ICSC 2025",
year = "2025",
month = feb,
day = "6",
language = "English",
type = "Other",

}

RIS

TY - GEN

T1 - Ontology-Guided, Hybrid Prompt Learning for Generalization in Knowledge Graph Question Answering

AU - Jiang, Longquan

AU - Huang, Junbo

AU - Möller, Cedric

AU - Usbeck, Ricardo

N1 - Accepted By ICSC 2025

PY - 2025/2/6

Y1 - 2025/2/6

N2 - Most existing Knowledge Graph Question Answering (KGQA) approaches are designed for a specific KG, such as Wikidata, DBpedia or Freebase. Due to the heterogeneity of the underlying graph schema, topology and assertions, most KGQA systems cannot be transferred to unseen Knowledge Graphs (KGs) without resource-intensive training data. We present OntoSCPrompt, a novel Large Language Model (LLM)-based KGQA approach with a two-stage architecture that separates semantic parsing from KG-dependent interactions. OntoSCPrompt first generates a SPARQL query structure (including SPARQL keywords such as SELECT, ASK, WHERE and placeholders for missing tokens) and then fills in the placeholders with KG-specific information. To enhance the understanding of the underlying KG, we present an ontology-guided, hybrid prompt learning strategy that integrates the KG ontology into the learning process of hybrid prompts (e.g., discrete and continuous vectors). We also present several task-specific decoding strategies to ensure the correctness and executability of generated SPARQL queries in both stages. Experimental results demonstrate that OntoSCPrompt performs as well as SOTA approaches without retraining on a number of KGQA datasets such as CWQ, WebQSP and LC-QuAD 1.0 in a resource-efficient manner and can generalize well to unseen domain-specific KGs like DBLP-QuAD and the CoyPu KG.

AB - Most existing Knowledge Graph Question Answering (KGQA) approaches are designed for a specific KG, such as Wikidata, DBpedia or Freebase. Due to the heterogeneity of the underlying graph schema, topology and assertions, most KGQA systems cannot be transferred to unseen Knowledge Graphs (KGs) without resource-intensive training data. We present OntoSCPrompt, a novel Large Language Model (LLM)-based KGQA approach with a two-stage architecture that separates semantic parsing from KG-dependent interactions. OntoSCPrompt first generates a SPARQL query structure (including SPARQL keywords such as SELECT, ASK, WHERE and placeholders for missing tokens) and then fills in the placeholders with KG-specific information. To enhance the understanding of the underlying KG, we present an ontology-guided, hybrid prompt learning strategy that integrates the KG ontology into the learning process of hybrid prompts (e.g., discrete and continuous vectors). We also present several task-specific decoding strategies to ensure the correctness and executability of generated SPARQL queries in both stages. Experimental results demonstrate that OntoSCPrompt performs as well as SOTA approaches without retraining on a number of KGQA datasets such as CWQ, WebQSP and LC-QuAD 1.0 in a resource-efficient manner and can generalize well to unseen domain-specific KGs like DBLP-QuAD and the CoyPu KG.

KW - cs.CL

KW - cs.AI

UR - https://arxiv.org/abs/2502.03992

M3 - Other

ER -

Documents

  • 2502.03992v1

    739 KB, PDF document
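The two-stage idea summarized in the abstract — first generate a KG-agnostic SPARQL query structure with placeholders, then fill the placeholders with KG-specific information — can be sketched roughly as follows. This is a minimal illustrative sketch: the placeholder syntax, function names, and the example IRIs are assumptions for illustration, not taken from the paper's implementation.

```python
# Hypothetical sketch of the two-stage pipeline described in the abstract.
# Stage 1 would produce a KG-agnostic SPARQL skeleton (keywords plus
# placeholder tokens); stage 2 grounds the placeholders in a specific KG.
# The [SLOT] syntax and all names here are illustrative assumptions.

SKELETON = "SELECT ?x WHERE { [ENT] [REL] ?x . }"  # stage-1 output (assumed form)

def fill_skeleton(skeleton: str, bindings: dict) -> str:
    """Stage 2 (sketch): replace each [SLOT] token with a KG-specific IRI."""
    query = skeleton
    for slot, value in bindings.items():
        query = query.replace(f"[{slot}]", value)
    return query

# Grounding the same skeleton against DBpedia-style prefixed IRIs:
query = fill_skeleton(SKELETON, {"ENT": "dbr:Berlin", "REL": "dbo:country"})
print(query)  # SELECT ?x WHERE { dbr:Berlin dbo:country ?x . }
```

Because the skeleton contains only SPARQL keywords and placeholders, the same stage-1 output could in principle be reused across KGs, with only the stage-2 bindings changing per graph — which is the generalization argument the abstract makes.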