DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation
Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review
Standard
Findings of the Association for Computational Linguistics: NAACL 2022. Association for Computational Linguistics (ACL), 2022, pp. 2557-2571 (Findings of the Association for Computational Linguistics: NAACL 2022).
RIS
TY - CHAP
T1 - DialoKG: Knowledge-Structure Aware Task-Oriented Dialogue Generation
T2 - 2022 Findings of the Association for Computational Linguistics - NAACL 2022
AU - Al Hasan Rony, Md Rashad
AU - Usbeck, Ricardo
AU - Lehmann, Jens
N1 - Funding Information: We acknowledge the support of the following projects: SPEAKER (BMWi FKZ 01MK20011A), JOSEPH (Fraunhofer Zukunftsstiftung), OpenGPT-X (BMWK FKZ 68GX21007A), the excellence clusters ML2R (BMBF FKZ 01 15 18038 A/B/C), ScaDS.AI (IS18026A-F) and TAILOR (EU GA 952215). The authors also acknowledge the financial support by the Federal Ministry for Economic Affairs and Energy of Germany in the project CoyPu (project number 01MK21007G). Publisher Copyright: © Findings of the Association for Computational Linguistics: NAACL 2022 - Findings.
PY - 2022/1/1
Y1 - 2022/1/1
N2 - Task-oriented dialogue generation is challenging since the underlying knowledge is often dynamic and effectively incorporating knowledge into the learning process is hard. It is particularly challenging to generate both human-like and informative responses in this setting. Recent research primarily focused on various knowledge distillation methods where the underlying relationship between the facts in a knowledge base is not effectively captured. In this paper, we go one step further and demonstrate how the structural information of a knowledge graph can improve the system's inference capabilities. Specifically, we propose DialoKG, a novel task-oriented dialogue system that effectively incorporates knowledge into a language model. Our proposed system views relational knowledge as a knowledge graph and introduces (1) a structure-aware knowledge embedding technique, and (2) a knowledge graph-weighted attention masking strategy to facilitate the system selecting relevant information during the dialogue generation. An empirical evaluation demonstrates the effectiveness of DialoKG over state-of-the-art methods on several standard benchmark datasets.
AB - Task-oriented dialogue generation is challenging since the underlying knowledge is often dynamic and effectively incorporating knowledge into the learning process is hard. It is particularly challenging to generate both human-like and informative responses in this setting. Recent research primarily focused on various knowledge distillation methods where the underlying relationship between the facts in a knowledge base is not effectively captured. In this paper, we go one step further and demonstrate how the structural information of a knowledge graph can improve the system's inference capabilities. Specifically, we propose DialoKG, a novel task-oriented dialogue system that effectively incorporates knowledge into a language model. Our proposed system views relational knowledge as a knowledge graph and introduces (1) a structure-aware knowledge embedding technique, and (2) a knowledge graph-weighted attention masking strategy to facilitate the system selecting relevant information during the dialogue generation. An empirical evaluation demonstrates the effectiveness of DialoKG over state-of-the-art methods on several standard benchmark datasets.
KW - Informatics
KW - Business informatics
UR - http://www.scopus.com/inward/record.url?scp=85137369692&partnerID=8YFLogxK
UR - https://www.mendeley.com/catalogue/51550c99-d305-34c2-a149-3d0941d7dfe9/
U2 - 10.18653/v1/2022.findings-naacl.195
DO - 10.18653/v1/2022.findings-naacl.195
M3 - Article in conference proceedings
AN - SCOPUS:85137369692
T3 - Findings of the Association for Computational Linguistics: NAACL 2022 - Findings
SP - 2557
EP - 2571
BT - Findings of the Association for Computational Linguistics: NAACL 2022
PB - Association for Computational Linguistics (ACL)
Y2 - 10 July 2022 through 15 July 2022
ER -
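Note: the abstract describes a knowledge graph-weighted attention masking strategy that helps the model focus on relevant facts during generation. The short sketch below only illustrates that general idea; the function names, the log-weight biasing scheme, and the toy data are illustrative assumptions, not the DialoKG implementation (see the DOI above for the paper's actual method).

# Minimal sketch of graph-weighted attention masking over KG triples.
# All names and the weighting scheme are assumptions for illustration.
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def graph_weighted_attention(queries, keys, values, triple_weights):
    """Scaled dot-product attention whose logits are biased by per-triple
    relevance weights derived from the knowledge graph.

    queries:        (n_query, d) decoder states
    keys, values:   (n_triple, d) embeddings of KG triples
    triple_weights: (n_triple,) relevance scores in [0, 1]; low-scoring
                    triples are effectively masked out
    """
    d = queries.shape[-1]
    logits = queries @ keys.T / np.sqrt(d)      # (n_query, n_triple)
    # Bias logits with log-weights so triples the graph deems irrelevant
    # receive (near-)zero attention mass after the softmax.
    logits = logits + np.log(triple_weights + 1e-9)
    attn = softmax(logits, axis=-1)
    return attn @ values                        # (n_query, d)

# Toy usage: 2 decoder states attending over 4 KG triples of dimension 8.
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 8))
k = rng.normal(size=(4, 8))
v = rng.normal(size=(4, 8))
w = np.array([0.9, 0.05, 0.8, 0.0])  # hypothetical graph-derived weights
print(graph_weighted_attention(q, k, v, w).shape)  # (2, 8)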