Biomedical Entity Linking with Triple-aware Pre-Training

Publication: Contributions to edited volumes › Conference proceedings papers › Research

Standard

Biomedical Entity Linking with Triple-aware Pre-Training. / Yan, Xi; Möller, Cedric; Usbeck, Ricardo.
Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data 2025. Ed. / Rima Dessi; Joy Jeenu; Danilo Dessi; Francesco Osborne; Hidir Aras. Aachen: CEUR-WS, 2025. (CEUR Workshop Proceedings; Vol. 3979).


Harvard

Yan, X, Möller, C & Usbeck, R 2025, Biomedical Entity Linking with Triple-aware Pre-Training. in R Dessi, J Jeenu, D Dessi, F Osborne & H Aras (eds), Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data 2025. CEUR Workshop Proceedings, vol. 3979, CEUR-WS, Aachen, Third International Workshop on Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data - SemTech4STLD 2025, Portoroz, Slovenia, 01.06.25. https://doi.org/10.48550/arXiv.2308.14429

APA

Yan, X., Möller, C., & Usbeck, R. (2025). Biomedical Entity Linking with Triple-aware Pre-Training. In R. Dessi, J. Jeenu, D. Dessi, F. Osborne, & H. Aras (Eds.), Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data 2025 (CEUR Workshop Proceedings; Vol. 3979). CEUR-WS. https://doi.org/10.48550/arXiv.2308.14429

Vancouver

Yan X, Möller C, Usbeck R. Biomedical Entity Linking with Triple-aware Pre-Training. In: Dessi R, Jeenu J, Dessi D, Osborne F, Aras H, editors. Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data 2025. Aachen: CEUR-WS. 2025. (CEUR Workshop Proceedings). doi: 10.48550/arXiv.2308.14429

Bibtex

@inbook{70dae52eff184dcab552ec79040de9c7,
title = "Biomedical Entity Linking with Triple-aware Pre-Training",
abstract = "The large-scale analysis of scientific and technical documents is crucial for extracting structured knowledge from unstructured text. A key challenge in this process is linking biomedical entities, as these entities are sparsely distributed and often underrepresented in the training data of large language models (LLMs). At the same time, those LLMs are not aware of high-level semantic connections between different biomedical entities, which are useful in identifying similar concepts in different textual contexts. To cope with the aforementioned problems, some recent works have focused on injecting knowledge graph information into LLMs. However, prior methods either ignore the relational knowledge of the entities or lead to catastrophic forgetting. Therefore, we propose a novel framework to pre-train a generative LLM on a corpus synthesized from a knowledge graph (KG). In the evaluations we are unable to confirm the benefit of including synonym, description, or relational information. This work-in-progress highlights key challenges and invites further discussion on leveraging semantic information for LLM performance and on scientific document processing.",
keywords = "Entity Linking, Scientific data, Deep Learning, Semantic information, Informatics",
author = "Xi Yan and Cedric M{\"o}ller and Ricardo Usbeck",
year = "2025",
month = jun,
day = "16",
doi = "10.48550/arXiv.2308.14429",
language = "English",
series = "CEUR Workshop Proceedings",
publisher = "CEUR-WS",
editor = "Rima Dessi and Joy Jeenu and Danilo Dessi and Francesco Osborne and Hidir Aras",
booktitle = "Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data 2025",
address = "Aachen, Germany",
note = "Third International Workshop on Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data - SemTech4STLD 2025, SemTech4STLD 2025 ; Conference date: 01-06-2025 Through 01-06-2025",

}

RIS

TY - CHAP

T1 - Biomedical Entity Linking with Triple-aware Pre-Training

AU - Yan, Xi

AU - Möller, Cedric

AU - Usbeck, Ricardo

N1 - Conference code: 3

PY - 2025/6/16

Y1 - 2025/6/16

N2 - The large-scale analysis of scientific and technical documents is crucial for extracting structured knowledge from unstructured text. A key challenge in this process is linking biomedical entities, as these entities are sparsely distributed and often underrepresented in the training data of large language models (LLMs). At the same time, those LLMs are not aware of high-level semantic connections between different biomedical entities, which are useful in identifying similar concepts in different textual contexts. To cope with the aforementioned problems, some recent works have focused on injecting knowledge graph information into LLMs. However, prior methods either ignore the relational knowledge of the entities or lead to catastrophic forgetting. Therefore, we propose a novel framework to pre-train a generative LLM on a corpus synthesized from a knowledge graph (KG). In the evaluations we are unable to confirm the benefit of including synonym, description, or relational information. This work-in-progress highlights key challenges and invites further discussion on leveraging semantic information for LLM performance and on scientific document processing.

AB - The large-scale analysis of scientific and technical documents is crucial for extracting structured knowledge from unstructured text. A key challenge in this process is linking biomedical entities, as these entities are sparsely distributed and often underrepresented in the training data of large language models (LLMs). At the same time, those LLMs are not aware of high-level semantic connections between different biomedical entities, which are useful in identifying similar concepts in different textual contexts. To cope with the aforementioned problems, some recent works have focused on injecting knowledge graph information into LLMs. However, prior methods either ignore the relational knowledge of the entities or lead to catastrophic forgetting. Therefore, we propose a novel framework to pre-train a generative LLM on a corpus synthesized from a knowledge graph (KG). In the evaluations we are unable to confirm the benefit of including synonym, description, or relational information. This work-in-progress highlights key challenges and invites further discussion on leveraging semantic information for LLM performance and on scientific document processing.

KW - Entity Linking

KW - Scientific data

KW - Deep Learning

KW - Semantic information

KW - Informatics

U2 - 10.48550/arXiv.2308.14429

DO - 10.48550/arXiv.2308.14429

M3 - Article in conference proceedings

T3 - CEUR Workshop Proceedings

BT - Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data 2025

A2 - Dessi, Rima

A2 - Jeenu, Joy

A2 - Dessi, Danilo

A2 - Osborne, Francesco

A2 - Aras, Hidir

PB - CEUR-WS

CY - Aachen

T2 - Third International Workshop on Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data - SemTech4STLD 2025

Y2 - 1 June 2025 through 1 June 2025

ER -
