Biomedical Entity Linking with Triple-aware Pre-Training
Research output: Contributions to collected editions/works › Article in conference proceedings › Research
The large-scale analysis of scientific and technical documents is crucial for extracting structured knowledge from unstructured text. A key challenge in this process is linking biomedical entities, as these entities are sparsely distributed and often underrepresented in the training data of large language models (LLMs). At the same time, LLMs are not aware of the high-level semantic connections between different biomedical entities, which are useful for identifying similar concepts in different textual contexts. To address these problems, some recent works have focused on injecting knowledge graph (KG) information into LLMs. However, prior methods either ignore the relational knowledge of the entities or lead to catastrophic forgetting. We therefore propose a novel framework to pre-train a powerful generative LLM on a corpus synthesized from a KG. In our evaluations, we are unable to confirm the benefit of including synonym, description, or relational information. This work-in-progress highlights key challenges and invites further discussion on leveraging semantic information for LLM performance and on scientific document processing.
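The corpus synthesis the abstract describes can be illustrated by verbalizing KG triples into natural-language training sentences. The sketch below is a hypothetical minimal example of this idea; the function name, the sentence templates, and the sample biomedical triple are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch: turning (head, relation, tail) triples from a
# knowledge graph into pre-training text, optionally enriched with the
# synonym and description information the abstract mentions.
# All names and templates here are illustrative, not from the paper.

def verbalize_triple(head, relation, tail, synonyms=None, description=None):
    """Render one KG triple as a training sentence.

    Optionally appends synonym and description sentences so the
    synthesized corpus carries that extra semantic information.
    """
    parts = [f"{head} {relation.replace('_', ' ')} {tail}."]
    if synonyms:
        parts.append(f"{head} is also known as {', '.join(synonyms)}.")
    if description:
        parts.append(f"{head} is {description}.")
    return " ".join(parts)


# Build a tiny synthetic corpus from example triples (made-up data).
corpus = [
    verbalize_triple(
        "aspirin", "treats", "headache",
        synonyms=["acetylsalicylic acid"],
        description="a nonsteroidal anti-inflammatory drug",
    ),
    verbalize_triple("ibuprofen", "interacts_with", "warfarin"),
]
for sentence in corpus:
    print(sentence)
```

Such verbalized sentences could then be fed to a generative LLM as an additional pre-training corpus; the paper's ablations over synonym, description, and relational information correspond to toggling the optional arguments above.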
Original language | English |
---|---|
Title of host publication | Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data 2025 |
Editors | Rima Dessi, Joy Jeenu, Danilo Dessi, Francesco Osborne, Hidir Aras |
Number of pages | 8 |
Place of Publication | Aachen |
Publisher | CEUR-WS |
Publication date | 16.06.2025 |
DOIs | |
Publication status | Published - 16.06.2025 |
Event | Third International Workshop on Semantic Technologies and Deep Learning Models for Scientific, Technical and Legal Data - SemTech4STLD 2025 - Portoroz, Slovenia |
Duration | 01.06.2025 → 01.06.2025 |
Conference number | 3 |
- Entity Linking, Scientific data, Deep Learning, Semantic information
- Informatics