Surveying the FAIRness of Annotation Tools: Difficult to find, difficult to reuse
Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review
Standard
LAW 2024 - 18th Linguistic Annotation Workshop, Co-located with EACL 2024 - Proceedings of the Workshop. ed. / Sophie Henning; Manfred Stede. Stroudsburg: Association for Computational Linguistics (ACL), 2024. p. 29-45.
RIS
TY - CHAP
T1 - Surveying the FAIRness of Annotation Tools: Difficult to find, difficult to reuse
AU - Borisova, Ekaterina
AU - Abu Ahmad, Raia
AU - Garcia-Castro, Leyla Jael
AU - Usbeck, Ricardo
AU - Rehm, Georg
N1 - Conference code: 18
PY - 2024/3/1
Y1 - 2024/3/1
N2 - In the realm of Machine Learning and Deep Learning, there is a need for high-quality annotated data to train and evaluate supervised models. An extensive number of annotation tools have been developed to facilitate the data labelling process. However, finding the right tool is a demanding task involving thorough searching and testing. Hence, to effectively navigate the multitude of tools, it becomes essential to ensure their findability, accessibility, interoperability, and reusability (FAIR). This survey addresses the FAIRness of existing annotation software by evaluating 50 different tools against the FAIR principles for research software (FAIR4RS). The study indicates that while being accessible and interoperable, annotation tools are difficult to find and reuse. In addition, there is a need to establish community standards for annotation software development, documentation, and distribution.
AB - In the realm of Machine Learning and Deep Learning, there is a need for high-quality annotated data to train and evaluate supervised models. An extensive number of annotation tools have been developed to facilitate the data labelling process. However, finding the right tool is a demanding task involving thorough searching and testing. Hence, to effectively navigate the multitude of tools, it becomes essential to ensure their findability, accessibility, interoperability, and reusability (FAIR). This survey addresses the FAIRness of existing annotation software by evaluating 50 different tools against the FAIR principles for research software (FAIR4RS). The study indicates that while being accessible and interoperable, annotation tools are difficult to find and reuse. In addition, there is a need to establish community standards for annotation software development, documentation, and distribution.
KW - Business informatics
UR - https://aclanthology.org/2024.law-1.pdf
UR - http://www.scopus.com/inward/record.url?scp=85188717570&partnerID=8YFLogxK
M3 - Article in conference proceedings
SP - 29
EP - 45
BT - LAW 2024 - 18th Linguistic Annotation Workshop, Co-located with EACL 2024 - Proceedings of the Workshop
A2 - Henning, Sophie
A2 - Stede, Manfred
PB - Association for Computational Linguistics (ACL)
CY - Stroudsburg
T2 - 18th Linguistic Annotation Workshop
Y2 - 21 March 2024 through 22 March 2024
ER -