Standard
Treating dialogue quality evaluation as an anomaly detection problem. / Nedelchev, Rostislav; Lehmann, Jens; Usbeck, Ricardo.
LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings. Ed. / Nicoletta Calzolari; Frederic Bechet; Philippe Blache; Khalid Choukri; Christopher Cieri; Thierry Declerck; Sara Goggi; Hitoshi Isahara; Bente Maegaard; Joseph Mariani; Helene Mazo; Asuncion Moreno; Jan Odijk; Stelios Piperidis. European Language Resources Association (ELRA), 2020. pp. 508-512 (LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings).
Publication: Contributions to collected editions › Articles in conference proceedings › Research › peer-reviewed
Harvard
Nedelchev, R, Lehmann, J & Usbeck, R 2020,
Treating dialogue quality evaluation as an anomaly detection problem. in N Calzolari, F Bechet, P Blache, K Choukri, C Cieri, T Declerck, S Goggi, H Isahara, B Maegaard, J Mariani, H Mazo, A Moreno, J Odijk & S Piperidis (eds),
LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings. LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings, European Language Resources Association (ELRA), pp. 508-512, 12th International Conference on Language Resources and Evaluation, LREC 2020, Marseille, France, 11.05.20. <https://aclanthology.org/2020.lrec-1.64>
APA
Nedelchev, R., Lehmann, J., & Usbeck, R. (2020).
Treating dialogue quality evaluation as an anomaly detection problem. In N. Calzolari, F. Bechet, P. Blache, K. Choukri, C. Cieri, T. Declerck, S. Goggi, H. Isahara, B. Maegaard, J. Mariani, H. Mazo, A. Moreno, J. Odijk, & S. Piperidis (Eds.),
LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings (pp. 508-512). (LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings). European Language Resources Association (ELRA).
https://aclanthology.org/2020.lrec-1.64
Vancouver
Nedelchev R, Lehmann J, Usbeck R.
Treating dialogue quality evaluation as an anomaly detection problem. In Calzolari N, Bechet F, Blache P, Choukri K, Cieri C, Declerck T, Goggi S, Isahara H, Maegaard B, Mariani J, Mazo H, Moreno A, Odijk J, Piperidis S, editors, LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings. European Language Resources Association (ELRA). 2020. p. 508-512. (LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings).
Bibtex
@inbook{4e2b83e41f414cb48aa475435e6918da,
title = "Treating dialogue quality evaluation as an anomaly detection problem",
abstract = "Dialogue systems for interaction with humans have been enjoying increased popularity in the research and industry fields. To this day, the best way to estimate their success is through means of human evaluation and not automated approaches, despite the abundance of work done in the field. In this paper, we investigate the effectiveness of perceiving dialogue evaluation as an anomaly detection task. The paper looks into four dialogue modeling approaches and how their objective functions correlate with human annotation scores. A high-level perspective exhibits negative results. However, a more in-depth look shows limited potential for using anomaly detection for evaluating dialogues.",
keywords = "Dialogue, Discourse Annotation, Evaluation Methodologies, Processing, Representation, Informatics, Business informatics",
author = "Rostislav Nedelchev and Jens Lehmann and Ricardo Usbeck",
note = "Publisher Copyright: {\textcopyright} European Language Resources Association (ELRA), licensed under CC-BY-NC; 12th International Conference on Language Resources and Evaluation, LREC 2020 ; Conference date: 11-05-2020 Through 16-05-2020",
year = "2020",
language = "English",
series = "LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings",
publisher = "European Language Resources Association (ELRA)",
pages = "508--512",
editor = "Nicoletta Calzolari and Frederic Bechet and Philippe Blache and Khalid Choukri and Christopher Cieri and Thierry Declerck and Sara Goggi and Hitoshi Isahara and Bente Maegaard and Joseph Mariani and Helene Mazo and Asuncion Moreno and Jan Odijk and Stelios Piperidis",
booktitle = "LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings",
address = "Luxembourg",
url = "https://lrec2020.lrec-conf.org/en/about/organizers/index.html",
}
RIS
TY - CHAP
T1 - Treating dialogue quality evaluation as an anomaly detection problem
AU - Nedelchev, Rostislav
AU - Lehmann, Jens
AU - Usbeck, Ricardo
N1 - Publisher Copyright:
© European Language Resources Association (ELRA), licensed under CC-BY-NC
PY - 2020
Y1 - 2020
N2 - Dialogue systems for interaction with humans have been enjoying increased popularity in the research and industry fields. To this day, the best way to estimate their success is through means of human evaluation and not automated approaches, despite the abundance of work done in the field. In this paper, we investigate the effectiveness of perceiving dialogue evaluation as an anomaly detection task. The paper looks into four dialogue modeling approaches and how their objective functions correlate with human annotation scores. A high-level perspective exhibits negative results. However, a more in-depth look shows limited potential for using anomaly detection for evaluating dialogues.
AB - Dialogue systems for interaction with humans have been enjoying increased popularity in the research and industry fields. To this day, the best way to estimate their success is through means of human evaluation and not automated approaches, despite the abundance of work done in the field. In this paper, we investigate the effectiveness of perceiving dialogue evaluation as an anomaly detection task. The paper looks into four dialogue modeling approaches and how their objective functions correlate with human annotation scores. A high-level perspective exhibits negative results. However, a more in-depth look shows limited potential for using anomaly detection for evaluating dialogues.
KW - Dialogue
KW - Discourse Annotation
KW - Evaluation Methodologies
KW - Processing
KW - Representation
KW - Informatics
KW - Business informatics
UR - http://www.scopus.com/inward/record.url?scp=85096532825&partnerID=8YFLogxK
UR - https://www.mendeley.com/catalogue/50653d2d-f7e2-36ae-871a-85fb7753b95b/
M3 - Article in conference proceedings
AN - SCOPUS:85096532825
T3 - LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings
SP - 508
EP - 512
BT - LREC 2020 - 12th International Conference on Language Resources and Evaluation, Conference Proceedings
A2 - Calzolari, Nicoletta
A2 - Bechet, Frederic
A2 - Blache, Philippe
A2 - Choukri, Khalid
A2 - Cieri, Christopher
A2 - Declerck, Thierry
A2 - Goggi, Sara
A2 - Isahara, Hitoshi
A2 - Maegaard, Bente
A2 - Mariani, Joseph
A2 - Mazo, Helene
A2 - Moreno, Asuncion
A2 - Odijk, Jan
A2 - Piperidis, Stelios
PB - European Language Resources Association (ELRA)
T2 - 12th International Conference on Language Resources and Evaluation, LREC 2020
Y2 - 11 May 2020 through 16 May 2020
ER -
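
The abstract above describes checking how well scores derived from a dialogue model's objective function correlate with human annotation scores. Purely as an illustrative sketch of that correlation step (not the authors' implementation), the comparison could be done as below; the variables model_scores and human_ratings, and the numbers in them, are hypothetical placeholders.

# Minimal sketch, assuming per-response anomaly scores from a dialogue
# model's objective function (e.g., negative log-likelihood) and matching
# human quality ratings. All names and values here are hypothetical and
# are not taken from the paper.
from scipy.stats import spearmanr, pearsonr

model_scores = [2.1, 5.7, 3.3, 8.9, 1.4, 6.2]   # hypothetical anomaly scores
human_ratings = [4.5, 2.0, 3.8, 1.5, 5.0, 2.5]  # hypothetical human scores

# Rank and linear correlation between model-based scores and human ratings;
# a strong (negative) correlation would suggest the objective function is
# informative about dialogue quality.
rho, rho_p = spearmanr(model_scores, human_ratings)
r, r_p = pearsonr(model_scores, human_ratings)

print(f"Spearman rho = {rho:.3f} (p = {rho_p:.3f})")
print(f"Pearson r = {r:.3f} (p = {r_p:.3f})")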