Latent structure perceptron with feature induction for unrestricted coreference resolution

Publication: Contributions to collected editions › Conference proceedings paper › Research › peer-reviewed

Authors

We describe a machine learning system, based on the large-margin structured perceptron, for unrestricted coreference resolution that introduces two key modeling techniques: latent coreference trees and entropy-guided feature induction. The proposed latent tree modeling makes the learning problem computationally feasible. Additionally, an automatic feature induction method lets us efficiently build nonlinear models and thus achieve high performance with a linear learning algorithm. Our system is evaluated on the CoNLL-2012 Shared Task closed track, which comprises three languages: Arabic, Chinese, and English. We apply the same system to all languages, except for minor adaptations to some language-dependent features, such as static lists of pronouns. Our system achieves an official score of 58.69, the best among all competitors.
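The sketch below illustrates the general idea of a latent structured perceptron with large-margin (loss-augmented) updates over antecedent trees: each mention links to an earlier mention or to an artificial root, and the tree's components define the clusters. It is a simplified illustration under our own assumptions, not the paper's system; the greedy decoder, the toy arc_features function, and the 0/1 per-arc loss are hypothetical placeholders, and the paper's entropy-guided feature induction is not modeled here.

```python
# Minimal, self-contained sketch of a latent structured perceptron for
# coreference trees.  Mentions 0..n-1 each pick an antecedent (an earlier
# mention) or the artificial root (-1); the resulting tree's components
# define the coreference clusters.  Features, decoder, and loss are
# illustrative placeholders, not the paper's actual components.
import numpy as np

DIM = 16  # size of the hypothetical arc-feature space


def arc_features(doc, head, dep):
    """Toy feature vector for the arc head -> dep (a deterministic random
    projection, just to keep the sketch runnable)."""
    h = doc[head] if head >= 0 else "ROOT"
    rng = np.random.default_rng(abs(hash((h, doc[dep]))) % 2**32)
    return rng.standard_normal(DIM)


def decode(doc, w, allowed=None, loss=None):
    """Greedy arc-by-arc decoding: every mention takes its highest-scoring
    antecedent.  `allowed` restricts candidates (used to find the best
    latent tree consistent with the gold clusters); `loss` adds a per-arc
    cost for loss-augmented, large-margin inference."""
    tree = []
    for dep in range(len(doc)):
        cands = [h for h in range(-1, dep) if allowed is None or allowed(h, dep)]
        tree.append(max(cands, key=lambda h: w @ arc_features(doc, h, dep)
                        + (loss(h, dep) if loss else 0.0)))
    return tree


def tree_features(doc, tree):
    """Feature vector of a whole tree: sum of its arc features."""
    return sum(arc_features(doc, h, d) for d, h in enumerate(tree))


def latent_perceptron_step(doc, cluster_of, w, lr=1.0):
    """One update: loss-augmented prediction vs. best gold-consistent
    (latent) tree; move the weights toward the latent tree's features."""
    def gold_ok(h, d):  # is the arc h -> d consistent with the gold clusters?
        if h == -1:
            return all(cluster_of[o] != cluster_of[d] for o in range(d))
        return cluster_of[h] == cluster_of[d]

    predicted = decode(doc, w, loss=lambda h, d: 0.0 if gold_ok(h, d) else 1.0)
    latent = decode(doc, w, allowed=gold_ok)
    if predicted != latent:
        w = w + lr * (tree_features(doc, latent) - tree_features(doc, predicted))
    return w


# Toy usage: five mentions, two gold clusters.
doc = ["Obama", "he", "the president", "Michelle", "she"]
cluster_of = [0, 0, 0, 1, 1]
w = np.zeros(DIM)
for _ in range(10):
    w = latent_perceptron_step(doc, cluster_of, w)
```

The point of the latent formulation is visible in the update: the decoder constrained to gold-consistent arcs produces the best latent tree for the current weights, so training never needs a single annotated gold tree, only the gold clustering.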

Original language: English
Title: Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning: EMNLP-CoNLL 2012; Proceedings of the Shared Task: Modeling Multilingual Unrestricted Coreference in OntoNotes, July 13, 2012
Number of pages: 8
Place of publication: Stroudsburg
Publisher: Association for Computational Linguistics (ACL)
Publication date: 2012
Pages: 41-48
ISBN (electronic): 978-1-937284-45-9
Publication status: Published - 2012
Published externally: Yes
Event: 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning - EMNLP-CoNLL 2012: Shared Task: Modeling Multilingual Unrestricted Coreference in OntoNotes - Jeju Island, South Korea
Duration: 12.07.2012 - 14.07.2012
