Latent structure perceptron with feature induction for unrestricted coreference resolution

Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review


We describe a machine learning system based on the large margin structure perceptron for unrestricted coreference resolution that introduces two key modeling techniques: latent coreference trees and entropy-guided feature induction. The proposed latent tree modeling makes the learning problem computationally feasible. Additionally, using an automatic feature induction method, we are able to efficiently build nonlinear models and, hence, achieve high performance with a linear learning algorithm. Our system is evaluated on the CoNLL-2012 Shared Task closed track, which comprises three languages: Arabic, Chinese and English. We apply the same system to all languages, except for minor adaptations to some language-dependent features, such as static lists of pronouns. Our system achieves an official score of 58.69, the best among all competitors.
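As a rough illustration of the training scheme the abstract refers to, the sketch below (Python, not taken from the paper) shows a latent-tree structured perceptron step: each mention is greedily attached to its best-scoring antecedent arc or to a root arc, a constrained "latent gold" tree is decoded from the gold clusters, and the weights are updated on the feature difference between the two trees. All identifiers (`best_tree`, `perceptron_update`, the toy feature map) are hypothetical, the greedy decoder and features are heavy simplifications, and the paper's entropy-guided feature induction and large-margin loss are omitted.

```python
from collections import defaultdict

def score(weights, feats):
    return sum(weights[f] for f in feats)

def best_tree(weights, mentions, features, gold_clusters=None):
    """Greedily attach each mention j to its best-scoring arc: either a root arc
    (None, j), meaning j starts a new entity, or an antecedent arc (i, j).
    When gold_clusters is given, only gold-consistent arcs are allowed, which
    yields a constrained ("latent gold") coreference tree."""
    tree = []
    for j in range(len(mentions)):
        candidates = [(None, j)]
        for i in range(j):
            if gold_clusters is None or gold_clusters[i] == gold_clusters[j]:
                candidates.append((i, j))
        if gold_clusters is not None and len(candidates) > 1:
            # An earlier coreferent mention exists, so j may not start a new entity.
            candidates = candidates[1:]
        tree.append(max(candidates, key=lambda arc: score(weights, features(*arc))))
    return tree

def perceptron_update(weights, mentions, features, gold_clusters, lr=1.0):
    """One structured perceptron step: decode a tree freely, decode the latent
    gold tree under cluster constraints, and update on their feature difference."""
    predicted = best_tree(weights, mentions, features)
    latent_gold = best_tree(weights, mentions, features, gold_clusters)
    for gold_arc, pred_arc in zip(latent_gold, predicted):
        if gold_arc != pred_arc:
            for f in features(*gold_arc):
                weights[f] += lr
            for f in features(*pred_arc):
                weights[f] -= lr

if __name__ == "__main__":
    # Toy document: four mentions, the first three corefer (cluster 0).
    mentions = ["Barack Obama", "he", "the president", "Michelle"]
    gold_clusters = [0, 0, 0, 1]

    def features(i, j):  # hypothetical, heavily simplified feature map
        if i is None:
            return ["root:" + mentions[j]]
        return ["pair:" + mentions[i] + "|" + mentions[j], "dist:" + str(j - i)]

    weights = defaultdict(float)
    for _ in range(5):
        perceptron_update(weights, mentions, features, gold_clusters)
    print(best_tree(weights, mentions, features))
```

The point of treating the tree as latent is that a gold clustering underdetermines the antecedent structure, so the constrained decode is free to pick whichever gold-consistent tree currently scores highest; this is the sense in which the abstract says latent tree modeling keeps learning computationally feasible.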

Original language: English
Title of host publication: Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, EMNLP-CoNLL 2012; Proceedings of the Shared Task: Modeling Multilingual Unrestricted Coreference in OntoNotes, July 13, 2012
Number of pages: 8
Place of publication: Stroudsburg
Publisher: Association for Computational Linguistics (ACL)
Publication date: 2012
Pages: 41-48
ISBN (electronic): 978-1-937284-45-9
Publication status: Published - 2012
Externally published: Yes
Event: 2012 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, EMNLP-CoNLL 2012: Shared Task: Modeling Multilingual Unrestricted Coreference in OntoNotes - Jeju Island, Korea, Republic of
Duration: 12.07.2012 - 14.07.2012