Semi-supervised learning for structured output variables

Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review

Authors: Ulf Brefeld, Tobias Scheffer

The problem of learning a mapping between input and structured, interdependent output variables covers sequential, spatial, and relational learning as well as predicting recursive structures. Joint feature representations of the input and output variables have paved the way for applying discriminative learners such as SVMs to this class of problems. We address the problem of semi-supervised learning in joint input-output spaces. The co-training approach is based on the principle of maximizing the consensus among multiple independent hypotheses; we develop this principle into a semi-supervised support vector learning algorithm for joint input-output spaces and arbitrary loss functions. Experiments investigate the benefit of semi-supervised structured models in terms of accuracy and F1 score.
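To make the consensus principle in the abstract concrete, the following is a minimal, illustrative co-training sketch and not the algorithm from the paper: it uses flat binary SVMs (scikit-learn's SVC) on two disjoint feature views of synthetic data rather than joint input-output feature maps with structured outputs, and a simple agreement-plus-confidence rule (with an assumed 0.9 threshold) in place of the paper's loss-aware support vector formulation. All names, data, and thresholds here are illustrative assumptions.

```python
# Toy co-training: two SVM hypotheses trained on disjoint feature views;
# unlabeled points on which both hypotheses confidently agree are moved
# into the (pseudo-)labeled pool. This illustrates only the consensus
# principle, not the paper's structured-output support vector algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

rng = np.random.RandomState(0)

# Synthetic binary task with 20 features, split into two "views".
X, y = make_classification(n_samples=400, n_features=20, n_informative=10,
                           random_state=0)
view1, view2 = X[:, :10], X[:, 10:]

# Small labeled pool, large unlabeled pool.
labeled_idx = [int(i) for i in rng.choice(len(y), size=40, replace=False)]
unlabeled_idx = [i for i in range(len(y)) if i not in set(labeled_idx)]
pseudo = {i: int(y[i]) for i in labeled_idx}   # index -> (pseudo-)label

for it in range(10):
    idx = np.array(sorted(pseudo))
    lab = np.array([pseudo[i] for i in idx])
    # One hypothesis per view, trained on the current (pseudo-)labeled pool.
    h1 = SVC(probability=True, random_state=0).fit(view1[idx], lab)
    h2 = SVC(probability=True, random_state=0).fit(view2[idx], lab)

    if not unlabeled_idx:
        break
    U = np.array(unlabeled_idx)
    p1, p2 = h1.predict(view1[U]), h2.predict(view2[U])
    c1 = h1.predict_proba(view1[U]).max(axis=1)
    c2 = h2.predict_proba(view2[U]).max(axis=1)

    # Consensus step: keep only points where both hypotheses agree confidently.
    agree = (p1 == p2) & (np.minimum(c1, c2) > 0.9)
    if not agree.any():
        break
    for i, lbl in zip(U[agree], p1[agree]):
        pseudo[int(i)] = int(lbl)
    unlabeled_idx = [int(i) for i in U[~agree]]

print(f"pool grew from 40 to {len(pseudo)} (pseudo-)labeled examples")
```

This hard label-exchange loop enforces consensus only at the level of predicted labels; the abstract indicates the paper instead builds the consensus requirement into the support vector optimization itself, over joint input-output spaces and for arbitrary loss functions.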

Original language: English
Title of host publication: Proceedings of the 23rd International Conference on Machine Learning
Editors: William Cohen, Andrew Moore
Number of pages: 8
Publisher: Association for Computing Machinery, Inc.
Publication date: 01.01.2006
Pages: 145-152
ISBN (print): 978-1-59593-383-6
Publication status: Published - 01.01.2006
Externally published: Yes
Event: ICML '06 - Carnegie Mellon University, Pittsburgh, United States
Duration: 25.06.2006 - 29.06.2006
Conference number: 23