Co-EM Support Vector Learning
Publication: Contributions to edited volumes › Conference proceedings articles › Research › peer-reviewed
Authors
Ulf Brefeld, Tobias Scheffer
Multi-view algorithms, such as co-training and co-EM, utilize unlabeled data when the available attributes can be split into independent and compatible subsets. Co-EM outperforms co-training for many problems, but it requires the underlying learner to estimate class probabilities and to learn from probabilistically labeled data. Therefore, co-EM has so far only been studied with naive Bayesian learners. We cast linear classifiers into a probabilistic framework and develop a co-EM version of the Support Vector Machine. We conduct experiments on text classification problems and compare the family of semi-supervised support vector algorithms under different conditions, including violations of the assumptions underlying multi-view learning. For some problems, such as course web page classification, we observe the most accurate results reported so far.
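The abstract describes the general co-EM scheme: two classifiers, one per attribute view, alternately label the unlabeled examples probabilistically for each other. The following is only a minimal illustrative sketch of that scheme, not the paper's exact formulation: the view matrices `X1` and `X2`, the function `co_em_svm`, and the use of scikit-learn's Platt-scaled `predict_proba` together with `sample_weight`-based hard labels as a stand-in for learning from probabilistically labeled data are all assumptions introduced here for illustration.

```python
# Minimal co-EM sketch with probabilistic SVMs (illustrative approximation,
# not the paper's method). X1, X2 are two hypothetical feature views of the
# same instances; y holds labels in {0, 1}, with -1 marking unlabeled rows.
import numpy as np
from sklearn.svm import SVC

def co_em_svm(X1, X2, y, n_iter=10, C=1.0):
    labeled = y != -1
    unlabeled = ~labeled
    views = [X1, X2]

    # Bootstrap: train the view-1 SVM on the labeled examples only.
    clf = SVC(C=C, kernel="linear", probability=True).fit(X1[labeled], y[labeled])

    for it in range(n_iter):
        current = views[it % 2]        # view the current classifier was trained on
        peer = views[(it + 1) % 2]     # view that will be trained next

        # "E-step": the current classifier assigns class probabilities
        # to the unlabeled examples.
        proba = clf.predict_proba(current[unlabeled])[:, 1]

        # "M-step": the peer view learns from the labeled data plus the
        # probabilistically labeled examples. SVC cannot consume soft
        # labels directly, so here they are approximated by hard labels
        # weighted by the current classifier's confidence.
        X_train = np.vstack([peer[labeled], peer[unlabeled]])
        y_train = np.concatenate([y[labeled], (proba >= 0.5).astype(int)])
        w_train = np.concatenate([np.ones(labeled.sum()),
                                  np.abs(proba - 0.5) * 2.0])
        clf = SVC(C=C, kernel="linear", probability=True)
        clf.fit(X_train, y_train, sample_weight=w_train)

    return clf
```

The confidence-weighted hard labels are one simple way to let a standard SVM implementation consume probabilistic labels; the paper itself derives a probabilistic framework for linear classifiers rather than relying on this approximation.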
Original language | English |
---|---|
Title | Proceedings of the Twenty-First International Conference on Machine Learning (ICML '04) |
Number of pages | 8 |
Place of publication | New York |
Publisher | Association for Computing Machinery, Inc |
Publication date | 2004 |
Pages | 121-128 |
ISBN (print) | 1-58113-838-5, 978-1-58113-838-2 |
DOIs | |
Publication status | Published - 2004 |
Event | 21st International Conference on Machine Learning 2004, Banff, Canada. Duration: 31.12.2004 → … Conference number: 21. https://icml.cc/imls/icml.html |
- Computer science
- Business informatics