Co-EM Support Vector learning

Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review

Standard

Co-EM Support Vector learning. / Brefeld, Ulf; Scheffer, Tobias.
ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning. New York: Association for Computing Machinery, Inc, 2004. p. 121-128 (Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004).


Harvard

Brefeld, U & Scheffer, T 2004, Co-EM Support Vector learning. in ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning. Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004, Association for Computing Machinery, Inc, New York, pp. 121-128, 21st International Conference on Machine Learning - 2004, Banff, Canada, 31.12.04. https://doi.org/10.1145/1015330.1015350

APA

Brefeld, U., & Scheffer, T. (2004). Co-EM Support Vector learning. In ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning (pp. 121-128). (Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004). Association for Computing Machinery, Inc. https://doi.org/10.1145/1015330.1015350

Vancouver

Brefeld U, Scheffer T. Co-EM Support Vector learning. In ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning. New York: Association for Computing Machinery, Inc. 2004. p. 121-128. (Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004). doi: 10.1145/1015330.1015350

Bibtex

@inproceedings{a4397ba89d664a08a6817898eede574b,
title = "Co-EM Support Vector learning",
abstract = "Multi-view algorithms, such as co-training and co-EM, utilize unlabeled data when the available attributes can be split into independent and compatible subsets. Co-EM outperforms co-training for many problems, but it requires the underlying learner to estimate class probabilities, and to learn from probabilistically labeled data. Therefore, co-EM has so far only been studied with naive Bayesian learners. We cast linear classifiers into a probabilistic framework and develop a co-EM version of the Support Vector Machine. We conduct experiments on text classification problems and compare the family of semi-supervised support vector algorithms under different conditions, including violations of the assumptions underlying multiview learning. For some problems, such as course web page classification, we observe the most accurate results reported so far.",
keywords = "Informatics, Business informatics",
author = "Ulf Brefeld and Tobias Scheffer",
year = "2004",
doi = "10.1145/1015330.1015350",
language = "English",
isbn = "1-58113-838-5",
series = "Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004",
publisher = "Association for Computing Machinery, Inc",
pages = "121--128",
booktitle = "ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning",
address = "United States",
note = "21st International Conference on Machine Learning - 2004, ICML 2004 ; Conference date: 31-12-2004",
url = "https://icml.cc/imls/icml.html",
}

RIS

TY - CHAP

T1 - Co-EM Support Vector learning

AU - Brefeld, Ulf

AU - Scheffer, Tobias

N1 - Conference code: 21

PY - 2004

Y1 - 2004

N2 - Multi-view algorithms, such as co-training and co-EM, utilize unlabeled data when the available attributes can be split into independent and compatible subsets. Co-EM outperforms co-training for many problems, but it requires the underlying learner to estimate class probabilities, and to learn from probabilistically labeled data. Therefore, co-EM has so far only been studied with naive Bayesian learners. We cast linear classifiers into a probabilistic framework and develop a co-EM version of the Support Vector Machine. We conduct experiments on text classification problems and compare the family of semi-supervised support vector algorithms under different conditions, including violations of the assumptions underlying multiview learning. For some problems, such as course web page classification, we observe the most accurate results reported so far.

AB - Multi-view algorithms, such as co-training and co-EM, utilize unlabeled data when the available attributes can be split into independent and compatible subsets. Co-EM outperforms co-training for many problems, but it requires the underlying learner to estimate class probabilities, and to learn from probabilistically labeled data. Therefore, co-EM has so far only been studied with naive Bayesian learners. We cast linear classifiers into a probabilistic framework and develop a co-EM version of the Support Vector Machine. We conduct experiments on text classification problems and compare the family of semi-supervised support vector algorithms under different conditions, including violations of the assumptions underlying multiview learning. For some problems, such as course web page classification, we observe the most accurate results reported so far.

KW - Informatics

KW - Business informatics

UR - http://www.scopus.com/inward/record.url?scp=14344251008&partnerID=8YFLogxK

UR - https://www.mendeley.com/catalogue/464ca35f-43df-33f6-acdc-c9ae6911d1a7/

U2 - 10.1145/1015330.1015350

DO - 10.1145/1015330.1015350

M3 - Article in conference proceedings

AN - SCOPUS:14344251008

SN - 1-58113-838-5

SN - 978-1-58113-838-2

T3 - Proceedings, Twenty-First International Conference on Machine Learning, ICML 2004

SP - 121

EP - 128

BT - ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning

PB - Association for Computing Machinery, Inc

CY - New York

T2 - 21st International Conference on Machine Learning - 2004

Y2 - 31 December 2004

ER -
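The co-EM scheme described in the abstract can be illustrated with a small toy sketch. This is only a hedged illustration, not the paper's method: it substitutes a plain logistic-regression learner (gradient descent on soft labels) for the probabilistically calibrated SVM the paper develops, and the two conditionally independent views are synthetic. Each iteration, one view's learner probabilistically labels the unlabeled data, the other view's learner retrains on labeled plus soft-labeled examples, and the roles swap.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_view(y, d=5, noise=1.0):
    # One "view": class means at +/-1 along every dimension, plus Gaussian noise.
    return y[:, None] * np.ones(d) + noise * rng.standard_normal((len(y), d))

def fit_logreg(X, p, epochs=200, lr=0.1):
    # Logistic regression on soft targets: p is the (possibly fractional) P(y=+1).
    Xb = np.hstack([X, np.ones((len(X), 1))])   # append a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        q = 1.0 / (1.0 + np.exp(-Xb @ w))       # predicted P(y=+1)
        w -= lr * Xb.T @ (q - p) / len(X)       # cross-entropy gradient step
    return w

def predict_proba(X, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

# Synthetic two-view problem: a few labeled points, many unlabeled.
y_all = rng.choice([-1.0, 1.0], size=300)
XA, XB = make_view(y_all), make_view(y_all)
n_lab = 10
p_lab = (y_all[:n_lab] + 1) / 2                 # hard labels as {0, 1} probabilities

# Initialize the view-A learner on the labeled examples only.
wA = fit_logreg(XA[:n_lab], p_lab)
for _ in range(10):                             # co-EM iterations
    # E-step: view A probabilistically labels the unlabeled data ...
    pB = np.concatenate([p_lab, predict_proba(XA[n_lab:], wA)])
    # M-step: ... view B retrains on labeled + soft-labeled examples; then swap.
    wB = fit_logreg(XB, pB)
    pA = np.concatenate([p_lab, predict_proba(XB[n_lab:], wB)])
    wA = fit_logreg(XA, pA)

# Combine the two views by averaging their predicted probabilities.
p_final = (predict_proba(XA, wA) + predict_proba(XB, wB)) / 2
acc = np.mean((p_final > 0.5) == (y_all > 0))
print(round(float(acc), 2))
```

On this easily separable synthetic data the combined classifier recovers the labels almost perfectly from only ten labeled points; the paper's experiments study the analogous SVM variant on text classification under weaker view-independence assumptions.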
