Efficient co-regularised least squares regression

Publication: Contributions to collected editions › Articles in conference proceedings › Research › peer-reviewed

Standard

Efficient co-regularised least squares regression. / Brefeld, Ulf; Gärtner, Thomas; Scheffer, Tobias et al.
Proceedings of the 23rd international conference on Machine learning. ed. / William Cohen. Association for Computing Machinery, Inc, 2006. pp. 137-144 (ACM International Conference Proceeding Series; Vol. 148).

Harvard

Brefeld, U, Gärtner, T, Scheffer, T & Wrobel, S 2006, Efficient co-regularised least squares regression. in W Cohen (ed.), Proceedings of the 23rd international conference on Machine learning. ACM International Conference Proceeding Series, vol. 148, Association for Computing Machinery, Inc, pp. 137-144, International Conference on Machine Learning - ICML 2006, Pittsburgh, United States, 25.06.06. https://doi.org/10.1145/1143844.1143862

APA

Brefeld, U., Gärtner, T., Scheffer, T., & Wrobel, S. (2006). Efficient co-regularised least squares regression. In W. Cohen (Ed.), Proceedings of the 23rd international conference on Machine learning (pp. 137-144). (ACM International Conference Proceeding Series; Vol. 148). Association for Computing Machinery, Inc. https://doi.org/10.1145/1143844.1143862

Vancouver

Brefeld U, Gärtner T, Scheffer T, Wrobel S. Efficient co-regularised least squares regression. In Cohen W, editor, Proceedings of the 23rd international conference on Machine learning. Association for Computing Machinery, Inc. 2006. p. 137-144. (ACM International Conference Proceeding Series). doi: 10.1145/1143844.1143862

Bibtex

@inproceedings{d224cc66882a4795951b471f850457f5,
title = "Efficient co-regularised least squares regression",
abstract = "In many applications, unlabelled examples are inexpensive and easy to obtain. Semi-supervised approaches try to utilise such examples to reduce the predictive error. In this paper, we investigate a semi-supervised least squares regression algorithm based on the co-learning approach. Similar to other semi-supervised algorithms, our base algorithm has cubic runtime complexity in the number of unlabelled examples. To be able to handle larger sets of unlabelled examples, we devise a semi-parametric variant that scales linearly in the number of unlabelled examples. Ex-periments show a significant error reduction by co-regularisation and a large runtime improvement for the semi-parametric approximation. Last but not least, we propose a distributed procedure that can be applied without collecting all data at a single site.",
keywords = "Informatics, Business informatics",
author = "Ulf Brefeld and Thomas G{\"a}rtner and Tobias Scheffer and Stefan Wrobel",
year = "2006",
month = jan,
day = "1",
doi = "10.1145/1143844.1143862",
language = "English",
isbn = "978-159593383-6",
series = "ACM International Conference Proceeding Series",
publisher = "Association for Computing Machinery, Inc",
pages = "137--144",
editor = "William Cohen",
booktitle = "Proceedings of the 23rd international conference on Machine learning",
address = "United States",
note = "ICML '06 ; Conference date: 25-06-2006 Through 29-06-2006",

}
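
The abstract describes co-regularised least squares regression: each view fits a regularised least-squares model to the labelled examples, while a co-regularisation term penalises disagreement between the views' predictions on the unlabelled examples. As a rough illustration of that idea only (not the paper's exact algorithm, its kernelised base formulation, or the semi-parametric approximation), the following NumPy sketch minimises an assumed two-view linear objective by alternating closed-form updates; the function names, the hyperparameters nu and lam, and the synthetic data are illustrative assumptions.

# Minimal sketch of two-view co-regularised least squares (assumed objective):
#   sum_v ( ||XL_v w_v - y||^2 + nu * ||w_v||^2 )
#       + lam * ||XU_0 w_0 - XU_1 w_1||^2
# minimised by alternating exact solves for w_0 and w_1 (block coordinate
# descent on a convex objective).
import numpy as np

def corls_fit(XL, y, XU, nu=1.0, lam=1.0, n_iter=50):
    """XL, XU: lists with one labelled / unlabelled design matrix per view."""
    dims = [X.shape[1] for X in XL]
    w = [np.zeros(d) for d in dims]
    for _ in range(n_iter):
        for v in (0, 1):
            o = 1 - v  # index of the other view
            # Normal equations for view v with the other view's weights fixed:
            # (XL_v^T XL_v + nu I + lam XU_v^T XU_v) w_v
            #     = XL_v^T y + lam XU_v^T XU_o w_o
            A = XL[v].T @ XL[v] + nu * np.eye(dims[v]) + lam * XU[v].T @ XU[v]
            b = XL[v].T @ y + lam * XU[v].T @ (XU[o] @ w[o])
            w[v] = np.linalg.solve(A, b)
    return w

def corls_predict(w, X_views):
    """Average the two views' linear predictions."""
    return 0.5 * (X_views[0] @ w[0] + X_views[1] @ w[1])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_l, n_u = 20, 200                      # few labelled, many unlabelled examples
    X1 = rng.normal(size=(n_l + n_u, 5))
    X2 = X1 @ rng.normal(size=(5, 5)) + 0.1 * rng.normal(size=(n_l + n_u, 5))
    y = X1 @ rng.normal(size=5) + 0.1 * rng.normal(size=n_l + n_u)
    w = corls_fit([X1[:n_l], X2[:n_l]], y[:n_l], [X1[n_l:], X2[n_l:]])
    pred = corls_predict(w, [X1[n_l:], X2[n_l:]])
    print("RMSE on unlabelled points:", float(np.sqrt(np.mean((pred - y[n_l:]) ** 2))))

Each alternating step here is only a solve in the feature dimension; the cubic runtime in the number of unlabelled examples mentioned in the abstract concerns the paper's non-parametric base formulation, which this linear sketch does not reproduce.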

RIS

TY - CONF

T1 - Efficient co-regularised least squares regression

AU - Brefeld, Ulf

AU - Gärtner, Thomas

AU - Scheffer, Tobias

AU - Wrobel, Stefan

N1 - Conference code: 23

PY - 2006/1/1

Y1 - 2006/1/1

N2 - In many applications, unlabelled examples are inexpensive and easy to obtain. Semi-supervised approaches try to utilise such examples to reduce the predictive error. In this paper, we investigate a semi-supervised least squares regression algorithm based on the co-learning approach. Similar to other semi-supervised algorithms, our base algorithm has cubic runtime complexity in the number of unlabelled examples. To be able to handle larger sets of unlabelled examples, we devise a semi-parametric variant that scales linearly in the number of unlabelled examples. Experiments show a significant error reduction by co-regularisation and a large runtime improvement for the semi-parametric approximation. Last but not least, we propose a distributed procedure that can be applied without collecting all data at a single site.

AB - In many applications, unlabelled examples are inexpensive and easy to obtain. Semi-supervised approaches try to utilise such examples to reduce the predictive error. In this paper, we investigate a semi-supervised least squares regression algorithm based on the co-learning approach. Similar to other semi-supervised algorithms, our base algorithm has cubic runtime complexity in the number of unlabelled examples. To be able to handle larger sets of unlabelled examples, we devise a semi-parametric variant that scales linearly in the number of unlabelled examples. Experiments show a significant error reduction by co-regularisation and a large runtime improvement for the semi-parametric approximation. Last but not least, we propose a distributed procedure that can be applied without collecting all data at a single site.

KW - Informatics

KW - Business informatics

UR - http://www.scopus.com/inward/record.url?scp=34250767770&partnerID=8YFLogxK

UR - https://www.mendeley.com/catalogue/48577d33-c292-3153-b395-46acf9d90df8/

U2 - 10.1145/1143844.1143862

DO - 10.1145/1143844.1143862

M3 - Article in conference proceedings

AN - SCOPUS:34250767770

SN - 978-159593383-6

SN - 1595933832

T3 - ACM International Conference Proceeding Series

SP - 137

EP - 144

BT - Proceedings of the 23rd international conference on Machine learning

A2 - Cohen, William

PB - Association for Computing Machinery, Inc

T2 - ICML '06

Y2 - 25 June 2006 through 29 June 2006

ER -
