Efficient co-regularised least squares regression

Research output: Contributions to collected editions/works › Article in conference proceedings › Research › peer-review

Authors

In many applications, unlabelled examples are inexpensive and easy to obtain. Semi-supervised approaches try to utilise such examples to reduce the predictive error. In this paper, we investigate a semi-supervised least squares regression algorithm based on the co-learning approach. Similar to other semi-supervised algorithms, our base algorithm has cubic runtime complexity in the number of unlabelled examples. To be able to handle larger sets of unlabelled examples, we devise a semi-parametric variant that scales linearly in the number of unlabelled examples. Experiments show a significant error reduction by co-regularisation and a large runtime improvement for the semi-parametric approximation. Last but not least, we propose a distributed procedure that can be applied without collecting all data at a single site.
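To illustrate the co-regularisation idea described in the abstract, the following is a minimal, hypothetical sketch for two views with plain linear models: each view's regressor is fit to the labelled data with a ridge penalty, while a disagreement penalty couples the two views' predictions on the unlabelled data, and the joint objective is solved in closed form as a block linear system. The function name corls_fit and the parameters nu (ridge weight) and lam (co-regularisation weight) are illustrative assumptions; the paper itself works with kernel functions and derives the cubic-time base algorithm and the linear-time semi-parametric variant in that setting.

```python
import numpy as np

def corls_fit(X1_l, X2_l, y, X1_u, X2_u, nu=1.0, lam=1.0):
    """Sketch of co-regularised least squares with linear models (not the
    paper's kernel-based algorithm). Jointly fits per-view weights w1, w2 by
    minimising
        ||X1_l w1 - y||^2 + ||X2_l w2 - y||^2        (labelled squared error)
        + nu * (||w1||^2 + ||w2||^2)                 (ridge penalty)
        + lam * ||X1_u w1 - X2_u w2||^2              (disagreement on unlabelled data)
    via the closed-form block linear system obtained by setting the gradient to zero."""
    d1, d2 = X1_l.shape[1], X2_l.shape[1]
    A11 = X1_l.T @ X1_l + nu * np.eye(d1) + lam * (X1_u.T @ X1_u)
    A22 = X2_l.T @ X2_l + nu * np.eye(d2) + lam * (X2_u.T @ X2_u)
    A12 = -lam * (X1_u.T @ X2_u)
    A = np.block([[A11, A12], [A12.T, A22]])
    b = np.concatenate([X1_l.T @ y, X2_l.T @ y])
    w = np.linalg.solve(A, b)
    return w[:d1], w[d1:]

# Toy usage: two noisy "views" of the same underlying linear signal,
# a few labelled examples and many unlabelled ones.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ rng.normal(size=5)
X1 = X + 0.1 * rng.normal(size=X.shape)
X2 = X + 0.1 * rng.normal(size=X.shape)
w1, w2 = corls_fit(X1[:20], X2[:20], y[:20], X1[20:], X2[20:], nu=0.1, lam=1.0)
y_hat = 0.5 * (X1[20:] @ w1 + X2[20:] @ w2)   # average the two views' predictions
print("test RMSE:", np.sqrt(np.mean((y_hat - y[20:]) ** 2)))
```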

Original language: English
Title of host publication: Proceedings of the 23rd international conference on Machine learning
Editors: William Cohen
Number of pages: 8
Publisher: Association for Computing Machinery, Inc
Publication date: 01.01.2006
Pages: 137-144
ISBN (print): 978-159593383-6, 1595933832
Publication status: Published - 01.01.2006
Externally published: Yes
Event: ICML '06 - Carnegie Mellon University, Pittsburgh, United States
Duration: 25.06.2006 - 29.06.2006
Conference number: 23