lp-Norm Multiple Kernel Learning

Publication: Contributions to journals › Journal articles › Research › peer-reviewed

Standard

lp-Norm Multiple Kernel Learning. / Kloft, Marius; Brefeld, Ulf; Sonnenburg, Sören et al.
in: Journal of Machine Learning Research, Vol. 12, 2011, pp. 953-997.

Vancouver

Kloft M, Brefeld U, Sonnenburg S, Zien A. lp-Norm Multiple Kernel Learning. Journal of Machine Learning Research. 2011;12:953-997.

Bibtex

@article{1c2221df8dec4ba085cf168bdcc8fe59,
title = "lp-Norm Multiple Kernel Learning",
abstract = "Learning linear combinations of multiple kernels is an appealing strategy when the right choice of features is unknown. Previous approaches to multiple kernel learning (MKL) promote sparse kernel combinations to support interpretability and scalability. Unfortunately, this ℓ1-norm MKL is rarely observed to outperform trivial baselines in practical applications. To allow for robust kernel mixtures that generalize well, we extend MKL to arbitrary norms. We devise new insights on the connection between several existing MKL formulations and develop two efficient interleaved optimization strategies for arbitrary norms, that is, ℓp-norms with p ≥ 1. This interleaved optimization is much faster than the commonly used wrapper approaches, as demonstrated on several data sets. A theoretical analysis and an experiment on controlled artificial data shed light on the appropriateness of sparse, non-sparse and ℓ∞-norm MKL in various scenarios. Importantly, empirical applications of ℓp-norm MKL to three real-world problems from computational biology show that non-sparse MKL achieves accuracies that surpass the state-of-the-art.",
keywords = "Computer Science, Business Informatics",
author = "Marius Kloft and Ulf Brefeld and S{\"o}ren Sonnenburg and Alexander Zien",
year = "2011",
language = "English",
volume = "12",
pages = "953--997",
journal = "Journal of Machine Learning Research",
issn = "1532-4435",
publisher = "MIT Press",

}
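
Note on the method described in the abstract: the paper learns non-negative kernel weights θ constrained to the ℓp-ball jointly with an SVM on the weighted kernel mixture. A sketch of that primal, assuming a hinge-loss soft-margin SVM and M base feature maps φ_1, …, φ_M (the exact normalization and loss function follow the paper and are only summarized here):

\min_{\theta \ge 0,\ \|\theta\|_p \le 1}\ \ \min_{w,\,b,\,\xi \ge 0}\ \ \frac{1}{2}\sum_{m=1}^{M}\frac{\|w_m\|_2^2}{\theta_m} \;+\; C\sum_{i=1}^{n}\xi_i
\quad\text{s.t.}\quad y_i\Big(\sum_{m=1}^{M}\langle w_m,\phi_m(x_i)\rangle + b\Big) \ge 1-\xi_i,\quad i=1,\dots,n.

For fixed w, the optimal weights admit the analytic form θ_m ∝ ‖w_m‖_2^{2/(p+1)}, rescaled so that ‖θ‖_p = 1; p = 1 recovers sparse MKL, while larger p yields increasingly uniform, non-sparse kernel mixtures.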

RIS

TY - JOUR

T1 - lp-Norm Multiple Kernel Learning

AU - Kloft, Marius

AU - Brefeld, Ulf

AU - Sonnenburg, Sören

AU - Zien, Alexander

PY - 2011

Y1 - 2011

N2 - Learning linear combinations of multiple kernels is an appealing strategy when the right choice of features is unknown. Previous approaches to multiple kernel learning (MKL) promote sparse kernel combinations to support interpretability and scalability. Unfortunately, this ℓ1-norm MKL is rarely observed to outperform trivial baselines in practical applications. To allow for robust kernel mixtures that generalize well, we extend MKL to arbitrary norms. We devise new insights on the connection between several existing MKL formulations and develop two efficient interleaved optimization strategies for arbitrary norms, that is, ℓp-norms with p ≥ 1. This interleaved optimization is much faster than the commonly used wrapper approaches, as demonstrated on several data sets. A theoretical analysis and an experiment on controlled artificial data shed light on the appropriateness of sparse, non-sparse and ℓ∞-norm MKL in various scenarios. Importantly, empirical applications of ℓp-norm MKL to three real-world problems from computational biology show that non-sparse MKL achieves accuracies that surpass the state-of-the-art.

AB - Learning linear combinations of multiple kernels is an appealing strategy when the right choice of features is unknown. Previous approaches to multiple kernel learning (MKL) promote sparse kernel combinations to support interpretability and scalability. Unfortunately, this ℓ1-norm MKL is rarely observed to outperform trivial baselines in practical applications. To allow for robust kernel mixtures that generalize well, we extend MKL to arbitrary norms. We devise new insights on the connection between several existing MKL formulations and develop two efficient interleaved optimization strategies for arbitrary norms, that is, ℓp-norms with p ≥ 1. This interleaved optimization is much faster than the commonly used wrapper approaches, as demonstrated on several data sets. A theoretical analysis and an experiment on controlled artificial data shed light on the appropriateness of sparse, non-sparse and ℓ∞-norm MKL in various scenarios. Importantly, empirical applications of ℓp-norm MKL to three real-world problems from computational biology show that non-sparse MKL achieves accuracies that surpass the state-of-the-art.

KW - Computer Science

KW - Business Informatics

M3 - Journal article

VL - 12

SP - 953

EP - 997

JO - Journal of Machine Learning Research

JF - Journal of Machine Learning Research

SN - 1532-4435

ER -
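
The abstract notes that the paper's interleaved optimization is much faster than the commonly used wrapper approaches. For orientation, a wrapper-style baseline can be sketched in a few lines: alternate between training an ordinary SVM on the current kernel mixture and applying the analytic ℓp-norm weight update mentioned above. The sketch below is a minimal illustration only, not the interleaved solver from the paper; the helper name lp_mkl_wrapper, the choice of base kernels (RBF and degree-2 polynomial), and the values of C, p, and the iteration count are assumptions made for the example (numpy/scikit-learn).

import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel

def lp_mkl_wrapper(kernels, y, p=2.0, C=1.0, n_iter=20):
    # kernels: list of (n, n) PSD Gram matrices on the training data; y: labels in {-1, +1}
    M, n = len(kernels), len(y)
    theta = np.full(M, M ** (-1.0 / p))             # uniform start with ||theta||_p = 1
    for _ in range(n_iter):
        K = sum(t * Km for t, Km in zip(theta, kernels))
        svm = SVC(C=C, kernel="precomputed").fit(K, y)
        beta = np.zeros(n)                          # beta_i = alpha_i * y_i (zero off the support set)
        beta[svm.support_] = svm.dual_coef_.ravel()
        # block norms of the current solution: ||w_m||^2 = theta_m^2 * beta^T K_m beta
        wnorm2 = np.array([t ** 2 * beta @ Km @ beta for t, Km in zip(theta, kernels)])
        # analytic lp-norm step: theta_m proportional to ||w_m||^(2/(p+1)), rescaled to ||theta||_p = 1
        theta = wnorm2 ** (1.0 / (p + 1))
        theta /= (theta ** p).sum() ** (1.0 / p)
    # refit once on the final mixture and return weights and model
    K = sum(t * Km for t, Km in zip(theta, kernels))
    return theta, SVC(C=C, kernel="precomputed").fit(K, y)

# toy usage with two illustrative base kernels (all parameters arbitrary)
rng = np.random.RandomState(0)
X = rng.randn(60, 5)
y = np.sign(X[:, 0] + 0.5 * X[:, 1])
theta, model = lp_mkl_wrapper([rbf_kernel(X, gamma=0.5), polynomial_kernel(X, degree=2)], y, p=1.5)
print("kernel weights:", theta)

With p close to 1 the learned weights concentrate on few kernels; with larger p they stay closer to uniform, which mirrors the sparse versus non-sparse trade-off discussed in the abstract.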