Feature selection for density level-sets

Publication: Contributions in edited volumes › Article in conference proceedings › Research › peer-reviewed

Standard

Feature selection for density level-sets. / Kloft, Marius; Nakajima, Shinichi; Brefeld, Ulf.

Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). Ed. / Wray Buntine; Marko Grobelnik; Dunja Mladenic; John Shawe-Taylor. Heidelberg: Springer, 2009. pp. 692-704 (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 5781 LNAI, No. PART 1).


Harvard

Kloft, M, Nakajima, S & Brefeld, U 2009, Feature selection for density level-sets. in W Buntine, M Grobelnik, D Mladenic & J Shawe-Taylor (eds), Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), no. PART 1, vol. 5781 LNAI, Springer, Heidelberg, pp. 692-704, European Conference on Machine Learning and Knowledge Discovery in Databases - 2009, Bled, Slovenia, 07.09.09. https://doi.org/10.1007/978-3-642-04180-8_62

APA

Kloft, M., Nakajima, S., & Brefeld, U. (2009). Feature selection for density level-sets. In W. Buntine, M. Grobelnik, D. Mladenic, & J. Shawe-Taylor (Eds.), Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (pp. 692-704). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 5781 LNAI, No. PART 1). Springer. https://doi.org/10.1007/978-3-642-04180-8_62

Vancouver

Kloft M, Nakajima S, Brefeld U. Feature selection for density level-sets. In: Buntine W, Grobelnik M, Mladenic D, Shawe-Taylor J, editors. Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). Heidelberg: Springer. 2009. p. 692-704. (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); PART 1). doi: 10.1007/978-3-642-04180-8_62

Bibtex

@inbook{fd7170bb236f40429ad4da94cf30074c,
title = "Feature selection for density level-sets",
abstract = "A frequent problem in density level-set estimation is the choice of the right features that give rise to compact and concise representations of the observed data. We present an efficient feature selection method for density level-set estimation where optimal kernel mixing coefficients and model parameters are determined simultaneously. Our approach generalizes one-class support vector machines and can be equivalently expressed as a semi-infinite linear program that can be solved with interleaved cutting plane algorithms. The experimental evaluation of the new method on network intrusion detection and object recognition tasks demonstrate that our approach not only attains competitive performance but also spares practitioners from a priori decisions on feature sets to be used.",
keywords = "Informatics, Concise representations, Cutting plane algorithms, Density levels, Efficient feature selections, Experimental evaluation, Feature selection, Feature sets, Linear programs, Mixing coefficient, Model parameters, Network intrusion detection, Observed data, One-class support vector machine, Semi-infinite, Business informatics",
author = "Marius Kloft and Shinichi Nakajima and Ulf Brefeld",
year = "2009",
doi = "10.1007/978-3-642-04180-8_62",
language = "English",
isbn = "978-3-642-04179-2",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
publisher = "Springer",
number = "PART 1",
pages = "692--704",
editor = "Wray Buntine and Marko Grobelnik and Dunja Mladenic and John Shawe-Taylor",
booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
address = "Germany",
note = "European Conference on Machine Learning and Knowledge Discovery in Databases - 2009, ECML-PKDD ; Conference date: 07-09-2009 Through 11-09-2009",
url = "https://www.k4all.org/event/european-conference-on-machine-learning-and-principles-and-practice-of-knowledge-discovery-in-databases/",

}
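
The abstract above describes jointly optimizing kernel mixing coefficients and one-class SVM parameters for density level-set estimation via a semi-infinite linear program with interleaved cutting planes. The snippet below is only a minimal, hypothetical sketch of the ingredients (several base kernels, convex mixing weights, a one-class SVM on the combined Gram matrix) using scikit-learn; it does not reproduce the authors' algorithm, and the kernel widths, the mixing weights beta, and nu are arbitrary placeholder values.

# Illustrative sketch only, not the paper's cutting-plane / SILP algorithm:
# the mixing weights `beta` are fixed by hand here, whereas the paper learns
# them jointly with the one-class SVM. All parameter values are placeholders.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                 # toy "normal" training data

# One base kernel per candidate feature representation (here: RBF widths).
gammas = [0.1, 1.0, 10.0]
base_kernels = [rbf_kernel(X, X, gamma=g) for g in gammas]

# Fixed mixing coefficients on the simplex (the paper optimizes these jointly).
beta = np.array([0.5, 0.3, 0.2])
K = sum(b * Kb for b, Kb in zip(beta, base_kernels))

# One-class SVM on the precomputed combined kernel; nu bounds the outlier fraction.
ocsvm = OneClassSVM(kernel="precomputed", nu=0.1).fit(K)
scores = ocsvm.decision_function(K)           # larger values = more "normal"
print(scores[:5])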

RIS

TY - CHAP
T1 - Feature selection for density level-sets
AU - Kloft, Marius
AU - Nakajima, Shinichi
AU - Brefeld, Ulf
PY - 2009
Y1 - 2009
N2 - A frequent problem in density level-set estimation is the choice of the right features that give rise to compact and concise representations of the observed data. We present an efficient feature selection method for density level-set estimation where optimal kernel mixing coefficients and model parameters are determined simultaneously. Our approach generalizes one-class support vector machines and can be equivalently expressed as a semi-infinite linear program that can be solved with interleaved cutting plane algorithms. The experimental evaluation of the new method on network intrusion detection and object recognition tasks demonstrates that our approach not only attains competitive performance but also spares practitioners from a priori decisions on feature sets to be used.
AB - A frequent problem in density level-set estimation is the choice of the right features that give rise to compact and concise representations of the observed data. We present an efficient feature selection method for density level-set estimation where optimal kernel mixing coefficients and model parameters are determined simultaneously. Our approach generalizes one-class support vector machines and can be equivalently expressed as a semi-infinite linear program that can be solved with interleaved cutting plane algorithms. The experimental evaluation of the new method on network intrusion detection and object recognition tasks demonstrates that our approach not only attains competitive performance but also spares practitioners from a priori decisions on feature sets to be used.
KW - Informatics
KW - Concise representations
KW - Cutting plane algorithms
KW - Density levels
KW - Efficient feature selections
KW - Experimental evaluation
KW - Feature selection
KW - Feature sets
KW - Linear programs
KW - Mixing coefficient
KW - Model parameters
KW - Network intrusion detection
KW - Observed data
KW - One-class support vector machine
KW - Semi-infinite
KW - Business informatics
UR - http://www.scopus.com/inward/record.url?scp=70350633038&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-04180-8_62
DO - 10.1007/978-3-642-04180-8_62
M3 - Article in conference proceedings
AN - SCOPUS:70350633038
SN - 978-3-642-04179-2
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 692
EP - 704
BT - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
A2 - Buntine, Wray
A2 - Grobelnik, Marko
A2 - Mladenic, Dunja
A2 - Shawe-Taylor, John
PB - Springer
CY - Heidelberg
T2 - European Conference on Machine Learning and Knowledge Discovery in Databases - 2009
Y2 - 7 September 2009 through 11 September 2009
ER -
