Analyzing local structure in kernel-based learning: Explanation, complexity, and reliability assessment

Gregoire Montavon, Mikio L. Braun, Tammo Krueger, Klaus Muller

Research output: Contribution to journal › Article

31 Citations (Scopus)

Abstract

Over the last decade, nonlinear kernel-based learning methods have been widely used in the sciences and in industry for solving, e.g., classification, regression, and ranking problems. While their users are more than happy with the performance of this powerful technology, there is an emerging need to additionally gain a better understanding of both the learning machine and the data analysis problem to be solved. Opening the nonlinear black box, however, is a notoriously difficult challenge. In this review, we report on a set of recent methods that can be universally used to make kernel methods more transparent. In particular, we discuss relevant dimension estimation (RDE), which allows one to assess the underlying complexity and noise structure of a learning problem and thus to distinguish high- or low-noise scenarios of high or low complexity, respectively. Moreover, we introduce a novel local technique based on RDE for quantifying the reliability of the learned predictions. Finally, we report on techniques that can explain individual nonlinear predictions. In this manner, our novel methods not only help to gain further knowledge about the nonlinear signal processing problem itself, but they also broaden the general usefulness of kernel methods in practical signal processing applications.
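As an illustration of the RDE idea summarized above, the following Python sketch estimates the relevant dimension of a learning problem by projecting the labels onto the kernel PCA basis and fitting a two-component (signal/noise) Gaussian model to the resulting coefficients. This is a minimal sketch under those assumptions, not the authors' reference implementation; the function names (rbf_kernel, relevant_dimension) and the choice of an RBF kernel are illustrative.

import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix for data X of shape (n_samples, n_features).
    sq = np.sum(X ** 2, axis=1)
    dist2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * dist2)

def relevant_dimension(K, y):
    # Estimate the relevant dimension: project the labels y onto the kernel PCA
    # basis of K and pick the split d that best separates a high-variance
    # "signal" component from a low-variance "noise" component.
    n = len(y)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    _, U = np.linalg.eigh(H @ K @ H)             # eigenvectors, ascending eigenvalues
    U = U[:, ::-1]                               # leading kernel PCA directions first
    s = (U.T @ y) ** 2                           # squared label coefficients per component
    best_d, best_nll = 1, np.inf
    for d in range(1, n):
        var_signal = s[:d].mean()                # variance of the first d coefficients
        var_noise = s[d:].mean()                 # variance of the remaining coefficients
        if var_signal <= 0.0 or var_noise <= 0.0:
            continue
        # Negative log-likelihood (up to constants) of the two-component model.
        nll = d * np.log(var_signal) + (n - d) * np.log(var_noise)
        if nll < best_nll:
            best_d, best_nll = d, nll
    return best_d

On a toy problem one would build K = rbf_kernel(X, gamma), call relevant_dimension(K, y), and compare the estimate across kernel widths to judge whether the problem behaves like a low-complexity/high-noise or a high-complexity/low-noise scenario.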

Original language: English
Article number: 6530740
Pages (from-to): 62-74
Number of pages: 13
Journal: IEEE Signal Processing Magazine
Volume: 30
Issue number: 4
DOIs: 10.1109/MSP.2013.2249294
Publication status: Published - 2013
Externally published: Yes

Fingerprint

  • Reliability Assessment
  • Local Structure
  • Signal Processing
  • Kernel
  • Kernel Methods
  • Learning Systems
  • Nonlinear Prediction
  • Black Box
  • Low Complexity
  • Industry
  • Data Analysis
  • Ranking
  • Machine Learning
  • Regression
  • Scenarios
  • Learning
  • Prediction

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
  • Applied Mathematics

Cite this

Analyzing local structure in kernel-based learning: Explanation, complexity, and reliability assessment. / Montavon, Gregoire; Braun, Mikio L.; Krueger, Tammo; Muller, Klaus.

In: IEEE Signal Processing Magazine, Vol. 30, No. 4, 6530740, 2013, p. 62-74.

Research output: Contribution to journal › Article

@article{f7277dee73da438ebf251eb7fc294651,
title = "Analyzing local structure in kernel-based learning: Explanation, complexity, and reliability assessment",
abstract = "Over the last decade, nonlinear kernel-based learning methods have been widely used in the sciences and in industry for solving, e.g., classification, regression, and ranking problems. While their users are more than happy with the performance of this powerful technology, there is an emerging need to additionally gain better understanding of both the learning machine and the data analysis problem to be solved. Opening the nonlinear black box, however, is a notoriously difficult challenge. In this review, we report on a set of recent methods that can be universally used to make kernel methods more transparent. In particular, we discuss relevant dimension estimation (RDE) that allows to assess the underlying complexity and noise structure of a learning problem and thus to distinguish high/low noise scenarios of high/low complexity respectively. Moreover, we introduce a novel local technique based on RDE for quantifying the reliability of the learned predictions. Finally, we report on techniques that can explain the individual nonlinear prediction. In this manner, our novel methods not only help to gain further knowledge about the nonlinear signal processing problem itself, but they broaden the general usefulness of kernel methods in practical signal processing applications.",
author = "Montavon, Gregoire and Braun, {Mikio L.} and Krueger, Tammo and Muller, Klaus",
year = "2013",
doi = "10.1109/MSP.2013.2249294",
language = "English",
volume = "30",
pages = "62--74",
journal = "IEEE Signal Processing Magazine",
issn = "1053-5888",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "4",

}

TY - JOUR

T1 - Analyzing local structure in kernel-based learning

T2 - Explanation, complexity, and reliability assessment

AU - Montavon, Gregoire

AU - Braun, Mikio L.

AU - Krueger, Tammo

AU - Muller, Klaus

PY - 2013

Y1 - 2013

N2 - Over the last decade, nonlinear kernel-based learning methods have been widely used in the sciences and in industry for solving, e.g., classification, regression, and ranking problems. While their users are more than happy with the performance of this powerful technology, there is an emerging need to additionally gain better understanding of both the learning machine and the data analysis problem to be solved. Opening the nonlinear black box, however, is a notoriously difficult challenge. In this review, we report on a set of recent methods that can be universally used to make kernel methods more transparent. In particular, we discuss relevant dimension estimation (RDE) that allows to assess the underlying complexity and noise structure of a learning problem and thus to distinguish high/low noise scenarios of high/low complexity respectively. Moreover, we introduce a novel local technique based on RDE for quantifying the reliability of the learned predictions. Finally, we report on techniques that can explain the individual nonlinear prediction. In this manner, our novel methods not only help to gain further knowledge about the nonlinear signal processing problem itself, but they broaden the general usefulness of kernel methods in practical signal processing applications.

AB - Over the last decade, nonlinear kernel-based learning methods have been widely used in the sciences and in industry for solving, e.g., classification, regression, and ranking problems. While their users are more than happy with the performance of this powerful technology, there is an emerging need to additionally gain better understanding of both the learning machine and the data analysis problem to be solved. Opening the nonlinear black box, however, is a notoriously difficult challenge. In this review, we report on a set of recent methods that can be universally used to make kernel methods more transparent. In particular, we discuss relevant dimension estimation (RDE) that allows to assess the underlying complexity and noise structure of a learning problem and thus to distinguish high/low noise scenarios of high/low complexity respectively. Moreover, we introduce a novel local technique based on RDE for quantifying the reliability of the learned predictions. Finally, we report on techniques that can explain the individual nonlinear prediction. In this manner, our novel methods not only help to gain further knowledge about the nonlinear signal processing problem itself, but they broaden the general usefulness of kernel methods in practical signal processing applications.

UR - http://www.scopus.com/inward/record.url?scp=85032751251&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85032751251&partnerID=8YFLogxK

U2 - 10.1109/MSP.2013.2249294

DO - 10.1109/MSP.2013.2249294

M3 - Article

AN - SCOPUS:85032751251

VL - 30

SP - 62

EP - 74

JO - IEEE Signal Processing Magazine

JF - IEEE Signal Processing Magazine

SN - 1053-5888

IS - 4

M1 - 6530740

ER -