How are the centered kernel principal components relevant to regression task? - An exact analysis

Masahiro Yukawa, Klaus Müller, Yuto Ogino

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present an exact analytic expression for the contributions of the kernel principal components to the relevant information in a nonlinear regression problem. A related study was presented by Braun, Buhmann, and Müller in 2008, which gave an upper bound on the contributions for a general supervised learning problem, but with 'uncentered' kernel PCA. Our analysis clarifies that, as in the 'uncentered' kernel PCA case, the relevant information of a kernel regression under an explicit centering operation is contained in a finite number of leading kernel principal components, provided that the kernel matches the underlying nonlinear function so that the eigenvalues of the centered kernel matrix decay quickly. We compare the regression performance of the least-squares-based methods with the centered and uncentered kernel PCAs by simulations.
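To make the setting concrete, here is a minimal sketch (not the authors' code) of least-squares regression on the leading kernel principal components, with and without the explicit centering operation the abstract refers to. The Gaussian kernel, the width sigma=0.3, the sinc-like target, and the choice of 10 components are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # K[i, j] = exp(-||x_i - y_j||^2 / (2 * sigma^2))
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma**2))

def kpca_scores(K, n_components, center=True):
    """Scores of each sample on the leading kernel principal components."""
    n = K.shape[0]
    if center:
        # Double-center the Gram matrix: Kc = H K H with H = I - (1/n) 11^T
        H = np.eye(n) - np.ones((n, n)) / n
        K = H @ K @ H
    vals, vecs = np.linalg.eigh(K)              # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]  # keep the leading components
    vals, vecs = vals[idx], vecs[:, idx]
    # Score of sample i on component k is sqrt(lambda_k) * v_k[i]
    return vecs * np.sqrt(np.clip(vals, 0.0, None))

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sinc(3.0 * X[:, 0]) + 0.05 * rng.standard_normal(200)  # toy nonlinear target

K = gaussian_kernel(X, X, sigma=0.3)
for center in (True, False):
    # Least-squares fit on the leading components, plus an intercept column
    Z = np.column_stack([np.ones(len(y)), kpca_scores(K, 10, center=center)])
    coef, *_ = np.linalg.lstsq(Z, y, rcond=None)
    mse = np.mean((Z @ coef - y) ** 2)
    print(f"centered={center}: training MSE with 10 leading components = {mse:.4f}")
```

In the regime the analysis addresses, the kernel matches the target well, the eigenvalues of the centered kernel matrix decay quickly, and a handful of leading components already captures most of the fit in both variants.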

Original language: English
Title of host publication: 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2841-2845
Number of pages: 5
Volume: 2018-April
ISBN (Print): 9781538646588
DOI: 10.1109/ICASSP.2018.8462392
Publication status: Published - 2018 Sep 10
Event: 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018 - Calgary, Canada
Duration: 2018 Apr 15 - 2018 Apr 20

Other

Other: 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018
Country: Canada
City: Calgary
Period: 18/4/15 - 18/4/20

Keywords

  • Kernel PCA
  • Nonlinear regression
  • Reproducing kernel Hilbert space
  • Spectral decomposition

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

Yukawa, M., Müller, K., & Ogino, Y. (2018). How are the centered kernel principal components relevant to regression task? - An exact analysis. In 2018 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2018 - Proceedings (Vol. 2018-April, pp. 2841-2845). [8462392] Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/ICASSP.2018.8462392
