The subspace information criterion for infinite dimensional hypothesis spaces

Masashi Sugiyama, Klaus-Robert Müller

Research output: Contribution to journal › Article

14 Citations (Scopus)

Abstract

This paper extends the range of applicability of the subspace information criterion (SIC). It is shown that, even if the reproducing kernels centered on the training sample points do not span the whole space, SIC remains an unbiased estimator of an essential part of the generalization error. This extension allows the use of any reproducing kernel Hilbert space (RKHS), including infinite-dimensional ones.
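The following is a minimal sketch of how SIC can be used for model selection in kernel ridge regression, one of the settings the paper considers. It assumes a Gaussian kernel, a known noise variance, and measures the error of the ridge estimator against the unbiased (pseudo-inverse) fit within the span of the kernels centered on the training points; the exact norm, estimator, and derivation in the paper may differ, and all function and variable names here are illustrative.

```python
import numpy as np

def gaussian_kernel(X, Xp, width=1.0):
    """Gram matrix of a Gaussian RBF kernel (illustrative choice of kernel)."""
    d2 = (X[:, None] - Xp[None, :]) ** 2
    return np.exp(-d2 / (2 * width ** 2))

def sic_kernel_ridge(K, y, lam, noise_var):
    """Sketch of SIC for a kernel ridge estimator (assumed setting).

    The ridge estimator is linear in y with learning matrix
    L = (K + lam I)^{-1}; the reference estimator, unbiased within the
    span of the kernels, uses the pseudo-inverse L_u = K^+.  For a
    function sum_i alpha_i k(., x_i), the squared RKHS norm is
    alpha^T K alpha, which is used to compare the two fits.
    """
    n = len(y)
    L = np.linalg.solve(K + lam * np.eye(n), np.eye(n))   # ridge learning matrix
    L_u = np.linalg.pinv(K)                               # reference (projection) matrix
    alpha, alpha_u = L @ y, L_u @ y
    diff = alpha - alpha_u
    D = L - L_u
    # Unbiased estimate of the squared bias, corrected for the noise it absorbs.
    bias_sq = diff @ K @ diff - noise_var * np.trace(K @ D @ D.T)
    # Variance of the ridge estimator under the same norm.
    variance = noise_var * np.trace(K @ L @ L.T)
    return bias_sq + variance

# Usage: pick the ridge parameter minimizing SIC on synthetic data.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=50)
y = np.sinc(X) + 0.1 * rng.standard_normal(50)
K = gaussian_kernel(X, X)
lams = np.logspace(-4, 1, 20)
scores = [sic_kernel_ridge(K, y, lam, noise_var=0.01) for lam in lams]
print("lambda chosen by SIC:", lams[int(np.argmin(scores))])
```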

Original language: English
Pages (from-to): 323-359
Number of pages: 37
Journal: Journal of Machine Learning Research
Volume: 3
Issue number: 2
DOIs
Publication status: Published - 2003 Feb 15
Externally published: Yes

Keywords

  • Cross-validation
  • Finite sample statistics
  • Gaussian processes
  • Generalization error
  • Kernel regression
  • Model selection
  • Reproducing kernel Hilbert space
  • Subspace information criterion
  • Unbiased estimators

ASJC Scopus subject areas

  • Artificial Intelligence
