Selecting ridge parameters in infinite dimensional hypothesis spaces

Masashi Sugiyama, Klaus Muller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Previously, an unbiased estimator of the generalization error, called the subspace information criterion (SIC), was proposed for finite dimensional reproducing kernel Hilbert spaces (RKHSs). In this paper, we extend SIC so that it can be applied to any RKHS, including infinite dimensional ones. Computer simulations show that the extended SIC works well for ridge parameter selection.
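As an illustration of the problem setting, the sketch below performs kernel ridge regression in a Gaussian-kernel RKHS (an infinite dimensional hypothesis space) and selects the ridge parameter by minimizing a hold-out estimate of the generalization error. The hold-out criterion is a stand-in for illustration only; the paper's SIC estimator is not reproduced here, and all function names and parameter values are hypothetical.

```python
import numpy as np

def gaussian_kernel(X1, X2, width=1.0):
    # Gram matrix of a Gaussian RBF kernel, whose RKHS is infinite dimensional.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * width ** 2))

def kernel_ridge_fit(K, y, lam):
    # Kernel ridge regression coefficients: alpha = (K + lam * n * I)^{-1} y
    n = K.shape[0]
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def select_ridge_parameter(X_tr, y_tr, X_val, y_val, candidates):
    # Pick the ridge parameter minimizing validation mean squared error
    # (a hold-out stand-in for an unbiased generalization-error estimator such as SIC).
    K_tr = gaussian_kernel(X_tr, X_tr)
    K_val = gaussian_kernel(X_val, X_tr)
    best_lam, best_err = None, np.inf
    for lam in candidates:
        alpha = kernel_ridge_fit(K_tr, y_tr, lam)
        err = np.mean((K_val @ alpha - y_val) ** 2)
        if err < best_err:
            best_lam, best_err = lam, err
    return best_lam, best_err

# Toy regression problem: noisy sine observations.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
candidates = [1e-4, 1e-3, 1e-2, 1e-1, 1.0]
lam, err = select_ridge_parameter(X[:60], y[:60], X[60:], y[60:], candidates)
```

Any criterion that estimates the generalization error can replace the hold-out loop; the paper's contribution is making such an estimator (SIC) applicable beyond finite dimensional RKHSs.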

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Publisher: Springer Verlag
Pages: 528-534
Number of pages: 7
Volume: 2415 LNCS
ISBN (Print): 9783540440741
Publication status: Published - 2002 Jan 1
Externally published: Yes
Event: 2002 International Conference on Artificial Neural Networks, ICANN 2002 - Madrid, Spain
Duration: 2002 Aug 28 – 2002 Aug 30

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 2415 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 2002 International Conference on Artificial Neural Networks, ICANN 2002
Country: Spain
City: Madrid
Period: 02/8/28 – 02/8/30

ASJC Scopus subject areas

  • Computer Science(all)
  • Theoretical Computer Science

Cite this

Sugiyama, M., & Muller, K. (2002). Selecting ridge parameters in infinite dimensional hypothesis spaces. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 2415 LNCS, pp. 528-534). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 2415 LNCS). Springer Verlag.