Importance-weighted cross-validation for covariate shift

Masashi Sugiyama, Benjamin Blankertz, Matthias Krauledat, Guido Dornhege, Klaus Muller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Citations (Scopus)

Abstract

A common assumption in supervised learning is that the input points in the training set follow the same probability distribution as the input points used for testing. However, this assumption is not satisfied, for example, when extrapolating outside the training region. The situation where the training input points and test input points follow different distributions is called covariate shift. Under covariate shift, standard machine learning techniques such as empirical risk minimization or cross-validation do not work well since their unbiasedness is no longer maintained. In this paper, we propose a new method called importance-weighted cross-validation, which is still unbiased even under covariate shift. The usefulness of our proposed method is successfully tested on toy data and furthermore demonstrated in the brain-computer interface, where strong non-stationarity effects can be seen between calibration and feedback sessions.
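The abstract's core idea lends itself to a short sketch: run ordinary k-fold cross-validation, but multiply each held-out loss by the importance ratio w(x) = p_test(x) / p_train(x), which restores unbiasedness of the error estimate under covariate shift. This is a minimal illustration, not the paper's code: the importance weights are assumed known (here computed from two chosen Gaussian densities), and all function and variable names are illustrative.

```python
import numpy as np

def iwcv_error(model_fit, model_predict, X, y, weights, n_folds=5, seed=0):
    """k-fold importance-weighted cross-validation.

    Each held-out squared loss is multiplied by the importance weight
    w(x) = p_test(x) / p_train(x), so the average estimates the test-
    distribution error even though the data were drawn from p_train.
    """
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(X)), n_folds)
    losses = []
    for k in range(n_folds):
        test_idx = folds[k]
        train_idx = np.concatenate([folds[j] for j in range(n_folds) if j != k])
        params = model_fit(X[train_idx], y[train_idx])
        pred = model_predict(params, X[test_idx])
        # importance-weighted squared loss on the held-out fold
        losses.append(np.mean(weights[test_idx] * (pred - y[test_idx]) ** 2))
    return float(np.mean(losses))

# Toy illustration: a linear model fit by least squares.
def fit(X, y):
    A = np.c_[X, np.ones(len(X))]
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict(coef, X):
    return np.c_[X, np.ones(len(X))] @ coef

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 1))            # training inputs ~ p_train = N(0, 1)
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)

def gauss(x, mu, s):
    return np.exp(-0.5 * ((x - mu) / s) ** 2) / (s * np.sqrt(2 * np.pi))

# Assume test inputs ~ N(1, 0.5^2); the importance ratio is then known in closed form.
w = gauss(X[:, 0], 1.0, 0.5) / gauss(X[:, 0], 0.0, 1.0)

err = iwcv_error(fit, predict, X, y, w)
```

In practice the densities are unknown and the weights must themselves be estimated, which is where much of the practical difficulty of covariate-shift adaptation lies.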

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Pages: 354-363
Number of pages: 10
Volume: 4174 LNCS
Publication status: Published - 2006 Oct 30
Externally published: Yes
Event: 28th Symposium of the German Association for Pattern Recognition, DAGM 2006 - Berlin, Germany
Duration: 2006 Sep 12 → 2006 Sep 14

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 4174 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Other

Other: 28th Symposium of the German Association for Pattern Recognition, DAGM 2006
Country: Germany
City: Berlin
Period: 06/9/12 → 06/9/14

Fingerprint

Brain-computer interface
Supervised learning
Cross-validation
Probability distributions
Learning systems
Covariates
Calibration
Feedback
Testing
Machine learning
Unbiasedness
Non-stationarity
Training

ASJC Scopus subject areas

  • Computer Science (all)
  • Biochemistry, Genetics and Molecular Biology (all)
  • Theoretical Computer Science

Cite this

Sugiyama, M., Blankertz, B., Krauledat, M., Dornhege, G., & Muller, K. (2006). Importance-weighted cross-validation for covariate shift. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4174 LNCS, pp. 354-363). (Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics); Vol. 4174 LNCS).

@inproceedings{29b80d299575417b82ab01b0ef9cf4b8,
title = "Importance-weighted cross-validation for covariate shift",
abstract = "A common assumption in supervised learning is that the input points in the training set follow the same probability distribution as the input points used for testing. However, this assumption is not satisfied, for example, when extrapolating outside the training region. The situation where the training input points and test input points follow different distributions is called covariate shift. Under covariate shift, standard machine learning techniques such as empirical risk minimization or cross-validation do not work well since their unbiasedness is no longer maintained. In this paper, we propose a new method called importance-weighted cross-validation, which is still unbiased even under covariate shift. The usefulness of our proposed method is successfully tested on toy data and furthermore demonstrated in the brain-computer interface, where strong non-stationarity effects can be seen between calibration and feedback sessions.",
author = "Masashi Sugiyama and Benjamin Blankertz and Matthias Krauledat and Guido Dornhege and Klaus Muller",
year = "2006",
month = "10",
day = "30",
language = "English",
isbn = "3540444122",
volume = "4174 LNCS",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
pages = "354--363",
booktitle = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",

}

TY - GEN

T1 - Importance-weighted cross-validation for covariate shift

AU - Sugiyama, Masashi

AU - Blankertz, Benjamin

AU - Krauledat, Matthias

AU - Dornhege, Guido

AU - Muller, Klaus

PY - 2006/10/30

Y1 - 2006/10/30


UR - http://www.scopus.com/inward/record.url?scp=33750229534&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=33750229534&partnerID=8YFLogxK

M3 - Conference contribution

SN - 3540444122

SN - 9783540444121

VL - 4174 LNCS

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 354

EP - 363

BT - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

ER -