Covariate shift adaptation by importance weighted cross validation

Masashi Sugiyama, Matthias Krauledat, Klaus-Robert Müller

Research output: Contribution to journal › Article › peer-review

472 Citations (Scopus)

Abstract

A common assumption in supervised learning is that the input points in the training set follow the same probability distribution as the input points that will be given in the future test phase. However, this assumption is not satisfied, for example, when we extrapolate outside of the training region. The situation where the training input points and test input points follow different distributions, while the conditional distribution of output values given input points is unchanged, is called the covariate shift. Under covariate shift, standard model selection techniques such as cross validation do not work as desired since their unbiasedness is no longer maintained. In this paper, we propose a new method called importance weighted cross validation (IWCV), for which we prove unbiasedness even under covariate shift. The IWCV procedure is the only one that can be applied for unbiased classification under covariate shift, whereas alternatives to IWCV exist for regression. The usefulness of the proposed method is illustrated by simulations, and further demonstrated in a brain-computer interface, where strong non-stationarity effects can be seen between training and test sessions.
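
The following is a minimal sketch of the IWCV idea described in the abstract: each held-out loss in k-fold cross validation is multiplied by the importance weight w(x) = p_test(x) / p_train(x) before averaging, which restores unbiasedness of the generalization-error estimate under covariate shift. The scikit-learn-style fit/predict model interface, the squared loss, and the assumption that the weights are already known (in practice they must be estimated from training and unlabeled test inputs) are illustrative choices, not details taken from the paper.

```python
import numpy as np

def iwcv_score(model, X, y, weights, n_folds=5, rng=None):
    """k-fold importance weighted cross validation (IWCV) score.

    X, y      : training inputs and outputs
    weights   : importance weights w(x_i) = p_test(x_i) / p_train(x_i)
    Returns an estimate of the generalization error under the test
    input distribution.
    """
    rng = np.random.default_rng(rng)
    indices = rng.permutation(len(X))
    folds = np.array_split(indices, n_folds)
    fold_scores = []
    for held_out in folds:
        train = np.setdiff1d(indices, held_out)
        model.fit(X[train], y[train])
        pred = model.predict(X[held_out])
        losses = (pred - y[held_out]) ** 2   # squared loss; any loss can be used
        # Weighting each held-out loss by w(x) is the key difference
        # from ordinary cross validation.
        fold_scores.append(np.mean(weights[held_out] * losses))
    return float(np.mean(fold_scores))
```

For model selection one would compute this score for each candidate model or hyperparameter setting and pick the one with the lowest IWCV estimate; with all weights set to 1 the procedure reduces to ordinary cross validation.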

Original language: English
Pages (from-to): 985-1005
Number of pages: 21
Journal: Journal of Machine Learning Research
Volume: 8
Publication status: Published - May 2007

Keywords

  • Brain-computer interface
  • Covariate shift
  • Cross validation
  • Extrapolation
  • Importance sampling

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
