Wasserstein Stationary Subspace Analysis

Stephan Kaltenstadler, Shinichi Nakajima, Klaus-Robert Müller, Wojciech Samek

Research output: Contribution to journal › Article › peer-review

8 Citations (Scopus)


Learning under nonstationarity can be achieved by decomposing the data into a stationary subspace and a nonstationary one [stationary subspace analysis (SSA)]. While SSA has been used in various applications, its robustness and computational efficiency are limited by the difficulty of optimizing its Kullback-Leibler divergence-based objective. In this paper, we extend SSA in two ways: we propose SSA variants with 1) higher numerical efficiency, by deriving analytical SSA variants, and 2) higher robustness, by utilizing the Wasserstein-2 distance (Wasserstein SSA). We demonstrate the usefulness of our novel algorithms on toy data, illustrating their mathematical properties, and on real-world data, 1) enabling better segmentation of time series and 2) in brain-computer interfacing, where the Wasserstein-based measure of nonstationarity is used for spatial-filter regularization and gives rise to higher decoding performance.
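For Gaussian distributions, the Wasserstein-2 distance used above as a measure of nonstationarity has a closed form in the means and covariances: W2²(N(μ₁, Σ₁), N(μ₂, Σ₂)) = ‖μ₁ − μ₂‖² + tr(Σ₁ + Σ₂ − 2(Σ₂^{1/2} Σ₁ Σ₂^{1/2})^{1/2}). The sketch below illustrates this formula only; it is not the authors' implementation, and the helper name `gaussian_w2` is hypothetical:

```python
# Hedged sketch: closed-form squared Wasserstein-2 distance between two
# Gaussians, the distance underlying the Wasserstein SSA objective.
# This is a generic illustration, not the paper's implementation.
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2(mu1, cov1, mu2, cov2):
    """Squared W2 distance between N(mu1, cov1) and N(mu2, cov2):
    ||mu1 - mu2||^2 + tr(cov1 + cov2 - 2 (cov2^{1/2} cov1 cov2^{1/2})^{1/2})."""
    sqrt_cov2 = sqrtm(cov2)
    # Cross term; take the real part since sqrtm may return a tiny
    # imaginary component due to floating-point error.
    cross = np.real(sqrtm(sqrt_cov2 @ cov1 @ sqrt_cov2))
    return float(np.sum((mu1 - mu2) ** 2)
                 + np.trace(cov1 + cov2 - 2.0 * cross))

# For isotropic covariances a*I and b*I in d dimensions the formula
# reduces to d * (sqrt(a) - sqrt(b))^2, e.g. d=2, a=4, b=1 gives 2.0.
d2 = gaussian_w2(np.zeros(2), 4.0 * np.eye(2), np.zeros(2), np.eye(2))
print(round(d2, 6))  # → 2.0
```

Unlike the Kullback-Leibler divergence, this distance stays finite and well-behaved for degenerate or poorly conditioned covariances, which is the robustness property the abstract refers to.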

Original language: English
Article number: 8481426
Pages (from-to): 1213-1223
Number of pages: 11
Journal: IEEE Journal on Selected Topics in Signal Processing
Issue number: 6
Publication status: Published - 2018 Dec


Keywords

  • Subspace learning
  • covariance metrics
  • divergence methods
  • optimal transport
  • stationary subspace analysis

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering


