Wasserstein Stationary Subspace Analysis

Stephan Kaltenstadler, Shinichi Nakajima, Klaus-Robert Müller, Wojciech Samek

Research output: Contribution to journal › Article

2 Citations (Scopus)


Learning under non-stationarity can be achieved by decomposing the data into a stationary subspace and a non-stationary one, a technique known as stationary subspace analysis (SSA). While SSA has been applied in various domains, its robustness and computational efficiency are limited by the difficulty of optimizing its Kullback-Leibler-divergence-based objective. In this paper we extend SSA in two ways: we propose SSA variants with (a) higher numerical efficiency, achieved by defining analytical SSA solutions, and (b) higher robustness, achieved by utilizing the Wasserstein-2 distance (Wasserstein SSA). We demonstrate the usefulness of our novel algorithms on toy data, illustrating their mathematical properties, and on real-world data, where they (1) allow better segmentation of time series and (2) improve brain-computer interfacing, in which the Wasserstein-based measure of non-stationarity is used for spatial filter regularization and yields higher decoding performance.
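For Gaussian distributions, the Wasserstein-2 distance underlying the proposed measure of non-stationarity admits a well-known closed form. The sketch below (not the authors' implementation; function name and usage are illustrative) computes the squared Wasserstein-2 distance between two Gaussians from their means and covariance matrices:

```python
import numpy as np
from scipy.linalg import sqrtm

def gaussian_w2_squared(mu1, cov1, mu2, cov2):
    """Squared Wasserstein-2 distance between Gaussians N(mu1, cov1), N(mu2, cov2).

    Closed form: ||mu1 - mu2||^2 + Tr(cov1 + cov2 - 2 (cov2^{1/2} cov1 cov2^{1/2})^{1/2}).
    """
    mean_term = np.sum((mu1 - mu2) ** 2)
    s2_half = sqrtm(cov2)
    # sqrtm may return a complex array with tiny imaginary parts; keep the real part.
    cross = np.real(sqrtm(s2_half @ cov1 @ s2_half))
    cov_term = np.trace(cov1 + cov2 - 2 * cross)
    return mean_term + cov_term

# Identical Gaussians have zero distance; Gaussians differing only in mean
# reduce to the squared Euclidean distance between the means.
d0 = gaussian_w2_squared(np.zeros(2), np.eye(2), np.zeros(2), np.eye(2))
d1 = gaussian_w2_squared(np.zeros(2), np.eye(2), np.array([3.0, 4.0]), np.eye(2))
```

Because this expression involves only matrix square roots and traces, it avoids the density-ratio terms that make the Kullback-Leibler-based SSA objective harder to optimize, which is the robustness argument the abstract alludes to.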

Original language: English
Journal: IEEE Journal on Selected Topics in Signal Processing
Publication status: Accepted/In press - 2018 Jan 1

Keywords
  • Covariance matrices
  • covariance metrics
  • divergence methods
  • Electroencephalography
  • Gaussian distribution
  • optimal transport
  • Robustness
  • Signal processing algorithms
  • stationary subspace analysis
  • Subspace learning
  • Time series analysis

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering
