Wasserstein Stationary Subspace Analysis

Stephan Kaltenstadler, Shinichi Nakajima, Klaus-Robert Müller, Wojciech Samek

Research output: Contribution to journal › Article

4 Citations (Scopus)

Abstract

Learning under nonstationarity can be achieved by decomposing the data into a stationary and a nonstationary subspace [stationary subspace analysis (SSA)]. While SSA has been used in various applications, its robustness and computational efficiency are limited by the difficulty of optimizing its Kullback-Leibler divergence based objective. In this paper, we extend SSA in two ways: we propose 1) analytically solvable SSA variants with higher numerical efficiency and 2) a more robust variant based on the Wasserstein-2 distance (Wasserstein SSA). We demonstrate the usefulness of our novel algorithms on toy data, illustrating their mathematical properties, and on real-world data, 1) enabling better segmentation of time series and 2) improving brain-computer interfacing, where the Wasserstein-based measure of nonstationarity is used for spatial filter regularization and gives rise to higher decoding performance.
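For Gaussian approximations of data epochs, the Wasserstein-2 distance has a closed form in terms of the epochs' means and covariances, which makes it a convenient measure of nonstationarity between epochs. The following Python snippet is a minimal sketch of that closed form only, not the authors' implementation; the function name, epoch shapes, and example data are illustrative assumptions.

```python
# Sketch: closed-form squared Wasserstein-2 distance between Gaussian fits
# of two data epochs, usable as a simple nonstationarity measure.
import numpy as np
from scipy.linalg import sqrtm


def gaussian_w2_squared(x1, x2):
    """Squared W2 distance between Gaussian fits of two epochs.

    x1, x2: arrays of shape (n_samples, n_channels).
    """
    mu1, mu2 = x1.mean(axis=0), x2.mean(axis=0)
    c1 = np.cov(x1, rowvar=False)
    c2 = np.cov(x2, rowvar=False)
    # Bures term: tr(C1 + C2 - 2 (C2^{1/2} C1 C2^{1/2})^{1/2})
    c2_half = sqrtm(c2)
    cross = np.real(sqrtm(c2_half @ c1 @ c2_half))
    bures = np.trace(c1 + c2 - 2.0 * cross)
    return float(np.sum((mu1 - mu2) ** 2) + bures)


# Example: two epochs with identical means but different covariances
rng = np.random.default_rng(0)
epoch_a = rng.multivariate_normal([0, 0], [[1.0, 0.2], [0.2, 1.0]], size=500)
epoch_b = rng.multivariate_normal([0, 0], [[2.0, 0.0], [0.0, 0.5]], size=500)
print(gaussian_w2_squared(epoch_a, epoch_b))
```

In contrast to the Kullback-Leibler divergence, this quantity stays well behaved when covariance estimates are near-singular, which is one motivation the abstract gives for the Wasserstein-based variant.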

Original language: English
Article number: 8481426
Pages (from-to): 1213-1223
Number of pages: 11
Journal: IEEE Journal on Selected Topics in Signal Processing
Volume: 12
Issue number: 6
DOIs: https://doi.org/10.1109/JSTSP.2018.2873987
Publication status: Published - Dec 2018

Keywords

  • Subspace learning
  • covariance metrics
  • divergence methods
  • optimal transport
  • stationary subspace analysis

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering


Cite this

Kaltenstadler, S., Nakajima, S., Müller, K.-R., & Samek, W. (2018). Wasserstein Stationary Subspace Analysis. IEEE Journal on Selected Topics in Signal Processing, 12(6), 1213-1223. [8481426]. https://doi.org/10.1109/JSTSP.2018.2873987