Wasserstein Stationary Subspace Analysis

Stephan Kaltenstadler, Shinichi Nakajima, Klaus Müller, Wojciech Samek

Research output: Contribution to journal › Article

2 Citations (Scopus)

Abstract

Learning under non-stationarity can be achieved by decomposing the data into a stationary subspace and a non-stationary one (stationary subspace analysis, SSA). While SSA has been used in various applications, its robustness and computational efficiency are limited due to the difficulty of optimizing the Kullback-Leibler-divergence-based objective. In this paper we extend SSA twofold: we propose SSA with (a) higher numerical efficiency, by defining analytical SSA variants, and (b) higher robustness, by utilizing the Wasserstein-2 distance (Wasserstein SSA). We show the usefulness of our novel algorithms on toy data, demonstrating their mathematical properties, and on real-world data, where they (1) allow better segmentation of time series and (2) improve brain-computer interfacing, in which the Wasserstein-based measure of non-stationarity is used for spatial filter regularization and gives rise to higher decoding performance.
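For Gaussian distributions, the Wasserstein-2 distance the abstract refers to has a well-known closed form: W2² = ‖m1 − m2‖² + tr(S1 + S2 − 2(S2^{1/2} S1 S2^{1/2})^{1/2}). The following NumPy sketch illustrates that formula only; it is not the authors' implementation, and the function names (`gaussian_w2`, `_psd_sqrt`) are illustrative.

```python
import numpy as np

def _psd_sqrt(S):
    """Symmetric square root of a positive semi-definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.T

def gaussian_w2(m1, S1, m2, S2):
    """Closed-form Wasserstein-2 distance between N(m1, S1) and N(m2, S2)."""
    B = _psd_sqrt(S2)
    cross = _psd_sqrt(B @ S1 @ B)  # (S2^{1/2} S1 S2^{1/2})^{1/2}
    w2_sq = np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross)
    # clip tiny negative values caused by floating-point round-off
    return np.sqrt(max(float(w2_sq), 0.0))
```

With equal covariances the covariance terms cancel and the distance reduces to the Euclidean distance between the means, e.g. `gaussian_w2(np.zeros(2), np.eye(2), np.array([3.0, 4.0]), np.eye(2))` gives 5.0.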

Original language: English
Journal: IEEE Journal on Selected Topics in Signal Processing
DOI: 10.1109/JSTSP.2018.2873987
Publication status: Accepted/In press - 1 Jan 2018

Keywords

  • Covariance matrices
  • covariance metrics
  • divergence methods
  • Electroencephalography
  • Gaussian distribution
  • optimal transport
  • Robustness
  • Signal processing algorithms
  • stationary subspace analysis
  • Subspace learning
  • Time series analysis

ASJC Scopus subject areas

  • Signal Processing
  • Electrical and Electronic Engineering

Cite this

@article{905f32e8a5124c3e87a427b8a86686ee,
title = "Wasserstein Stationary Subspace Analysis",
abstract = "Learning under non-stationarity can be achieved by decomposing the data into a subspace that is stationary and a non-stationary one (stationary subspace analysis (SSA)). While SSA has been used in various applications, its robustness and computational efficiency has limits due to the difficulty in optimizing the Kullback-Leibler divergence based objective. In this paper we contribute by extending SSA twofold: we propose SSA with (a) higher numerical efficiency by defining analytical SSA variants and (b) higher robustness by utilizing the Wasserstein-2 distance (Wasserstein SSA). We show the usefulness of our novel algorithms for toy data demonstrating their mathematical properties and for real-world data (1) allowing better segmentation of time series and (2) brain-computer interfacing, where the Wasserstein-based measure of non-stationarity is used for spatial filter regularization and gives rise to higher decoding performance.",
keywords = "Covariance matrices, covariance metrics, divergence methods, Electroencephalography, Gaussian distribution, optimal transport, Robustness, Signal processing algorithms, stationary subspace analysis, Subspace learning, Time series analysis",
author = "Stephan Kaltenstadler and Shinichi Nakajima and Klaus Müller and Wojciech Samek",
year = "2018",
month = "1",
day = "1",
doi = "10.1109/JSTSP.2018.2873987",
language = "English",
journal = "IEEE Journal on Selected Topics in Signal Processing",
issn = "1932-4553",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}
