Abstract
Learning under nonstationarity can be achieved by decomposing the data into a stationary and a nonstationary subspace [stationary subspace analysis (SSA)]. While SSA has been used in various applications, its robustness and computational efficiency are limited by the difficulty of optimizing its Kullback-Leibler-divergence-based objective. In this paper, we extend SSA in two ways: we propose 1) analytical SSA variants with higher numerical efficiency and 2) a more robust variant based on the Wasserstein-2 distance (Wasserstein SSA). We demonstrate the usefulness of our novel algorithms on toy data, illustrating their mathematical properties, and on real-world data, where they 1) allow better segmentation of time series and 2) improve brain-computer interfacing: the Wasserstein-based measure of nonstationarity is used to regularize spatial filters and gives rise to higher decoding performance.
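As context for the Wasserstein-based measure of nonstationarity, the Wasserstein-2 distance between two Gaussians N(μ₁, Σ₁) and N(μ₂, Σ₂) has the well-known closed form W₂² = ‖μ₁ − μ₂‖² + tr(Σ₁ + Σ₂ − 2(Σ₂^{1/2} Σ₁ Σ₂^{1/2})^{1/2}). The sketch below computes this quantity for Gaussian approximations of two data epochs; the function name `gaussian_w2_distance` and the epoch-comparison usage are illustrative assumptions, not the paper's actual Wasserstein SSA implementation.

```python
import numpy as np
from scipy.linalg import sqrtm


def gaussian_w2_distance(mu1, cov1, mu2, cov2):
    """Closed-form Wasserstein-2 distance between N(mu1, cov1) and N(mu2, cov2):
    W2^2 = ||mu1 - mu2||^2 + tr(cov1 + cov2 - 2 (cov2^{1/2} cov1 cov2^{1/2})^{1/2})
    """
    sqrt_cov2 = sqrtm(cov2)
    cross_term = sqrtm(sqrt_cov2 @ cov1 @ sqrt_cov2)
    # sqrtm may return a complex result with tiny imaginary parts; keep the real part
    w2_sq = np.sum((mu1 - mu2) ** 2) + np.trace(cov1 + cov2 - 2 * np.real(cross_term))
    # Guard against small negative values caused by numerical error
    return np.sqrt(max(w2_sq, 0.0))


# Hypothetical usage: quantify nonstationarity between two epochs of a signal
rng = np.random.default_rng(0)
X1 = rng.standard_normal((500, 3))        # epoch 1: unit variance
X2 = 2.0 * rng.standard_normal((500, 3))  # epoch 2: inflated variance
d = gaussian_w2_distance(X1.mean(0), np.cov(X1.T), X2.mean(0), np.cov(X2.T))
print(f"W2 distance between epoch distributions: {d:.3f}")
```

Because this distance depends only on means and covariances, it is natural for covariance-based subspace methods such as SSA; a larger value between epochs indicates a more nonstationary direction of the data.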
Original language | English |
---|---|
Article number | 8481426 |
Pages (from-to) | 1213-1223 |
Number of pages | 11 |
Journal | IEEE Journal on Selected Topics in Signal Processing |
Volume | 12 |
Issue number | 6 |
DOIs | |
Publication status | Published - 2018 Dec |
Keywords
- Subspace learning
- covariance metrics
- divergence methods
- optimal transport
- stationary subspace analysis
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering