Multimodal imaging, non-stationarity and BCI

Klaus Muller

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Learning to build universal decoders for BCI is a great challenge (see [9], [2], [8] for recent reviews on machine learning for BCI). Usually in multimodal imaging we consider the modes to be different types of imaging devices, such as EEG, NIRS or fMRI (see e.g. [1], [7], [3], [4]). However, we can also interpret different subjects as imaging modalities to obtain a zero-training decoder (cf. [5], [6]) from a database of subjects. Even data from several experiments with the same subject can be seen as instantiations of multiple modes. This change of view opens up various research directions (e.g. [3], [4], [10], [11], [12]). The talk will expand on recent multimodal analysis techniques such as SPoC ([3], [4]). Furthermore, we will discuss non-stationarities (cf. [13], [10]) that often occur in neuroscience, e.g. between a subject's training and testing sessions in brain-computer interfacing (BCI) (e.g. [10], [11], [12]). We show that such changes can be very similar across subjects, and thus can be reliably estimated using data from other users and utilized to construct an invariant feature space ([11]). These insights can be accumulated into a broader theoretical framework using beta divergences ([12]). We show that this framework not only achieves a significant increase in performance, but also that the extracted change patterns allow for a neurophysiologically meaningful interpretation.
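The SPoC family of methods referenced above ([3], [4]) finds spatial filters whose projected band-power comodulates with a continuous target variable, via a generalized eigenvalue problem on trial covariance matrices. The following is a minimal illustrative sketch of that idea, not the authors' implementation; the function name and data layout are assumptions for the example.

```python
import numpy as np
from scipy.linalg import eigh

def spoc_filter(X, z):
    """Sketch of the SPoC idea: find a spatial filter w such that the
    band-power of the projected signal w^T x comodulates with z.

    X : array (trials, channels, samples), band-pass filtered epochs
    z : array (trials,), continuous target variable
    """
    z = (z - z.mean()) / z.std()                 # standardize target
    C_e = np.array([np.cov(x) for x in X])       # per-trial spatial covariances
    C = C_e.mean(axis=0)                         # average covariance
    Cz = (z[:, None, None] * C_e).mean(axis=0)   # z-weighted covariance
    # Generalized eigenvalue problem: Cz w = lambda C w.
    # The eigenvector with the largest |lambda| gives the component
    # whose power covaries most strongly with z.
    eigvals, W = eigh(Cz, C)
    return W[:, np.argmax(np.abs(eigvals))]
```

The key design point is that power (a second-order statistic) appears linearly in the trial covariances, so the otherwise nonlinear power-comodulation objective reduces to a tractable eigenvalue problem.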

Original language: English
Title of host publication: 2014 International Winter Workshop on Brain-Computer Interface, BCI 2014
Publisher: IEEE Computer Society
DOI: 10.1109/iww-BCI.2014.6782555
Publication status: Published - 2014 Jan 1
Event: 2014 International Winter Workshop on Brain-Computer Interface, BCI 2014 - Gangwon, Korea, Republic of
Duration: 2014 Feb 17 - 2014 Feb 19



Keywords

  • BCI
  • Brain Computer Interface
  • Machine Learning
  • multimodal data analysis

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Human Factors and Ergonomics

Cite this

Muller, K. (2014). Multimodal imaging, non-stationarity and BCI. In 2014 International Winter Workshop on Brain-Computer Interface, BCI 2014 [6782555] IEEE Computer Society. https://doi.org/10.1109/iww-BCI.2014.6782555

