Abstract
Learning to build universal decoders for BCI is a great challenge (see [9], [2], [8] for recent reviews on machine learning for BCI). Usually in multimodal imaging we consider the modes to be different types of imaging devices, such as EEG, NIRS, or fMRI (see e.g. [1], [7], [3], [4]). However, we can also interpret different subjects as imaging modalities in order to obtain a zero-training decoder (cf. [5], [6]) from a database of subjects. Even data from the same subject across several experiments can be seen as instantiations of multiple modes. This change of view allows us to proceed in various research directions (e.g. [3], [4], [10], [11], [12]). The talk will expand on recent multimodal analysis techniques such as SPoC ([3], [4]). Furthermore, we will discuss nonstationarities (cf. [13], [10]) that often occur in neuroscience, e.g. between a subject's training and testing sessions in brain-computer interfacing (BCI) (e.g. [10], [11], [12]). We show that such changes can be very similar between subjects, and thus can be reliably estimated using data from other users and utilized to construct an invariant feature space ([11]). These insights can be accumulated into a broader theoretical framework using beta divergences ([12]). We show not only that this achieves a significant increase in performance, but also that the extracted change patterns allow for a neurophysiologically meaningful interpretation.
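For context, the beta divergences underlying the framework cited above ([12]) are usually taken to be the standard parametric family below; this is the textbook definition, not a formulation spelled out in this abstract, so it should be read as background rather than as the talk's exact construction:

```latex
d_\beta(x \,\|\, y) \;=\; \frac{1}{\beta(\beta-1)}
\left( x^{\beta} + (\beta-1)\,y^{\beta} - \beta\, x\, y^{\beta-1} \right),
\qquad \beta \in \mathbb{R} \setminus \{0,1\}.
```

In the limit $\beta \to 1$ this family recovers the Kullback-Leibler divergence, in the limit $\beta \to 0$ the Itakura-Saito divergence, and at $\beta = 2$ it reduces to the squared Euclidean distance $\tfrac{1}{2}(x-y)^2$; tuning $\beta$ thus interpolates between divergences with different robustness properties, which is what makes the family attractive for handling nonstationarities.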
Original language | English
---|---
DOIs | 
Publication status | Published - 2014
Event | 2014 International Winter Workshop on Brain-Computer Interface, BCI 2014 - Gangwon, Korea, Republic of; Duration: 2014 Feb 17 → 2014 Feb 19
Other
Other | 2014 International Winter Workshop on Brain-Computer Interface, BCI 2014
---|---
Country/Territory | Korea, Republic of
City | Gangwon
Period | 2014 Feb 17 → 2014 Feb 19
Keywords
- BCI
- Brain Computer Interface
- Machine Learning
- multimodal data analysis
ASJC Scopus subject areas
- Human-Computer Interaction
- Human Factors and Ergonomics