TY - JOUR
T1 - Integrating dynamic stopping, transfer learning and language models in an adaptive zero-training ERP speller
AU - Kindermans, Pieter-Jan
AU - Tangermann, Michael
AU - Müller, Klaus-Robert
AU - Schrauwen, Benjamin
N1 - Copyright:
Copyright 2014 Elsevier B.V., All rights reserved.
PY - 2014/6
Y1 - 2014/6
AB - Objective. Most BCIs have to undergo a calibration session in which data is recorded to train decoders with machine learning. Only recently have zero-training methods become a subject of study. This work proposes a probabilistic framework for BCI applications that exploit event-related potentials (ERPs). For the example of a visual P300 speller, we show how the framework harvests the structure suitable to solve the decoding task by (a) transfer learning, (b) unsupervised adaptation, (c) a language model and (d) dynamic stopping. Approach. A simulation study compares the proposed probabilistic zero-training framework (using transfer learning and task structure) to a state-of-the-art supervised model on n = 22 subjects. The individual influence of each of the involved components (a)-(d) is investigated. Main results. Without any need for a calibration session, the probabilistic zero-training framework with inter-subject transfer learning shows excellent performance, competitive with a state-of-the-art supervised method using calibration. Its decoding quality is carried mainly by the effect of transfer learning in combination with continuous unsupervised adaptation. Significance. A high-performing zero-training BCI is within reach for one of the most popular BCI paradigms: ERP spelling. Recording calibration data for a supervised BCI would require valuable time that is lost for spelling. The time spent on calibration would allow a novel user to spell 29 symbols with our unsupervised approach. It could be of use for various clinical and non-clinical ERP applications of BCI.
KW - BCI
KW - P300
KW - Speller matrix
KW - Subject-to-subject transfer
KW - Unsupervised learning
KW - Visual event-related potentials
UR - http://www.scopus.com/inward/record.url?scp=84901282773&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84901282773&partnerID=8YFLogxK
U2 - 10.1088/1741-2560/11/3/035005
DO - 10.1088/1741-2560/11/3/035005
M3 - Article
C2 - 24834896
AN - SCOPUS:84901282773
VL - 11
JO - Journal of Neural Engineering
JF - Journal of Neural Engineering
SN - 1741-2560
IS - 3
M1 - 035005
ER -