Integrating dynamic stopping, transfer learning and language models in an adaptive zero-training ERP speller

Pieter Jan Kindermans, Michael Tangermann, Klaus Müller, Benjamin Schrauwen

Research output: Contribution to journal › Article

43 Citations (Scopus)

Abstract

Objective. Most BCIs have to undergo a calibration session in which data are recorded to train decoders with machine learning. Only recently have zero-training methods become a subject of study. This work proposes a probabilistic framework for BCI applications that exploit event-related potentials (ERPs). Using a visual P300 speller as an example, we show how the framework harvests the structure suitable to solve the decoding task through (a) transfer learning, (b) unsupervised adaptation, (c) a language model and (d) dynamic stopping. Approach. A simulation study compares the proposed probabilistic zero-training framework (using transfer learning and task structure) to a state-of-the-art supervised model on n = 22 subjects. The individual influence of the components (a)-(d) is investigated. Main results. Without any need for a calibration session, the probabilistic zero-training framework with inter-subject transfer learning shows excellent performance, competitive with a state-of-the-art supervised method using calibration. Its decoding quality is carried mainly by the effect of transfer learning in combination with continuous unsupervised adaptation. Significance. A high-performing zero-training BCI is within reach for one of the most popular BCI paradigms: ERP spelling. Recording calibration data for a supervised BCI requires valuable time that is lost for spelling; the time spent on calibration would allow a novel user to spell 29 symbols with our unsupervised approach. The framework could be of use for various clinical and non-clinical ERP applications of BCI.
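The interplay of the language model (c) and dynamic stopping (d) described in the abstract can be illustrated with a minimal, hypothetical sketch (this is not the authors' implementation): a language-model prior over candidate symbols is updated with per-round EEG log-likelihoods, and the speller stops collecting stimulus repetitions early once the posterior of the leading symbol crosses a confidence threshold. The function name, the threshold value, and the toy numbers are all assumptions for illustration.

```python
import math

def select_symbol(lm_prior, eeg_loglik_per_round, threshold=0.95, max_rounds=10):
    """Hypothetical sketch of Bayesian symbol selection with dynamic stopping.

    lm_prior:             dict symbol -> prior probability from a language model
    eeg_loglik_per_round: list of dicts, symbol -> EEG log-likelihood for one
                          round of stimulus repetitions
    Returns (selected_symbol, rounds_used).
    """
    # Start from the language-model prior, working in the log domain.
    log_post = {s: math.log(p) for s, p in lm_prior.items()}
    rounds_used = 0
    for round_loglik in eeg_loglik_per_round[:max_rounds]:
        rounds_used += 1
        for s in log_post:
            log_post[s] += round_loglik[s]      # accumulate EEG evidence
        # Normalize to a proper posterior (log-sum-exp for stability).
        m = max(log_post.values())
        z = sum(math.exp(v - m) for v in log_post.values())
        post = {s: math.exp(v - m) / z for s, v in log_post.items()}
        best = max(post, key=post.get)
        if post[best] >= threshold:             # confident enough: stop early
            return best, rounds_used
    # Fell through without reaching the threshold: return the current best.
    return max(log_post, key=log_post.get), rounds_used
```

With a prior favoring 'A' and EEG evidence that consistently penalizes 'B', the sketch stops after two of the five available rounds, which is the time saving that dynamic stopping buys.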

Original language: English
Article number: 035005
Journal: Journal of Neural Engineering
Volume: 11
Issue number: 3
ISSN: 1741-2560
Publisher: IOP Publishing Ltd.
DOI: 10.1088/1741-2560/11/3/035005
Keywords: BCI, P300, Speller matrix, Subject-to-subject transfer, Unsupervised learning, Visual event-related potentials
Publication status: Published - 1 Jan 2014


ASJC Scopus subject areas

  • Biomedical Engineering
  • Cellular and Molecular Neuroscience

Cite this

Kindermans, P. J., Tangermann, M., Müller, K., & Schrauwen, B. (2014). Integrating dynamic stopping, transfer learning and language models in an adaptive zero-training ERP speller. Journal of Neural Engineering, 11(3), 035005. https://doi.org/10.1088/1741-2560/11/3/035005
