TY - GEN
T1 - Decoding Event-related Potential from Ear-EEG Signals based on Ensemble Convolutional Neural Networks in Ambulatory Environment
AU - Lee, Young Eun
AU - Lee, Seong-Whan
N1 - Funding Information:
This work was partly supported by Institute for Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2017-0-00451, Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning), (No. 2015-0-00185, Development of Intelligent Pattern Recognition Softwares for Ambulatory Brain Computer Interface), and (No. 2019-0-00079, Artificial Intelligence Graduate School Program (Korea University)).
Publisher Copyright:
© 2021 IEEE.
PY - 2021/2/22
Y1 - 2021/2/22
N2 - Recently, practical brain-computer interfaces have been actively developed, especially for ambulatory environments. However, electroencephalography (EEG) signals are distorted by movement artifacts and electromyography signals when users are moving, which makes it hard to recognize human intention. In addition, as hardware issues are also challenging, ear-EEG has been developed for practical brain-computer interfaces and has been widely used. In this paper, we proposed ensemble-based convolutional neural networks for the ambulatory environment and analyzed the visual event-related potential responses in scalp- and ear-EEG in terms of statistical analysis and brain-computer interface performance. The brain-computer interface performance deteriorated by 3-14% when walking fast at 1.6 m/s. The proposed method achieved an average area under the curve of 0.728 and is robust to the ambulatory environment and imbalanced data as well.
AB - Recently, practical brain-computer interfaces have been actively developed, especially for ambulatory environments. However, electroencephalography (EEG) signals are distorted by movement artifacts and electromyography signals when users are moving, which makes it hard to recognize human intention. In addition, as hardware issues are also challenging, ear-EEG has been developed for practical brain-computer interfaces and has been widely used. In this paper, we proposed ensemble-based convolutional neural networks for the ambulatory environment and analyzed the visual event-related potential responses in scalp- and ear-EEG in terms of statistical analysis and brain-computer interface performance. The brain-computer interface performance deteriorated by 3-14% when walking fast at 1.6 m/s. The proposed method achieved an average area under the curve of 0.728 and is robust to the ambulatory environment and imbalanced data as well.
KW - ambulatory environment
KW - brain-computer interface
KW - ear-EEG
KW - ensemble CNN
KW - event-related potential
UR - http://www.scopus.com/inward/record.url?scp=85104893488&partnerID=8YFLogxK
U2 - 10.1109/BCI51272.2021.9385313
DO - 10.1109/BCI51272.2021.9385313
M3 - Conference contribution
AN - SCOPUS:85104893488
T3 - 9th IEEE International Winter Conference on Brain-Computer Interface, BCI 2021
BT - 9th IEEE International Winter Conference on Brain-Computer Interface, BCI 2021
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 9th IEEE International Winter Conference on Brain-Computer Interface, BCI 2021
Y2 - 22 February 2021 through 24 February 2021
ER -