TY - GEN
T1 - Decoding Movement Imagination and Execution from EEG Signals Using BCI-Transfer Learning Method Based on Relation Network
AU - Lee, Do Yeun
AU - Jeong, Ji Hoon
AU - Shim, Kyung Hwan
AU - Lee, Seong Whan
N1 - Funding Information:
This work was partly supported by Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (No. 2017-0-00432, Development of Non-Invasive Integrated BCI SW Platform to Control Home Appliances and External Devices by User’s Thought via AR/VR Interface) and partly funded by Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (No. 2017-0-00451, Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning).
PY - 2020/5
Y1 - 2020/5
N2 - A brain-computer interface (BCI) is used to control external devices for healthy people as well as to rehabilitate motor functions for motor-disabled patients. Decoding movement intention is one of the most significant aspects of performing arm movement tasks using brain signals. Decoding movement execution (ME) from electroencephalogram (EEG) signals has shown high performance in previous works; however, movement imagination (MI) paradigm-based intention decoding has so far failed to achieve sufficient accuracy. In this study, we focused on a robust MI decoding method with transfer learning for the ME and MI paradigms. We acquired EEG data related to arm reaching in 3D directions. We proposed a BCI-transfer learning method based on a Relation network (BTRN) architecture. The proposed method achieved higher decoding performance than conventional works. We confirmed that the BTRN architecture can contribute to continuous decoding of MI using ME datasets.
AB - A brain-computer interface (BCI) is used to control external devices for healthy people as well as to rehabilitate motor functions for motor-disabled patients. Decoding movement intention is one of the most significant aspects of performing arm movement tasks using brain signals. Decoding movement execution (ME) from electroencephalogram (EEG) signals has shown high performance in previous works; however, movement imagination (MI) paradigm-based intention decoding has so far failed to achieve sufficient accuracy. In this study, we focused on a robust MI decoding method with transfer learning for the ME and MI paradigms. We acquired EEG data related to arm reaching in 3D directions. We proposed a BCI-transfer learning method based on a Relation network (BTRN) architecture. The proposed method achieved higher decoding performance than conventional works. We confirmed that the BTRN architecture can contribute to continuous decoding of MI using ME datasets.
KW - Brain-computer interface (BCI)
KW - Electroencephalogram (EEG)
KW - Movement imagination and execution
KW - Transfer learning
UR - http://www.scopus.com/inward/record.url?scp=85089230970&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85089230970&partnerID=8YFLogxK
U2 - 10.1109/ICASSP40776.2020.9052997
DO - 10.1109/ICASSP40776.2020.9052997
M3 - Conference contribution
AN - SCOPUS:85089230970
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 1354
EP - 1358
BT - 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2020
Y2 - 4 May 2020 through 8 May 2020
ER -