Recent advances in brain-computer interface (BCI) techniques have enabled increasingly refined interaction between users and external devices. Accurately decoding kinematic information from brain signals remains one of the main challenges in controlling human-like robots. In particular, although the forearm is used frequently in daily life for high-level tasks, few studies have addressed decoding of forearm movement. In this study, we focus on classifying forearm movements by rotation angle using electroencephalogram (EEG) signals. To this end, we propose a hierarchical flow convolutional neural network (HF-CNN) model for robust classification. We evaluate the proposed model not only on our experimental dataset but also on a public dataset (BNCI Horizon 2020). The grand-average classification accuracies over three rotation angles were 0.73 (±0.04) for the motor execution (ME) task and 0.65 (±0.09) for the motor imagery (MI) task across ten subjects in our experimental dataset. In the public dataset, the grand-average accuracies were 0.52 (±0.03) for the ME task and 0.51 (±0.04) for the MI task across fifteen subjects. These results demonstrate the feasibility of decoding complex kinematic information from EEG signals. This study will contribute to the development of a brain-controlled robotic arm system capable of performing high-level tasks.
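The abstract describes classifying EEG trials into one of three forearm rotation angles with a convolutional network. The HF-CNN architecture itself is not specified here, so the following is only a minimal illustrative sketch of the general pipeline such a model shares: a temporal convolution over multichannel EEG, pooling, and a three-way softmax. All shapes (channel count, sampling window, filter sizes) and the random weights are assumptions for illustration, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels):
    """Valid-mode temporal convolution summed over EEG channels.
    x: (channels, time); kernels: (filters, channels, width)."""
    f, c, w = kernels.shape
    t_out = x.shape[1] - w + 1
    out = np.zeros((f, t_out))
    for i in range(f):
        for j in range(c):
            for t in range(t_out):
                out[i, t] += np.dot(x[j, t:t + w], kernels[i, j])
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def forward(eeg, kernels, weights):
    h = np.maximum(conv1d(eeg, kernels), 0.0)  # temporal conv + ReLU
    pooled = h.mean(axis=1)                    # global average pooling over time
    return softmax(weights @ pooled)           # probabilities for 3 rotation angles

# Hypothetical trial: 8 EEG channels, 250 samples (~1 s at 250 Hz).
eeg = rng.standard_normal((8, 250))
kernels = rng.standard_normal((4, 8, 25)) * 0.1   # 4 temporal filters (assumed)
weights = rng.standard_normal((3, 4)) * 0.1       # 3 output classes (rotation angles)
probs = forward(eeg, kernels, weights)
```

In a trained model the kernels and weights would be learned from labeled ME/MI trials; here they are random and serve only to show the data flow from a raw EEG window to a class distribution.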
Keywords
- Brain-computer interface (BCI)
- convolutional neural network (CNN)
- electroencephalogram (EEG)
- forearm motor execution and motor imagery