TY - GEN
T1 - Trajectory Decoding of Arm Reaching Movement Imageries for Brain-Controlled Robot Arm System
AU - Jeong, Ji Hoon
AU - Shim, Kyung Hwan
AU - Kim, Dong Joo
AU - Lee, Seong Whan
N1 - Publisher Copyright:
© 2019 IEEE.
PY - 2019/7
Y1 - 2019/7
N2 - The development of noninvasive brain-machine interface (BMI) systems based on electroencephalography (EEG), driven by spontaneous movement intentions, provides a useful tool for controlling external devices and supporting neuro-rehabilitation. In this study, we present the possibility of a brain-controlled robot arm system based on arm trajectory decoding. To this end, we first constructed an experimental system that can acquire EEG data for both movement execution (ME) and movement imagery (MI) tasks. Five subjects participated in our experiments and performed four directional reaching tasks (left, right, forward, and backward) in 3D space. For robust arm trajectory decoding, we propose a subject-dependent deep neural network (DNN) architecture. The decoding model applies the principle of a bi-directional long short-term memory (LSTM) network. As a result, we confirmed decoding performance (r-value: >0.8) for the X-, Y-, and Z-axes across all subjects in both the MI and ME tasks. These results show the feasibility of an EEG-based intuitive robot arm control system for high-level tasks (e.g., drinking water or moving objects). We also confirm that the proposed method shows little variation in decoding performance between the ME and MI tasks in the offline analysis. Hence, we expect the decoding model to be capable of robust trajectory decoding even in a real-time environment.
AB - The development of noninvasive brain-machine interface (BMI) systems based on electroencephalography (EEG), driven by spontaneous movement intentions, provides a useful tool for controlling external devices and supporting neuro-rehabilitation. In this study, we present the possibility of a brain-controlled robot arm system based on arm trajectory decoding. To this end, we first constructed an experimental system that can acquire EEG data for both movement execution (ME) and movement imagery (MI) tasks. Five subjects participated in our experiments and performed four directional reaching tasks (left, right, forward, and backward) in 3D space. For robust arm trajectory decoding, we propose a subject-dependent deep neural network (DNN) architecture. The decoding model applies the principle of a bi-directional long short-term memory (LSTM) network. As a result, we confirmed decoding performance (r-value: >0.8) for the X-, Y-, and Z-axes across all subjects in both the MI and ME tasks. These results show the feasibility of an EEG-based intuitive robot arm control system for high-level tasks (e.g., drinking water or moving objects). We also confirm that the proposed method shows little variation in decoding performance between the ME and MI tasks in the offline analysis. Hence, we expect the decoding model to be capable of robust trajectory decoding even in a real-time environment.
UR - http://www.scopus.com/inward/record.url?scp=85076733121&partnerID=8YFLogxK
U2 - 10.1109/EMBC.2019.8856312
DO - 10.1109/EMBC.2019.8856312
M3 - Conference contribution
C2 - 31947110
AN - SCOPUS:85076733121
T3 - Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS
SP - 5544
EP - 5547
BT - 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2019
Y2 - 23 July 2019 through 27 July 2019
ER -