TY - GEN
T1 - Classification of Hand Motions within EEG Signals for Non-Invasive BCI-Based Robot Hand Control
AU - Cho, Jeong Hyun
AU - Jeong, Ji Hoon
AU - Shim, Kyung Hwan
AU - Kim, Dong Ju
AU - Lee, Seong Whan
N1 - Funding Information:
This work was partly supported by an Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIT) (No. 2017-0-00432, Development of non-invasive integrated BCI SW platform to control home appliances and external devices by user's thought via AR/VR interface) and by the Agency for Defense Development (ADD) of Korea (06-201-305-001, A Study on Human-Computer Interaction Technology for the Pilot Status Recognition).
Publisher Copyright:
© 2018 IEEE.
PY - 2019/1/16
Y1 - 2019/1/16
N2 - The development of brain-computer interface (BCI) systems that are based on electroencephalography (EEG) and driven by spontaneous movement intentions is useful for rehabilitation and external device control. In this study, we analyzed the decoding of five different hand executions and imageries from EEG signals for robot hand control. Five healthy subjects participated in this experiment, executing and imagining five sustained hand motions. For this motor execution (ME) and motor imagery (MI) experiment, we proposed a subject-specific time interval selection method and used common spatial patterns (CSP) and regularized linear discriminant analysis (RLDA) for data analysis. We classified the five hand motions offline and obtained average classification accuracies of 56.83% for ME and 51.01% for MI. Both results were higher than the accuracies obtained with a comparison method that used a standard fixed time interval. These results are encouraging, and the proposed method could potentially be used in future applications such as BCI-driven robot hand control.
AB - The development of brain-computer interface (BCI) systems that are based on electroencephalography (EEG) and driven by spontaneous movement intentions is useful for rehabilitation and external device control. In this study, we analyzed the decoding of five different hand executions and imageries from EEG signals for robot hand control. Five healthy subjects participated in this experiment, executing and imagining five sustained hand motions. For this motor execution (ME) and motor imagery (MI) experiment, we proposed a subject-specific time interval selection method and used common spatial patterns (CSP) and regularized linear discriminant analysis (RLDA) for data analysis. We classified the five hand motions offline and obtained average classification accuracies of 56.83% for ME and 51.01% for MI. Both results were higher than the accuracies obtained with a comparison method that used a standard fixed time interval. These results are encouraging, and the proposed method could potentially be used in future applications such as BCI-driven robot hand control.
KW - brain-computer interface (BCI)
KW - electroencephalography (EEG)
KW - motor execution (ME)
KW - motor imagery (MI)
KW - robot hand
UR - http://www.scopus.com/inward/record.url?scp=85062224023&partnerID=8YFLogxK
U2 - 10.1109/SMC.2018.00097
DO - 10.1109/SMC.2018.00097
M3 - Conference contribution
AN - SCOPUS:85062224023
T3 - Proceedings - 2018 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2018
SP - 515
EP - 518
BT - Proceedings - 2018 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2018 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2018
Y2 - 7 October 2018 through 10 October 2018
ER -