TY - GEN
T1 - Assistive robotic arm control based on brain-machine interface with vision guidance using convolution neural network
AU - Shim, Kyung Hwan
AU - Jeong, Ji Hoon
AU - Kwon, Byoung Hee
AU - Lee, Byeong Hoo
AU - Lee, Seong Whan
N1 - Funding Information:
Research was partly supported by an Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (No. 2017-0-00432, Development of Non-Invasive Integrated BCI SW Platform to Control Home Appliances and External Devices by User’s Thought via AR/VR Interface) and partly by an IITP grant funded by the Korea government (No. 2017-0-00451, Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning).
PY - 2019/10
Y1 - 2019/10
AB - Brain-machine interfaces (BMIs) provide a new control strategy for both patients and healthy people. An endogenous paradigm such as motor imagery (MI) is commonly used in BMI to detect user intention without external stimuli. However, manipulating a dexterous robotic arm with a limited set of MI commands remains a challenging issue. In this paper, we designed a shared robotic arm control system using intuitive MI and vision guidance. To carry out the user's intention with the robotic arm, we used arm-reach MI (left, right, and forward), hand-grasp MI, and wrist-twist MI decoded from electroencephalogram (EEG) signals. A Kinect sensor is used to match the decoded user intention with the detected object based on its location in the workspace. In addition, to decode intuitive MI successfully, we propose a novel convolutional neural network (CNN)-based user intention decoding model. Ten subjects participated in our experiments, and five of them were selected to perform the online tasks. The proposed method decoded various user intentions (five intuitive MI classes and the resting state) with a grand-averaged classification accuracy of 55.91% in the offline analysis. To ensure sufficient control in the online shared robotic arm control, the online system was started only once a subject achieved higher than 60% performance in the offline analysis. For the online drinking tasks, we confirmed an average success rate of 78%. Hence, we confirmed the feasibility of shared robotic arm control based on intuitive BMI and vision guidance with high performance.
UR - http://www.scopus.com/inward/record.url?scp=85076730072&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85076730072&partnerID=8YFLogxK
U2 - 10.1109/SMC.2019.8914058
DO - 10.1109/SMC.2019.8914058
M3 - Conference contribution
AN - SCOPUS:85076730072
T3 - Conference Proceedings - IEEE International Conference on Systems, Man and Cybernetics
SP - 2785
EP - 2790
BT - 2019 IEEE International Conference on Systems, Man and Cybernetics, SMC 2019
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2019 IEEE International Conference on Systems, Man and Cybernetics, SMC 2019
Y2 - 6 October 2019 through 9 October 2019
ER -