TY - JOUR
T1 - A Subject-Transfer Framework Based on Single-Trial EMG Analysis Using Convolutional Neural Networks
AU - Kim, Keun-Tae
AU - Guan, Cuntai
AU - Lee, Seong-Whan
N1 - Funding Information:
Manuscript received November 7, 2018; revised April 12, 2019, August 22, 2019, and October 1, 2019; accepted October 4, 2019. Date of publication October 11, 2019; date of current version January 8, 2020. This work was supported in part by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea Government (Development of Non-Invasive Integrated BCI SW Platform to Control Home Appliances and External Devices by User’s Thought via AR/VR Interface) under Grant 2017-0-00432 and in part by the Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea Government (Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning) under Grant 2017-0-00451. (Corresponding author: Seong-Whan Lee.) K.-T. Kim and S.-W. Lee are with the Department of Brain and Cognitive Engineering, Korea University, Seoul 02841, South Korea (e-mail: kim_kt@korea.ac.kr; sw.lee@korea.ac.kr).
Publisher Copyright:
© 2019 IEEE.
PY - 2020/1
Y1 - 2020/1
AB - In recent years, practical electromyography (EMG)-based myoelectric interfaces have been developed to improve the quality of daily life for people with physical disabilities. For these interfaces, accurately decoding a user's movement intention is essential for properly controlling external devices. However, improving decoding performance is difficult because intra-user variability causes large variations in EMG signal patterns. This paper therefore proposes a novel subject-transfer framework for decoding hand movements that is robust to intra-user variability. In the proposed framework, supportive convolutional neural network (CNN) classifiers, pre-trained on the EMG data of several subjects, are selected and fine-tuned for the target subject via single-trial analysis. The target subject's hand movements are then classified by voting over the outputs of the supportive CNN classifiers. The feasibility of the proposed framework is validated on NinaPro databases 2 and 3, which comprise 49 hand movements performed by 40 healthy and 11 amputee subjects, respectively. The experimental results indicate that, compared with a self-decoding framework that uses only the target subject's data, the proposed framework decodes hand movements with improved performance in both healthy and amputee subjects. These results suggest that the proposed subject-transfer framework is a useful tool for EMG-based practical myoelectric interfaces that control external devices.
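N1 - The abstract outlines a select, fine-tune, and vote pipeline over pre-trained CNN classifiers. The sketch below illustrates only the selection and voting logic under stated assumptions: NumPy is available, each pre-trained classifier object exposes a predict() method returning a class label, and the names (select_supportive, decode_trial) and the top_k ranking heuristic are illustrative placeholders, not the authors' implementation.

    import numpy as np

    def select_supportive(classifiers, calib_X, calib_y, top_k=5):
        # Rank pre-trained classifiers by single-trial accuracy on the target
        # subject's calibration trials and keep the top_k best (illustrative heuristic).
        scores = [np.mean([clf.predict(x) == y for x, y in zip(calib_X, calib_y)])
                  for clf in classifiers]
        best = np.argsort(scores)[::-1][:top_k]
        return [classifiers[i] for i in best]

    def decode_trial(supportive, x):
        # Classify one EMG trial by majority voting over the supportive classifiers' outputs.
        votes = np.array([clf.predict(x) for clf in supportive])
        labels, counts = np.unique(votes, return_counts=True)
        return labels[np.argmax(counts)]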
KW - Subject-transfer framework
KW - convolutional neural networks
KW - electromyography
KW - myoelectric interfaces
UR - http://www.scopus.com/inward/record.url?scp=85078358351&partnerID=8YFLogxK
U2 - 10.1109/TNSRE.2019.2946625
DO - 10.1109/TNSRE.2019.2946625
M3 - Article
C2 - 31613773
AN - SCOPUS:85078358351
SN - 1534-4320
VL - 28
SP - 94
EP - 103
JO - IEEE Transactions on Neural Systems and Rehabilitation Engineering
JF - IEEE Transactions on Neural Systems and Rehabilitation Engineering
IS - 1
M1 - 8865669
ER -