Movement intention decoding based on deep learning for multiuser myoelectric interfaces

Ki Hee Park, Seong Whan Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

24 Citations (Scopus)

Abstract

Recently, the development of practical myoelectric interfaces has led to the emergence of wearable rehabilitation robots such as arm prostheses. In this paper, we propose a novel method for movement intention decoding based on deep feature learning from electromyogram (EMG) signals. In daily life, inter-user variability degrades decoding performance because target EMG patterns differ across users. We therefore propose a user-adaptive decoding method that is robust to inter-user variability, employing a convolutional neural network for deep feature learning, trained on data from different users. In our experiments, the proposed method predicted hand movement intention more accurately than a competing method.
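
For illustration only, below is a minimal sketch of the kind of model the abstract describes: a small 1-D convolutional network that classifies windowed multi-channel EMG into hand-movement classes. It assumes PyTorch, 8 electrode channels, 200-sample windows, and 6 movement classes; none of these choices, nor the layer sizes, are taken from the paper. In a user-adaptive setting, such a network would plausibly be pre-trained on pooled data from several users and then calibrated with a small amount of data from the target user, but that procedure is likewise an assumption, not the authors' documented method.

# Hypothetical sketch: a 1-D CNN that maps windowed multi-channel EMG to
# hand-movement classes. Channel count, window length, layer sizes, and the
# class set are illustrative assumptions, not the architecture from the paper.
import torch
import torch.nn as nn

class EMGMovementCNN(nn.Module):
    def __init__(self, n_channels=8, n_classes=6):
        super().__init__()
        # Temporal convolutions learn features shared across users,
        # replacing hand-crafted EMG features (e.g. RMS, zero crossings).
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=9, padding=4),
            nn.BatchNorm1d(32),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=9, padding=4),
            nn.BatchNorm1d(64),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        # x: (batch, n_channels, window_samples), e.g. 200-ms windows at 1 kHz
        h = self.features(x).squeeze(-1)
        return self.classifier(h)

if __name__ == "__main__":
    model = EMGMovementCNN()
    window = torch.randn(4, 8, 200)   # 4 dummy EMG windows
    logits = model(window)
    print(logits.shape)               # torch.Size([4, 6])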

Original language: English
Title of host publication: 4th International Winter Conference on Brain-Computer Interface, BCI 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Print): 9781467378413
DOIs: 10.1109/IWW-BCI.2016.7457459
Publication status: Published - 2016 Apr 20
Event: 4th International Winter Conference on Brain-Computer Interface, BCI 2016 - Gangwon Province, Korea, Republic of
Duration: 2016 Feb 22 - 2016 Feb 24

Other

Other: 4th International Winter Conference on Brain-Computer Interface, BCI 2016
Country: Korea, Republic of
City: Gangwon Province
Period: 16/2/22 - 16/2/24

Fingerprint

  • Decoding
  • Prosthetics
  • Patient rehabilitation
  • Robots
  • Neural networks
  • Deep learning

Keywords

  • Convolutional neural network
  • Deep feature learning
  • Electromyogram
  • Myoelectric interfaces

ASJC Scopus subject areas

  • Human-Computer Interaction

Cite this

Park, K. H., & Lee, S. W. (2016). Movement intention decoding based on deep learning for multiuser myoelectric interfaces. In 4th International Winter Conference on Brain-Computer Interface, BCI 2016 [7457459] Institute of Electrical and Electronics Engineers Inc.. https://doi.org/10.1109/IWW-BCI.2016.7457459
