Abstract
Recently, the development of practical myoelectric interfaces has led to the emergence of wearable rehabilitation robots such as arm prostheses. In this paper, we propose a novel method for movement intention decoding based on deep feature learning from human electromyogram (EMG) signals. In daily life, inter-user variability degrades decoding performance because target EMG patterns differ across users. We therefore propose a user-adaptive decoding method that is robust to inter-user variability, employing a convolutional neural network for deep feature learning trained on data from different users. In our experiments, the proposed method predicted hand movement intention more accurately than a competing method.
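The abstract does not specify the network architecture, so the following is only a minimal illustrative sketch of the general idea: a 1-D convolutional network mapping a window of multi-channel EMG to a hand-movement class. The channel count, window length, layer sizes, and number of classes are assumptions for demonstration, not the authors' configuration.

```python
# Illustrative sketch only: a small 1-D CNN that classifies windowed multi-channel
# EMG into hand-movement classes. All sizes (8 channels, 200-sample windows,
# 6 classes) are assumed for demonstration and are not taken from the paper.
import torch
import torch.nn as nn

class EMGConvNet(nn.Module):
    def __init__(self, n_channels=8, n_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis into one feature vector
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):              # x: (batch, n_channels, window_length)
        z = self.features(x).squeeze(-1)
        return self.classifier(z)

if __name__ == "__main__":
    model = EMGConvNet()
    window = torch.randn(4, 8, 200)    # 4 windows of 8-channel EMG, 200 samples each
    logits = model(window)
    print(logits.shape)                # torch.Size([4, 6])
```

In a user-adaptive setting such as the one described, a network like this would typically be pre-trained on data from several users and then fine-tuned (or have its features reused) for a new user; the exact adaptation procedure used by the authors is not given in the abstract.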
Original language | English |
---|---|
Title of host publication | 4th International Winter Conference on Brain-Computer Interface, BCI 2016 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
ISBN (Print) | 9781467378413 |
DOIs | |
Publication status | Published - 2016 Apr 20 |
Event | 4th International Winter Conference on Brain-Computer Interface, BCI 2016 - Gangwon Province, Korea, Republic of (Duration: 2016 Feb 22 → 2016 Feb 24) |
Other
Other | 4th International Winter Conference on Brain-Computer Interface, BCI 2016 |
---|---|
Country | Korea, Republic of |
City | Gangwon Province |
Period | 16/2/22 → 16/2/24 |
Keywords
- Convolutional neural network
- Deep feature learning
- Electromyogram
- Myoelectric interfaces
ASJC Scopus subject areas
- Human-Computer Interaction