TY - JOUR
T1 - Multimodal signal dataset for 11 intuitive movement tasks from single upper extremity during multiple recording sessions
AU - Jeong, Ji Hoon
AU - Cho, Jeong Hyun
AU - Shim, Kyung Hwan
AU - Kwon, Byoung Hee
AU - Lee, Byeong Hoo
AU - Lee, Do Yeun
AU - Lee, Dae Hyeok
AU - Lee, Seong Whan
N1 - Publisher Copyright:
© The Author(s) 2020. Published by Oxford University Press GigaScience.
Copyright:
This record is sourced from MEDLINE/PubMed, a database of the U.S. National Library of Medicine
PY - 2020/10/7
Y1 - 2020/10/7
N2 - BACKGROUND: Non-invasive brain-computer interfaces (BCIs) have been developed to realize natural bi-directional interaction between users and external robotic systems. However, communication between users and BCI systems through artificial matching remains a critical issue. Recently, BCIs have been developed to adopt intuitive decoding, which is the key to solving several problems such as the small number of classes and the manual matching of BCI commands with device control. Unfortunately, advances in this area have been slow owing to the lack of large and uniform datasets. This study provides a large intuitive dataset for 11 different upper extremity movement tasks obtained during multiple recording sessions. The dataset includes 60-channel electroencephalography, 7-channel electromyography, and 4-channel electro-oculography from 25 healthy participants, collected over 3-day sessions for a total of 82,500 trials across all participants. FINDINGS: We validated our dataset via neurophysiological analysis. We observed clear sensorimotor de-/activation and spatial distributions related to real movement and motor imagery, respectively. Furthermore, we demonstrated the consistency of the dataset by evaluating the classification performance of each session using a baseline machine learning method. CONCLUSIONS: The dataset includes data from multiple recording sessions, various classes within a single upper extremity, and multimodal signals. This work can be used to (i) compare the brain activities associated with real movement and imagination, (ii) improve decoding performance, and (iii) analyze the differences among recording sessions. Hence, this study, as a Data Note, has focused on collecting the data required for further advances in BCI technology.
AB - BACKGROUND: Non-invasive brain-computer interfaces (BCIs) have been developed to realize natural bi-directional interaction between users and external robotic systems. However, communication between users and BCI systems through artificial matching remains a critical issue. Recently, BCIs have been developed to adopt intuitive decoding, which is the key to solving several problems such as the small number of classes and the manual matching of BCI commands with device control. Unfortunately, advances in this area have been slow owing to the lack of large and uniform datasets. This study provides a large intuitive dataset for 11 different upper extremity movement tasks obtained during multiple recording sessions. The dataset includes 60-channel electroencephalography, 7-channel electromyography, and 4-channel electro-oculography from 25 healthy participants, collected over 3-day sessions for a total of 82,500 trials across all participants. FINDINGS: We validated our dataset via neurophysiological analysis. We observed clear sensorimotor de-/activation and spatial distributions related to real movement and motor imagery, respectively. Furthermore, we demonstrated the consistency of the dataset by evaluating the classification performance of each session using a baseline machine learning method. CONCLUSIONS: The dataset includes data from multiple recording sessions, various classes within a single upper extremity, and multimodal signals. This work can be used to (i) compare the brain activities associated with real movement and imagination, (ii) improve decoding performance, and (iii) analyze the differences among recording sessions. Hence, this study, as a Data Note, has focused on collecting the data required for further advances in BCI technology.
KW - brain–computer interface
KW - intuitive upper extremity movements
KW - multimodal signals
KW - multiple sessions
UR - http://www.scopus.com/inward/record.url?scp=85092802931&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85092802931&partnerID=8YFLogxK
U2 - 10.1093/gigascience/giaa098
DO - 10.1093/gigascience/giaa098
M3 - Article
C2 - 33034634
AN - SCOPUS:85092802931
VL - 9
JO - GigaScience
JF - GigaScience
SN - 2047-217X
IS - 10
ER -