TY - GEN
T1 - Towards Neurohaptics
T2 - 9th IEEE International Winter Conference on Brain-Computer Interface, BCI 2021
AU - Cho, Jeong Hyun
AU - Jeong, Ji Hoon
AU - Kim, Myoung Ki
AU - Lee, Seong Whan
N1 - Funding Information:
This work was partly supported by Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2017-0-00432, Development of NonInvasive Integrated BCI SW Platform to Control Home Appliances and External Devices by User’s Thought via AR/VR Interface; No. 2017-0-00451, Development of BCI based Brain and Cognitive Computing Technology for Recognizing User’s Intentions using Deep Learning; No. 2019-0-00079, Artificial Intelligence Graduate School Program (Korea University)).
Publisher Copyright:
© 2021 IEEE.
PY - 2021/2/22
Y1 - 2021/2/22
N2 - Noninvasive brain-computer interfaces (BCIs) are widely used to recognize users' intentions. In particular, BCIs related to tactile and sensory decoding could benefit many industrial fields, such as advanced touch displays, robotic device control, and more immersive virtual or augmented reality. In this paper, we introduce haptic and sensory perception-based BCI systems, called neurohaptics. This is a preliminary study toward a variety of scenarios using actual touch and touch imagery paradigms. We designed a novel experimental environment and a device that acquires brain signals while subjects touch designated materials, generating natural touch and texture sensations. Through the experiment, we collected electroencephalogram (EEG) signals for four different texture objects. Seven subjects were recruited, and we evaluated classification performance using machine learning and deep learning approaches. Hence, we confirmed the feasibility of decoding actual touch and touch imagery from EEG signals toward developing practical neurohaptics.
AB - Noninvasive brain-computer interfaces (BCIs) are widely used to recognize users' intentions. In particular, BCIs related to tactile and sensory decoding could benefit many industrial fields, such as advanced touch displays, robotic device control, and more immersive virtual or augmented reality. In this paper, we introduce haptic and sensory perception-based BCI systems, called neurohaptics. This is a preliminary study toward a variety of scenarios using actual touch and touch imagery paradigms. We designed a novel experimental environment and a device that acquires brain signals while subjects touch designated materials, generating natural touch and texture sensations. Through the experiment, we collected electroencephalogram (EEG) signals for four different texture objects. Seven subjects were recruited, and we evaluated classification performance using machine learning and deep learning approaches. Hence, we confirmed the feasibility of decoding actual touch and touch imagery from EEG signals toward developing practical neurohaptics.
KW - brain-computer interface
KW - electroencephalogram
KW - haptic sensation analysis
KW - tactile information
KW - touch imagery
UR - http://www.scopus.com/inward/record.url?scp=85104825490&partnerID=8YFLogxK
U2 - 10.1109/BCI51272.2021.9385331
DO - 10.1109/BCI51272.2021.9385331
M3 - Conference contribution
AN - SCOPUS:85104825490
T3 - 9th IEEE International Winter Conference on Brain-Computer Interface, BCI 2021
BT - 9th IEEE International Winter Conference on Brain-Computer Interface, BCI 2021
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 22 February 2021 through 24 February 2021
ER -