TY - JOUR
T1 - eRAD-Fe: Emotion Recognition-Assisted Deep Learning Framework
AU - Kim, Sun Hee
AU - Nguyen, Ngoc Anh Thi
AU - Yang, Hyung Jeong
AU - Lee, Seong Whan
N1 - Funding Information:
This work was supported in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education under Grant NRF-2021R1I1A1A01048455, in part by the Vietnam National Foundation for Science and Technology Development (NAFOSTED) under Grant 102.01-2020.27, in part by the National Research Foundation of Korea (NRF) Grant funded by the Korean Government (MSIT) under Grant NRF-2020R1A4A1019191, and in part by the Institute for Information and Communications Technology Promotion (IITP) Grant funded by the Government of South Korea (Development of BCI-Based Brain and Cognitive Computing Technology for Recognizing User's Intentions using Deep Learning under Grant 2017-0-00451; Artificial Intelligence Graduate School Program, Korea University, under Grant 2019-0-00079).
Publisher Copyright:
© 1963-2012 IEEE.
PY - 2021
Y1 - 2021
N2 - With recent advancements in artificial intelligence technologies and human-computer interaction, strategies to identify the inner emotional states of humans through physiological signals such as electroencephalography (EEG) have been actively investigated and applied in various fields. Thus, there is an increasing demand for real-time emotion analysis and recognition via EEG signals. In this article, we propose a new framework, the emotion recognition-assisted deep learning framework from EEG signals (eRAD-Fe), to achieve the best recognition results from EEG signals. eRAD-Fe integrates three aspects: exploiting a sliding-window segmentation method to enlarge the size of the training dataset, configuring energy threshold-based multiclass common spatial patterns to extract prominent features, and improving emotional state recognition performance with a long short-term memory model. With the proposed recognition-assisted framework, the emotion classification accuracies were 82%, 72%, and 81% on three publicly available EEG datasets, namely SEED, DEAP, and DREAMER, respectively.
AB - With recent advancements in artificial intelligence technologies and human-computer interaction, strategies to identify the inner emotional states of humans through physiological signals such as electroencephalography (EEG) have been actively investigated and applied in various fields. Thus, there is an increasing demand for real-time emotion analysis and recognition via EEG signals. In this article, we propose a new framework, the emotion recognition-assisted deep learning framework from EEG signals (eRAD-Fe), to achieve the best recognition results from EEG signals. eRAD-Fe integrates three aspects: exploiting a sliding-window segmentation method to enlarge the size of the training dataset, configuring energy threshold-based multiclass common spatial patterns to extract prominent features, and improving emotional state recognition performance with a long short-term memory model. With the proposed recognition-assisted framework, the emotion classification accuracies were 82%, 72%, and 81% on three publicly available EEG datasets, namely SEED, DEAP, and DREAMER, respectively.
KW - Deep learning
KW - electroencephalogram
KW - emotion recognition
KW - energy threshold
KW - feature extraction
KW - long short-term memory
KW - multiclass common spatial pattern (CSP)
UR - http://www.scopus.com/inward/record.url?scp=85115730479&partnerID=8YFLogxK
U2 - 10.1109/TIM.2021.3115195
DO - 10.1109/TIM.2021.3115195
M3 - Article
AN - SCOPUS:85115730479
VL - 70
JO - IEEE Transactions on Instrumentation and Measurement
JF - IEEE Transactions on Instrumentation and Measurement
SN - 0018-9456
ER -