eRAD-Fe: Emotion Recognition-Assisted Deep Learning Framework

Sun Hee Kim, Ngoc Anh Thi Nguyen, Hyung Jeong Yang, Seong Whan Lee

Research output: Contribution to journal › Article › peer-review

Abstract

With recent advances in artificial intelligence and human-computer interaction, strategies for identifying the inner emotional states of humans from physiological signals such as electroencephalography (EEG) have been actively investigated and applied in various fields. Consequently, there is an increasing demand for real-time emotion analysis and recognition from EEG signals. In this article, we propose a new framework, the emotion recognition-assisted deep learning framework from EEG signals (eRAD-Fe), to achieve the best possible recognition results from EEG data. eRAD-Fe integrates three components: a sliding-window segmentation method that enlarges the training dataset, an energy threshold-based multiclass common spatial pattern (CSP) configuration that extracts prominent features, and a long short-term memory (LSTM) model that improves emotional state recognition performance. With the proposed recognition-assisted framework, emotion classification accuracies of 82%, 72%, and 81% were achieved on three publicly available EEG datasets, namely SEED, DEAP, and DREAMER, respectively.
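
To make the three components concrete, the sketch below illustrates the kind of pipeline the abstract describes: sliding-window segmentation of multichannel EEG, CSP-style spatial filtering with log-variance features, and an LSTM classifier. It is a minimal illustration under assumed settings (window length, step size, channel count, filter count, and network sizes are placeholders, not the paper's settings), and the one-vs-rest CSP shown here is a simplified stand-in for the paper's energy threshold-based multiclass CSP rather than the authors' implementation.

```python
# Illustrative sketch only: sliding-window segmentation, simplified CSP-style
# spatial filtering, and an LSTM classifier. All sizes are assumed placeholders.
import numpy as np
import torch
import torch.nn as nn

def sliding_windows(eeg, win_len, step):
    """Segment a (channels, samples) EEG trial into overlapping windows of
    shape (n_windows, channels, win_len) to enlarge the training set."""
    n_ch, n_samp = eeg.shape
    starts = range(0, n_samp - win_len + 1, step)
    return np.stack([eeg[:, s:s + win_len] for s in starts])

def csp_filters(class_trials, n_filters=2):
    """One-vs-rest CSP-like spatial filters for multiclass EEG.
    class_trials: dict {label: array of shape (n_trials, channels, samples)}.
    A rough stand-in for the energy threshold-based multiclass CSP."""
    covs = {c: np.mean([t @ t.T / np.trace(t @ t.T) for t in trials], axis=0)
            for c, trials in class_trials.items()}
    total = sum(covs.values())
    filters = {}
    for c, cov in covs.items():
        # Generalized eigendecomposition of (class covariance, total covariance).
        eigvals, eigvecs = np.linalg.eig(np.linalg.solve(total, cov))
        order = np.argsort(np.real(eigvals))[::-1]
        filters[c] = np.real(eigvecs[:, order[:n_filters]]).T
    return filters

class EmotionLSTM(nn.Module):
    """LSTM over per-window feature sequences, followed by a linear head."""
    def __init__(self, n_features, n_classes, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, x):                # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1])     # classify from the last time step

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    trial = rng.standard_normal((32, 1000))         # 32 channels, 1000 samples
    windows = sliding_windows(trial, win_len=200, step=100)   # (9, 32, 200)

    # Log-variance of spatially filtered windows as a simple CSP-style feature.
    fake_classes = {0: rng.standard_normal((5, 32, 200)),
                    1: rng.standard_normal((5, 32, 200))}
    filt = np.vstack(list(csp_filters(fake_classes).values()))  # (4, 32)
    feats = np.log(np.var(filt @ windows, axis=-1))             # (9, 4)

    model = EmotionLSTM(n_features=feats.shape[1], n_classes=3)
    seq = torch.tensor(feats, dtype=torch.float32).unsqueeze(0)  # one sequence
    print(model(seq).shape)                                      # (1, 3) logits
```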

Original language: English
Journal: IEEE Transactions on Instrumentation and Measurement
Volume: 70
Publication status: Published - 2021

Keywords

  • Deep learning
  • electroencephalogram
  • emotion recognition
  • energy threshold
  • feature extraction
  • long short-term memory
  • multiclass common spatial pattern (CSP)

ASJC Scopus subject areas

  • Instrumentation
  • Electrical and Electronic Engineering
