Automatic emotional expression of a face robot by using a reactive behavior decision model

Kyung geune Oh, Myoung soo Jang, Seung-Jong Kim

Research output: Contribution to journal › Article

3 Citations (Scopus)

Abstract

This paper introduces a face robot named 'Buddy' which can perform facial expressions, such as eye-tracking and lip synchronization, via movements of its facial elements (i.e., eyeballs, eyebrows, eyelids, and lips). Buddy has 14 degrees of freedom. To produce realistic motion in Buddy, we built a 'Reactive Behavior Decision Model' which decides not only how to control the rotation angles and speeds of the facial elements, but also which particular emotions to exhibit in order to express the robot's personality. In this model, Buddy's personality is formed by accumulated external stimuli and internal status. The process of automatically achieving reactive behavior in the model consists of three steps: (1) analyze the external stimuli and identify variations in Buddy's internal status; (2) decide the type and degree of emotion based on the robot's personality; and (3) generate specific facial expressions and gestures by combining appropriate primitive behaviors chosen from emotion databases. Using this model, we have shown that Buddy can display various facial expressions and behaviors, some quite reasonable and others quite unexpected.
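The three-step pipeline described in the abstract can be sketched in code. This is a minimal illustrative sketch, not the authors' implementation: every class, function, threshold, and the emotion-database contents below are hypothetical assumptions used only to show the flow from stimulus analysis, to emotion decision, to primitive-behavior selection.

```python
from dataclasses import dataclass

# Step 0 (assumed representation): internal status and personality.
# All fields and numeric scales here are illustrative assumptions.
@dataclass
class InternalStatus:
    arousal: float = 0.0   # accumulated excitement from stimuli
    valence: float = 0.0   # accumulated positive/negative mood

@dataclass
class Personality:
    sensitivity: float = 1.0  # how strongly stimuli shift internal status

# Step 1: analyze an external stimulus and update the internal status.
def update_status(status, stimulus, personality):
    status.arousal += personality.sensitivity * stimulus.get("intensity", 0.0)
    status.valence += personality.sensitivity * stimulus.get("pleasantness", 0.0)
    return status

# Step 2: decide the type and degree of emotion from the updated status.
def decide_emotion(status):
    if status.valence >= 0:
        emotion = "joy" if status.arousal > 0.5 else "contentment"
    else:
        emotion = "anger" if status.arousal > 0.5 else "sadness"
    degree = min(1.0, abs(status.valence) + status.arousal)
    return emotion, degree

# Step 3: combine primitive behaviors drawn from a (hypothetical) emotion database.
EMOTION_DB = {
    "joy": ["raise_eyebrows", "widen_eyes", "smile"],
    "contentment": ["relax_eyelids", "slight_smile"],
    "anger": ["lower_eyebrows", "narrow_eyes", "press_lips"],
    "sadness": ["droop_eyelids", "lower_lip_corners"],
}

def generate_behavior(emotion, degree):
    primitives = EMOTION_DB[emotion]
    n = max(1, round(degree * len(primitives)))  # stronger emotion -> more primitives
    return primitives[:n]

# Example run through all three steps for one stimulus.
status = update_status(InternalStatus(), {"intensity": 0.8, "pleasantness": 0.4}, Personality())
emotion, degree = decide_emotion(status)
behaviors = generate_behavior(emotion, degree)
print(emotion, behaviors)  # -> joy ['raise_eyebrows', 'widen_eyes', 'smile']
```

Because the accumulated status persists across stimuli, repeated inputs can shift the decided emotion over time, which is one way a fixed rule set can still yield behavior that appears unexpected.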

Original language: English
Pages (from-to): 769-774
Number of pages: 6
Journal: Journal of Mechanical Science and Technology
Volume: 24
Issue number: 3
DOIs
Publication status: Published - 2010 Mar 1
Externally published: Yes

Keywords

  • Emotional expression
  • Face robot
  • Human-robot interaction (HRI)
  • Intelligent robot

ASJC Scopus subject areas

  • Mechanics of Materials
  • Mechanical Engineering

