Interaction intent analysis of multiple persons using nonverbal behavior features

Sang Seok Yun, Munsang Kim, Mun Taek Choi, Jae-Bok Song

Research output: Contribution to journal › Article

Abstract

According to cognitive science research, the interaction intent of humans can be estimated by analyzing their expressed behaviors. This paper proposes a novel methodology for reliable analysis of human intention based on this approach. To identify intention, eight behavioral features are extracted from four characteristics of human-human interaction, and we outline a set of core components of human nonverbal behavior. These nonverbal behaviors are handled by recognition modules built on multimodal sensors, one per modality: localizing the speaker's sound source in the auditory part, recognizing frontal faces and facial expressions in the vision part, and estimating human trajectories, body pose and leaning, and hand gestures in the spatial part. As a post-processing step, temporal confidential reasoning is applied to improve recognition performance, and an integrated human model quantitatively classifies intention from the multi-dimensional cues by applying weight factors. Interactive robots can thus make informed engagement decisions to interact effectively with multiple persons. Experimental results show that the proposed scheme works successfully between human users and a robot in human-robot interaction.
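The weighted fusion of multi-dimensional cues described in the abstract can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's actual model: the cue names, weight values, smoothing constant, and decision threshold below are all illustrative assumptions, and exponential smoothing stands in for the paper's temporal confidential reasoning step.

```python
# Hypothetical sketch of weighted multi-cue intent classification.
# Each nonverbal cue (auditory, visual, spatial) yields a confidence in
# [0, 1]; a weighted average combines them into a single intent score.

def temporal_smooth(prev: float, new: float, alpha: float = 0.3) -> float:
    """Exponentially smooth a cue confidence over time (an illustrative
    stand-in for the paper's temporal confidential reasoning)."""
    return alpha * new + (1 - alpha) * prev

def intent_score(cues: dict, weights: dict) -> float:
    """Combine per-cue confidences into one score via a weighted average.
    Cues missing from the input contribute zero confidence."""
    total_weight = sum(weights.values())
    return sum(w * cues.get(name, 0.0) for name, w in weights.items()) / total_weight

def classify_intent(score: float, threshold: float = 0.5) -> str:
    """Map the fused score to an engagement decision for the robot."""
    return "engage" if score >= threshold else "ignore"

# Example with made-up cue confidences and weights:
cues = {"sound_source": 0.8, "frontal_face": 0.9,
        "body_lean": 0.6, "hand_gesture": 0.3}
weights = {"sound_source": 1.0, "frontal_face": 2.0,
           "body_lean": 1.5, "hand_gesture": 1.0}
score = intent_score(cues, weights)
decision = classify_intent(score)
```

In this sketch, a frontal face is weighted most heavily, so a person facing the robot pushes the fused score toward an "engage" decision even when other cues are weak.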

Original language: English
Pages (from-to): 738-744
Number of pages: 7
Journal: Journal of Institute of Control, Robotics and Systems
Volume: 19
Issue number: 8
DOIs: 10.5302/J.ICROS.2013.13.1893
Publication status: Published - 2013 Nov 12


Keywords

  • Confidential reasoning
  • Human intention analysis
  • Human-robot interaction
  • Multiple-person interactions

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Applied Mathematics

Cite this

Interaction intent analysis of multiple persons using nonverbal behavior features. / Yun, Sang Seok; Kim, Munsang; Choi, Mun Taek; Song, Jae-Bok.

In: Journal of Institute of Control, Robotics and Systems, Vol. 19, No. 8, 12.11.2013, p. 738-744.

Research output: Contribution to journal › Article

@article{46bf569ee70040eda509432108d3d889,
title = "Interaction intent analysis of multiple persons using nonverbal behavior features",
abstract = "According to cognitive science research, the interaction intent of humans can be estimated by analyzing their expressed behaviors. This paper proposes a novel methodology for reliable analysis of human intention based on this approach. To identify intention, eight behavioral features are extracted from four characteristics of human-human interaction, and we outline a set of core components of human nonverbal behavior. These nonverbal behaviors are handled by recognition modules built on multimodal sensors, one per modality: localizing the speaker's sound source in the auditory part, recognizing frontal faces and facial expressions in the vision part, and estimating human trajectories, body pose and leaning, and hand gestures in the spatial part. As a post-processing step, temporal confidential reasoning is applied to improve recognition performance, and an integrated human model quantitatively classifies intention from the multi-dimensional cues by applying weight factors. Interactive robots can thus make informed engagement decisions to interact effectively with multiple persons. Experimental results show that the proposed scheme works successfully between human users and a robot in human-robot interaction.",
keywords = "Confidential reasoning, Human intention analysis, Human-robot interaction, Multiple-person interactions",
author = "Yun, {Sang Seok} and Munsang Kim and Choi, {Mun Taek} and Jae-Bok Song",
year = "2013",
month = "11",
day = "12",
doi = "10.5302/J.ICROS.2013.13.1893",
language = "English",
volume = "19",
pages = "738--744",
journal = "Journal of Institute of Control, Robotics and Systems",
issn = "1976-5622",
publisher = "Institute of Control, Robotics and Systems",
number = "8",

}

TY - JOUR

T1 - Interaction intent analysis of multiple persons using nonverbal behavior features

AU - Yun, Sang Seok

AU - Kim, Munsang

AU - Choi, Mun Taek

AU - Song, Jae-Bok

PY - 2013/11/12

Y1 - 2013/11/12

N2 - According to cognitive science research, the interaction intent of humans can be estimated by analyzing their expressed behaviors. This paper proposes a novel methodology for reliable analysis of human intention based on this approach. To identify intention, eight behavioral features are extracted from four characteristics of human-human interaction, and we outline a set of core components of human nonverbal behavior. These nonverbal behaviors are handled by recognition modules built on multimodal sensors, one per modality: localizing the speaker's sound source in the auditory part, recognizing frontal faces and facial expressions in the vision part, and estimating human trajectories, body pose and leaning, and hand gestures in the spatial part. As a post-processing step, temporal confidential reasoning is applied to improve recognition performance, and an integrated human model quantitatively classifies intention from the multi-dimensional cues by applying weight factors. Interactive robots can thus make informed engagement decisions to interact effectively with multiple persons. Experimental results show that the proposed scheme works successfully between human users and a robot in human-robot interaction.

AB - According to cognitive science research, the interaction intent of humans can be estimated by analyzing their expressed behaviors. This paper proposes a novel methodology for reliable analysis of human intention based on this approach. To identify intention, eight behavioral features are extracted from four characteristics of human-human interaction, and we outline a set of core components of human nonverbal behavior. These nonverbal behaviors are handled by recognition modules built on multimodal sensors, one per modality: localizing the speaker's sound source in the auditory part, recognizing frontal faces and facial expressions in the vision part, and estimating human trajectories, body pose and leaning, and hand gestures in the spatial part. As a post-processing step, temporal confidential reasoning is applied to improve recognition performance, and an integrated human model quantitatively classifies intention from the multi-dimensional cues by applying weight factors. Interactive robots can thus make informed engagement decisions to interact effectively with multiple persons. Experimental results show that the proposed scheme works successfully between human users and a robot in human-robot interaction.

KW - Confidential reasoning

KW - Human intention analysis

KW - Human-robot interaction

KW - Multiple-person interactions

UR - http://www.scopus.com/inward/record.url?scp=84887181651&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84887181651&partnerID=8YFLogxK

U2 - 10.5302/J.ICROS.2013.13.1893

DO - 10.5302/J.ICROS.2013.13.1893

M3 - Article

AN - SCOPUS:84887181651

VL - 19

SP - 738

EP - 744

JO - Journal of Institute of Control, Robotics and Systems

JF - Journal of Institute of Control, Robotics and Systems

SN - 1976-5622

IS - 8

ER -