Gesture spotting and recognition for human-robot interaction

Hee Deok Yang, A. Yeon Park, Seong Whan Lee

Research output: Contribution to journal › Article

96 Citations (Scopus)

Abstract

Visual interpretation of gestures can be useful in accomplishing natural human-robot interaction (HRI). Previous HRI research has focused on issues such as hand gestures, sign language, and command gesture recognition. For HRI to operate naturally, however, automatic recognition of whole-body gestures is required. This is a challenging problem, because describing and modeling meaningful gesture patterns from whole-body movement is a complex task. This paper presents a new method for recognizing whole-body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationships between a dozen body parts in 3-D. Each feature vector is then mapped to a codeword, which serves as the observation symbol for hidden Markov models. To spot key gestures accurately, a method for designing a transition gesture model is proposed. To reduce the number of states in the transition gesture model, a model-reduction technique that merges similar states based on data-dependent statistics and relative entropy is applied. The experimental results demonstrate that the proposed method is efficient and effective for automatic recognition of whole-body key gestures from motion sequences in HRI.
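The pipeline the abstract outlines — quantizing each pose feature vector to a codeword for the HMMs, and merging similar transition-model states by relative entropy — can be sketched as below. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the toy codebook, the Euclidean distance metric, and the fixed merge threshold in `states_mergeable` are all placeholders (the paper's merge criterion is data dependent).

```python
import numpy as np

def nearest_codeword(feature, codebook):
    # Vector quantization: return the index of the codebook row
    # closest (in Euclidean distance) to the input feature vector.
    dists = np.linalg.norm(codebook - feature, axis=1)
    return int(np.argmin(dists))

def kl_divergence(p, q, eps=1e-12):
    # Relative entropy D(p || q) between two discrete distributions,
    # smoothed with eps to avoid log(0).
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def states_mergeable(p, q, threshold=0.1):
    # Merge candidate: two HMM states are similar enough to merge when
    # the symmetrized relative entropy between their discrete emission
    # distributions falls below a threshold (illustrative value only).
    return 0.5 * (kl_divergence(p, q) + kl_divergence(q, p)) < threshold

# Toy pose codebook: each row is one codeword in feature space.
codebook = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 0.0]])
symbol = nearest_codeword(np.array([0.9, 1.1]), codebook)  # -> 1
```

In the paper's setting the quantized symbols feed key-gesture HMMs and a transition gesture model that absorbs the non-meaningful movement between key gestures; the merge test above sketches only the state-reduction step.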

Original language: English
Pages (from-to): 256-270
Number of pages: 15
Journal: IEEE Transactions on Robotics
Volume: 23
Issue number: 2
DOI: 10.1109/TRO.2006.889491
Publication status: Published - 2007 Apr 1

Keywords

  • Gesture spotting
  • Hidden Markov model (HMM)
  • Human-robot interaction (HRI)
  • Mobile robot
  • Transition gesture model
  • Whole-body gesture recognition

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Electrical and Electronic Engineering

Cite this

Gesture spotting and recognition for human-robot interaction. / Yang, Hee Deok; Park, A. Yeon; Lee, Seong Whan.

In: IEEE Transactions on Robotics, Vol. 23, No. 2, 01.04.2007, p. 256-270.

Yang, Hee Deok; Park, A. Yeon; Lee, Seong Whan. Gesture spotting and recognition for human-robot interaction. In: IEEE Transactions on Robotics. 2007; Vol. 23, No. 2. pp. 256-270.
@article{cac519ba3e5b4138a06b3af17edd0808,
title = "Gesture spotting and recognition for human-robot interaction",
abstract = "Visual interpretation of gestures can be useful in accomplishing natural human-robot interaction (HRI). Previous HRI research focused on issues such as hand gestures, sign language, and command gesture recognition. Automatic recognition of whole-body gestures is required in order for HRI to operate naturally. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole-body gestures is a complex task. This paper presents a new method for recognition of whole-body key gestures in HRI. A human subject is first described by a set of features, encoding the angular relationship between a dozen body parts in 3-D. A feature vector is then mapped to a codeword of hidden Markov models. In order to spot key gestures accurately, a sophisticated method of designing a transition gesture model is proposed. To reduce the states of the transition gesture model, model reduction which merges similar states based on data-dependent statistics and relative entropy is used. The experimental results demonstrate that the proposed method can be efficient and effective in HRI, for automatic recognition of whole-body key gestures from motion sequences.",
keywords = "Gesture spotting, Hidden Markov model (HMM), Human-robot interaction (HRI), Mobile robot, Transition gesture model, Whole-body gesture recognition",
author = "Yang, {Hee Deok} and Park, {A. Yeon} and Lee, {Seong Whan}",
year = "2007",
month = "4",
day = "1",
doi = "10.1109/TRO.2006.889491",
language = "English",
volume = "23",
pages = "256--270",
journal = "IEEE Transactions on Robotics",
issn = "1552-3098",
publisher = "Institute of Electrical and Electronics Engineers Inc.",
number = "2",
}

TY  - JOUR
T1  - Gesture spotting and recognition for human-robot interaction
AU  - Yang, Hee Deok
AU  - Park, A. Yeon
AU  - Lee, Seong Whan
PY  - 2007/4/1
Y1  - 2007/4/1
AB  - Visual interpretation of gestures can be useful in accomplishing natural human-robot interaction (HRI). Previous HRI research focused on issues such as hand gestures, sign language, and command gesture recognition. Automatic recognition of whole-body gestures is required in order for HRI to operate naturally. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole-body gestures is a complex task. This paper presents a new method for recognition of whole-body key gestures in HRI. A human subject is first described by a set of features, encoding the angular relationship between a dozen body parts in 3-D. A feature vector is then mapped to a codeword of hidden Markov models. In order to spot key gestures accurately, a sophisticated method of designing a transition gesture model is proposed. To reduce the states of the transition gesture model, model reduction which merges similar states based on data-dependent statistics and relative entropy is used. The experimental results demonstrate that the proposed method can be efficient and effective in HRI, for automatic recognition of whole-body key gestures from motion sequences.
KW  - Gesture spotting
KW  - Hidden Markov model (HMM)
KW  - Human-robot interaction (HRI)
KW  - Mobile robot
KW  - Transition gesture model
KW  - Whole-body gesture recognition
UR  - http://www.scopus.com/inward/record.url?scp=34247223015&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=34247223015&partnerID=8YFLogxK
U2  - 10.1109/TRO.2006.889491
DO  - 10.1109/TRO.2006.889491
M3  - Article
VL  - 23
SP  - 256
EP  - 270
JO  - IEEE Transactions on Robotics
JF  - IEEE Transactions on Robotics
SN  - 1552-3098
IS  - 2
ER  -