Human-robot interaction by whole body gesture spotting and recognition

Hee Deok Yang, A. Yeon Park, Seong Whan Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

10 Citations (Scopus)

Abstract

An intelligent robot is required for natural interaction with humans. Visual interpretation of gestures can be useful in accomplishing natural Human-Robot Interaction (HRI). Previous HRI research focused on issues such as hand gesture, sign language, and command gesture recognition. Automatic recognition of whole body gestures is required in order for HRI to operate naturally. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole body gestures is a complex task. This paper presents a new method for recognition of whole body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationship between a dozen body parts in 3D. A feature vector is then mapped to a codeword of gesture HMMs. In order to spot key gestures accurately, a sophisticated method of designing a garbage gesture model is proposed: model reduction, which merges similar states based on data-dependent statistics and relative entropy. The proposed method has been tested with samples from 20 persons and 200 synthetic data samples. It achieved a reliability rate of 94.8% in the spotting task and a recognition rate of 97.4% on isolated gestures.
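The model-reduction step described in the abstract — merging similar HMM states based on relative entropy — can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function names (`sym_kl`, `merge_closest_states`), the use of a symmetrized KL divergence over discrete emission distributions, and the greedy single-pair merge are all assumptions made for the sketch.

```python
import numpy as np

def sym_kl(p, q, eps=1e-12):
    """Symmetrized KL divergence (relative entropy) between two
    discrete emission distributions. `eps` avoids log(0)."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def merge_closest_states(emissions, weights):
    """Greedily merge the pair of states whose emission distributions
    are closest in symmetrized KL divergence.

    `emissions` is a list of discrete emission distributions (one per
    HMM state); `weights` holds per-state occupancy counts, standing in
    for the "data-dependent statistics" of the paper.  The merged state
    gets the occupancy-weighted average of the two distributions.
    """
    n = len(emissions)
    best = None
    for i in range(n):
        for j in range(i + 1, n):
            d = sym_kl(emissions[i], emissions[j])
            if best is None or d < best[0]:
                best = (d, i, j)
    _, i, j = best
    wi, wj = weights[i], weights[j]
    merged = (wi * np.asarray(emissions[i], dtype=float)
              + wj * np.asarray(emissions[j], dtype=float)) / (wi + wj)
    new_emissions = [e for k, e in enumerate(emissions) if k not in (i, j)] + [merged]
    new_weights = [w for k, w in enumerate(weights) if k not in (i, j)] + [wi + wj]
    return new_emissions, new_weights
```

Repeating the merge until a target state count is reached would yield the reduced garbage model; how the paper chooses the stopping point is not stated in the abstract.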

Original language: English
Title of host publication: Proceedings - International Conference on Pattern Recognition
Pages: 774-777
Number of pages: 4
Volume: 4
DOIs: https://doi.org/10.1109/ICPR.2006.642
Publication status: Published - 2006
Event: 18th International Conference on Pattern Recognition, ICPR 2006 - Hong Kong, China
Duration: 2006 Aug 20 - 2006 Aug 24

Other

Other: 18th International Conference on Pattern Recognition, ICPR 2006
Country: China
City: Hong Kong
Period: 06/8/20 - 06/8/24

Fingerprint

Human robot interaction
Intelligent robots
Gesture recognition
End effectors
Entropy
Statistics

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
  • Computer Vision and Pattern Recognition
  • Hardware and Architecture

Cite this

Yang, H. D., Park, A. Y., & Lee, S. W. (2006). Human-robot interaction by whole body gesture spotting and recognition. In Proceedings - International Conference on Pattern Recognition (Vol. 4, pp. 774-777). [1699955] https://doi.org/10.1109/ICPR.2006.642

Human-robot interaction by whole body gesture spotting and recognition. / Yang, Hee Deok; Park, A. Yeon; Lee, Seong Whan.

Proceedings - International Conference on Pattern Recognition. Vol. 4 2006. p. 774-777 1699955.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Yang, HD, Park, AY & Lee, SW 2006, Human-robot interaction by whole body gesture spotting and recognition. in Proceedings - International Conference on Pattern Recognition. vol. 4, 1699955, pp. 774-777, 18th International Conference on Pattern Recognition, ICPR 2006, Hong Kong, China, 06/8/20. https://doi.org/10.1109/ICPR.2006.642
Yang HD, Park AY, Lee SW. Human-robot interaction by whole body gesture spotting and recognition. In Proceedings - International Conference on Pattern Recognition. Vol. 4. 2006. p. 774-777. 1699955 https://doi.org/10.1109/ICPR.2006.642
Yang, Hee Deok ; Park, A. Yeon ; Lee, Seong Whan. / Human-robot interaction by whole body gesture spotting and recognition. Proceedings - International Conference on Pattern Recognition. Vol. 4 2006. pp. 774-777
@inproceedings{f45ed26734ca4decad116a98dbf103c1,
title = "Human-robot interaction by whole body gesture spotting and recognition",
abstract = "An intelligent robot is required for natural interaction with humans. Visual interpretation of gestures can be useful in accomplishing natural Human-Robot Interaction (HRI). Previous HRI research focused on issues such as hand gesture, sign language, and command gesture recognition. Automatic recognition of whole body gestures is required in order for HRI to operate naturally. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole body gestures is a complex task. This paper presents a new method for recognition of whole body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationship between a dozen body parts in 3D. A feature vector is then mapped to a codeword of gesture HMMs. In order to spot key gestures accurately, a sophisticated method of designing a garbage gesture model is proposed: model reduction, which merges similar states based on data-dependent statistics and relative entropy. The proposed method has been tested with samples from 20 persons and 200 synthetic data samples. It achieved a reliability rate of 94.8\% in the spotting task and a recognition rate of 97.4\% on isolated gestures.",
author = "Yang, {Hee Deok} and Park, {A. Yeon} and Lee, {Seong Whan}",
year = "2006",
doi = "10.1109/ICPR.2006.642",
language = "English",
volume = "4",
pages = "774--777",
booktitle = "Proceedings - International Conference on Pattern Recognition",

}

TY  - GEN
T1  - Human-robot interaction by whole body gesture spotting and recognition
AU  - Yang, Hee Deok
AU  - Park, A. Yeon
AU  - Lee, Seong Whan
PY  - 2006
Y1  - 2006
N2  - An intelligent robot is required for natural interaction with humans. Visual interpretation of gestures can be useful in accomplishing natural Human-Robot Interaction (HRI). Previous HRI research focused on issues such as hand gesture, sign language, and command gesture recognition. Automatic recognition of whole body gestures is required in order for HRI to operate naturally. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole body gestures is a complex task. This paper presents a new method for recognition of whole body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationship between a dozen body parts in 3D. A feature vector is then mapped to a codeword of gesture HMMs. In order to spot key gestures accurately, a sophisticated method of designing a garbage gesture model is proposed: model reduction, which merges similar states based on data-dependent statistics and relative entropy. The proposed method has been tested with samples from 20 persons and 200 synthetic data samples. It achieved a reliability rate of 94.8% in the spotting task and a recognition rate of 97.4% on isolated gestures.
AB  - An intelligent robot is required for natural interaction with humans. Visual interpretation of gestures can be useful in accomplishing natural Human-Robot Interaction (HRI). Previous HRI research focused on issues such as hand gesture, sign language, and command gesture recognition. Automatic recognition of whole body gestures is required in order for HRI to operate naturally. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole body gestures is a complex task. This paper presents a new method for recognition of whole body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationship between a dozen body parts in 3D. A feature vector is then mapped to a codeword of gesture HMMs. In order to spot key gestures accurately, a sophisticated method of designing a garbage gesture model is proposed: model reduction, which merges similar states based on data-dependent statistics and relative entropy. The proposed method has been tested with samples from 20 persons and 200 synthetic data samples. It achieved a reliability rate of 94.8% in the spotting task and a recognition rate of 97.4% on isolated gestures.
UR  - http://www.scopus.com/inward/record.url?scp=34147121435&partnerID=8YFLogxK
UR  - http://www.scopus.com/inward/citedby.url?scp=34147121435&partnerID=8YFLogxK
U2  - 10.1109/ICPR.2006.642
DO  - 10.1109/ICPR.2006.642
M3  - Conference contribution
AN  - SCOPUS:34147121435
VL  - 4
SP  - 774
EP  - 777
BT  - Proceedings - International Conference on Pattern Recognition
ER  -