Automatic gesture recognition for intelligent human-robot interaction

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

45 Citations (Scopus)

Abstract

An intelligent robot requires natural interaction with humans. Visual interpretation of gestures can be useful in accomplishing natural Human-Robot Interaction (HRI). Previous HRI research focused on issues such as hand gesture, sign language, and command gesture recognition. However, automatic recognition of whole-body gestures is required for HRI to operate naturally. This is a challenging problem because describing and modeling meaningful gesture patterns from whole-body motion are complex tasks. This paper presents a new method for simultaneously spotting and recognizing whole-body key gestures on a mobile robot. Because our method runs alongside other HRI components such as speech recognition and face recognition, both execution speed and recognition performance must be considered. For efficient and natural operation, we apply a dedicated technique at each step of gesture recognition: learning and extracting articulated joint information, representing a gesture as a sequence of clusters, and spotting and recognizing a gesture with a Hidden Markov Model (HMM). In addition, we constructed a large gesture database, with which we verified our method. As a result, our method was successfully integrated and operated on a mobile robot.
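The pipeline the abstract describes — quantizing articulated poses into cluster labels and scoring the resulting label sequence with an HMM — can be sketched as follows. This is an illustrative toy, not the paper's implementation: the `forward_log_likelihood` function, the two-state model parameters, and the spotting threshold are all hypothetical.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm: log P(obs | HMM) for a discrete-emission HMM.

    obs -- sequence of cluster indices (the quantized pose sequence)
    pi  -- (n_states,) initial state distribution
    A   -- (n_states, n_states) state transition matrix
    B   -- (n_states, n_symbols) emission probabilities over cluster labels
    """
    alpha = pi * B[:, obs[0]]           # forward variable at t = 0
    scale = alpha.sum()
    log_p = np.log(scale)
    alpha /= scale
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]   # propagate one step, then emit
        scale = alpha.sum()             # rescale to avoid numeric underflow
        log_p += np.log(scale)
        alpha /= scale
    return log_p

# Toy two-state gesture model over three pose clusters (made-up numbers).
pi = np.array([0.8, 0.2])
A = np.array([[0.7, 0.3],
              [0.2, 0.8]])
B = np.array([[0.6, 0.3, 0.1],
              [0.1, 0.3, 0.6]])

pose_clusters = [0, 0, 1, 2, 2]         # a short quantized pose sequence
ll = forward_log_likelihood(pose_clusters, pi, A, B)

# Spotting: accept the key gesture only when its model's score clears a
# threshold; in practice the threshold would come from a non-gesture model.
is_key_gesture = ll > -8.0
```

In a spotting setup of this kind, one such model is trained per key gesture, and a segment is labeled with the best-scoring model only if it also beats the non-gesture (garbage) model, which rejects movements that are not key gestures.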

Original language: English
Title of host publication: FGR 2006: Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition
Pages: 645-650
Number of pages: 6
Volume: 2006
DOI: 10.1109/FGR.2006.25
Publication status: Published - 2006 Nov 14
Event: FGR 2006: 7th International Conference on Automatic Face and Gesture Recognition - Southampton, United Kingdom
Duration: 2006 Apr 10 - 2006 Apr 12

Fingerprint

  • Gesture recognition
  • Human robot interaction
  • Mobile robots
  • Intelligent robots
  • End effectors
  • Face recognition
  • Speech recognition

ASJC Scopus subject areas

  • Engineering (all)

Cite this

Lee, S. W. (2006). Automatic gesture recognition for intelligent human-robot interaction. In FGR 2006: Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition (Vol. 2006, pp. 645-650). [1613091] https://doi.org/10.1109/FGR.2006.25

@inproceedings{58feffe6ed1741d1a010b809adb51e5c,
title = "Automatic gesture recognition for intelligent human-robot interaction",
author = "Lee, {Seong Whan}",
year = "2006",
month = "11",
day = "14",
doi = "10.1109/FGR.2006.25",
language = "English",
isbn = "0769525032",
volume = "2006",
pages = "645--650",
booktitle = "FGR 2006: Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition",

}
