TY - GEN
T1 - Real-time 3D pointing gesture recognition in mobile space
AU - Park, Chang Beom
AU - Roh, Myung Cheol
AU - Lee, Seong Whan
PY - 2008
Y1 - 2008
N2 - In this paper, we present a real-time 3D pointing gesture recognition algorithm for natural human-robot interaction (HRI). The recognition errors in previous pointing gesture recognition algorithms are mainly caused by the low performance of the hand-tracking module and by the unreliability of the direction estimate itself; therefore, our proposed algorithm uses a 3D particle filter to achieve reliable hand tracking and a cascade hidden Markov model (HMM) to obtain a robust estimate of the pointing direction. When someone enters the field of view of the camera, his or her face and two hands are located and tracked using the particle filters. The first-stage HMM takes the hand position estimate and maps it to a more accurate position by modeling the kinematic characteristics of finger pointing. The resulting 3D coordinates are used as input to the second-stage HMM, which discriminates pointing gestures from other gestures. Finally, the pointing direction is estimated whenever a pointing state is detected. The proposed method can deal with both large and small pointing gestures. Experiments show gesture recognition rates above 89% and target selection rates above 99%.
AB - In this paper, we present a real-time 3D pointing gesture recognition algorithm for natural human-robot interaction (HRI). The recognition errors in previous pointing gesture recognition algorithms are mainly caused by the low performance of the hand-tracking module and by the unreliability of the direction estimate itself; therefore, our proposed algorithm uses a 3D particle filter to achieve reliable hand tracking and a cascade hidden Markov model (HMM) to obtain a robust estimate of the pointing direction. When someone enters the field of view of the camera, his or her face and two hands are located and tracked using the particle filters. The first-stage HMM takes the hand position estimate and maps it to a more accurate position by modeling the kinematic characteristics of finger pointing. The resulting 3D coordinates are used as input to the second-stage HMM, which discriminates pointing gestures from other gestures. Finally, the pointing direction is estimated whenever a pointing state is detected. The proposed method can deal with both large and small pointing gestures. Experiments show gesture recognition rates above 89% and target selection rates above 99%.
UR - http://www.scopus.com/inward/record.url?scp=67650677323&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=67650677323&partnerID=8YFLogxK
U2 - 10.1109/AFGR.2008.4813448
DO - 10.1109/AFGR.2008.4813448
M3 - Conference contribution
AN - SCOPUS:67650677323
SN - 9781424421541
T3 - 2008 8th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2008
BT - 2008 8th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2008
T2 - 2008 8th IEEE International Conference on Automatic Face and Gesture Recognition, FG 2008
Y2 - 17 September 2008 through 19 September 2008
ER -