TY - GEN
T1 - Gaze-directed hands-free interface for mobile interaction
AU - Park, Gie Seo
AU - Ahn, Jong Gil
AU - Kim, Gerard J.
N1 - Funding Information:
This research was supported in part by the Strategic Technology Lab. Program (Multimodal Entertainment Platform area) and the Core Industrial Tech. Development Program (Digital Textile based Around Body Computing area) of the Korea Ministry of Knowledge Economy (MKE).
PY - 2011
Y1 - 2011
N2 - While mobile devices have allowed people to carry out various computing and communication tasks everywhere, they have generally lacked support for task execution while the user is in motion. This is because the interaction schemes of most mobile applications are centered on the device's visual display, and when in motion (with important body parts such as the head and hands moving), it is difficult for the user to recognize the visual output on the small hand-carried display and respond with timely and proper input. In this paper, we propose an interface that allows the user to interact with a mobile device while in motion without having to look at it or use one's hands. More specifically, the user interacts, through gaze and head-motion gestures, with an invisible virtual interface panel with the help of a head-worn gyro sensor and aural feedback. Since the menu is one of the most prevalent methods of interaction, we investigate and focus on various forms of menu presentation, such as the layout and the number of comfortably selectable menu items. With head motion, a 4×2 or 3×3 grid menu turns out to be more effective. The results of this study can be further extended toward developing a more sophisticated non-visually oriented mobile interface.
AB - While mobile devices have allowed people to carry out various computing and communication tasks everywhere, they have generally lacked support for task execution while the user is in motion. This is because the interaction schemes of most mobile applications are centered on the device's visual display, and when in motion (with important body parts such as the head and hands moving), it is difficult for the user to recognize the visual output on the small hand-carried display and respond with timely and proper input. In this paper, we propose an interface that allows the user to interact with a mobile device while in motion without having to look at it or use one's hands. More specifically, the user interacts, through gaze and head-motion gestures, with an invisible virtual interface panel with the help of a head-worn gyro sensor and aural feedback. Since the menu is one of the most prevalent methods of interaction, we investigate and focus on various forms of menu presentation, such as the layout and the number of comfortably selectable menu items. With head motion, a 4×2 or 3×3 grid menu turns out to be more effective. The results of this study can be further extended toward developing a more sophisticated non-visually oriented mobile interface.
KW - Gaze
KW - Hands-free
KW - Head-controlled
KW - Mobile interface
KW - Non-visual interface
UR - http://www.scopus.com/inward/record.url?scp=79960323446&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-21605-3_34
DO - 10.1007/978-3-642-21605-3_34
M3 - Conference contribution
AN - SCOPUS:79960323446
SN - 9783642216046
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 304
EP - 313
BT - Human-Computer Interaction
T2 - 14th International Conference on Human-Computer Interaction, HCI International 2011
Y2 - 9 July 2011 through 14 July 2011
ER -