A Robust Extrinsic Calibration Method for Non-contact Gaze Tracking in the 3-D Space

Mun Cheon Kang, Cheol Hwan Yoo, Kwang Hyun Uhm, Dae Hong Lee, Sung-Jea Ko

Research output: Contribution to journal › Article

Abstract

In general, three-dimensional (3-D) gaze tracking methods employ both a frontal-viewing camera and an eye-capturing camera facing the opposite direction to precisely estimate the point-of-regard (POR) in the 3-D space. The extrinsic calibration of these two cameras for accurate 3-D gaze tracking is a challenging task. This paper presents a robust extrinsic calibration method for non-contact gaze tracking in the 3-D space. Even in a noisy environment, the extrinsic calibration parameters are precisely estimated by minimizing the proposed cost function consisting of both the angular and the Euclidean errors. Furthermore, using the estimated parameters, the 3-D POR is exactly determined based on the two-view geometry. Compared with the conventional methods, the proposed method provides superior results in experiments considering various factors such as the noise level, head movement, and camera configuration. In real experiments, we achieved an average Euclidean error of 12.6 cm and an average angular error of 0.98° when estimating the 3-D coordinates of PORs that were 4-8 m away from the user.
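The two-view geometry step described above — recovering the 3-D POR from two calibrated rays — can be sketched with a generic midpoint triangulation. This is not the authors' implementation; it is a minimal dependency-free illustration, assuming both rays are already expressed in a common frame via the extrinsic calibration, and all function names are ours.

```python
# Generic midpoint triangulation: given two 3-D rays (e.g., the gaze ray from
# the eye-capturing camera and a back-projected ray from the frontal-viewing
# camera, both in one common frame), return the point midway between their
# points of closest approach. Pure Python, no external dependencies.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def sub(u, v):
    return [a - b for a, b in zip(u, v)]

def point_on_ray(origin, direction, t):
    return [o + t * d for o, d in zip(origin, direction)]

def triangulate_midpoint(p1, d1, p2, d2):
    """Midpoint of the closest approach between rays p1 + t*d1 and p2 + s*d2."""
    w0 = sub(p1, p2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w0), dot(d2, w0)
    denom = a * c - b * b          # ~0 when the rays are (near-)parallel
    if abs(denom) < 1e-12:
        raise ValueError("rays are parallel; intersection is ill-defined")
    t = (b * e - c * d) / denom    # parameter along ray 1
    s = (a * e - b * d) / denom    # parameter along ray 2
    q1 = point_on_ray(p1, d1, t)
    q2 = point_on_ray(p2, d2, s)
    return [(x + y) / 2.0 for x, y in zip(q1, q2)]
```

For two rays that actually intersect — one from the origin along +z, one from (1, 0, 0) aimed at (0, 0, 5) — the routine recovers the intersection point; for noisy, skew rays it returns the midpoint of the shortest connecting segment, which is the usual proxy for the POR.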

Original language: English
Journal: IEEE Access
DOI: 10.1109/ACCESS.2018.2867235
Publication status: Accepted/In press - 2018 Aug 24


Keywords

  • Calibration
  • Cameras
  • Computer vision
  • Cost function
  • Gaze tracking
  • Human computer interaction
  • Measurement uncertainty
  • Robustness
  • Solid modeling

ASJC Scopus subject areas

  • Computer Science(all)
  • Materials Science(all)
  • Engineering(all)

Cite this

A Robust Extrinsic Calibration Method for Non-contact Gaze Tracking in the 3-D Space. / Kang, Mun Cheon; Yoo, Cheol Hwan; Uhm, Kwang Hyun; Lee, Dae Hong; Ko, Sung-Jea.

In: IEEE Access, 24.08.2018.

Kang, Mun Cheon ; Yoo, Cheol Hwan ; Uhm, Kwang Hyun ; Lee, Dae Hong ; Ko, Sung-Jea. / A Robust Extrinsic Calibration Method for Non-contact Gaze Tracking in the 3-D Space. In: IEEE Access. 2018.
@article{1271073f768a46bf9a843eca4f1c19f1,
title = "A Robust Extrinsic Calibration Method for Non-contact Gaze Tracking in the 3-D Space",
abstract = "In general, three-dimensional (3-D) gaze tracking methods employ both a frontal-viewing camera and an eye-capturing camera facing the opposite direction to precisely estimate the point-of-regard (POR) in the 3-D space. The extrinsic calibration of these two cameras for accurate 3-D gaze tracking is a challenging task. This paper presents a robust extrinsic calibration method for non-contact gaze tracking in the 3-D space. Even in a noisy environment, the extrinsic calibration parameters are precisely estimated by minimizing the proposed cost function consisting of both the angular and the Euclidean errors. Furthermore, using the estimated parameters, the 3-D POR is exactly determined based on the two-view geometry. Compared with the conventional methods, the proposed method provides superior results in experiments considering various factors such as the noise level, head movement, and camera configuration. In real experiments, we achieved an average Euclidean error of 12.6 cm and an average angular error of 0.98° when estimating the 3-D coordinates of PORs that were 4-8 m away from the user.",
keywords = "Calibration, Cameras, Computer vision, Cost function, Gaze tracking, Human computer interaction, Measurement uncertainty, Robustness, Solid modeling",
author = "Kang, {Mun Cheon} and Yoo, {Cheol Hwan} and Uhm, {Kwang Hyun} and Lee, {Dae Hong} and Sung-Jea Ko",
year = "2018",
month = "8",
day = "24",
doi = "10.1109/ACCESS.2018.2867235",
language = "English",
journal = "IEEE Access",
issn = "2169-3536",
publisher = "Institute of Electrical and Electronics Engineers Inc.",

}

TY - JOUR

T1 - A Robust Extrinsic Calibration Method for Non-contact Gaze Tracking in the 3-D Space

AU - Kang, Mun Cheon

AU - Yoo, Cheol Hwan

AU - Uhm, Kwang Hyun

AU - Lee, Dae Hong

AU - Ko, Sung-Jea

PY - 2018/8/24

Y1 - 2018/8/24

AB - In general, three-dimensional (3-D) gaze tracking methods employ both a frontal-viewing camera and an eye-capturing camera facing the opposite direction to precisely estimate the point-of-regard (POR) in the 3-D space. The extrinsic calibration of these two cameras for accurate 3-D gaze tracking is a challenging task. This paper presents a robust extrinsic calibration method for non-contact gaze tracking in the 3-D space. Even in a noisy environment, the extrinsic calibration parameters are precisely estimated by minimizing the proposed cost function consisting of both the angular and the Euclidean errors. Furthermore, using the estimated parameters, the 3-D POR is exactly determined based on the two-view geometry. Compared with the conventional methods, the proposed method provides superior results in experiments considering various factors such as the noise level, head movement, and camera configuration. In real experiments, we achieved an average Euclidean error of 12.6 cm and an average angular error of 0.98° when estimating the 3-D coordinates of PORs that were 4-8 m away from the user.

KW - Calibration

KW - Cameras

KW - Computer vision

KW - Cost function

KW - Gaze tracking

KW - Human computer interaction

KW - Measurement uncertainty

KW - Robustness

KW - Solid modeling

UR - http://www.scopus.com/inward/record.url?scp=85052705706&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85052705706&partnerID=8YFLogxK

U2 - 10.1109/ACCESS.2018.2867235

DO - 10.1109/ACCESS.2018.2867235

M3 - Article

JO - IEEE Access

JF - IEEE Access

SN - 2169-3536

ER -