A comparison of geometric- and regression-based mobile gaze-tracking

Björn Browatzki, Heinrich Bülthoff, Lewis L. Chuang

Research output: Contribution to journal › Article

6 Citations (Scopus)

Abstract

Video-based gaze-tracking systems are typically restricted in terms of their effective tracking space. This constraint limits the use of eyetrackers in studying mobile human behavior. Here, we compare two possible approaches for estimating the gaze of participants who are free to walk in a large space whilst looking at different regions of a large display. Geometrically, we linearly combined eye-in-head rotations and head-in-world coordinates to derive a gaze vector and its intersection with a planar display, relying on a head-mounted eyetracker and a body-motion tracker. Alternatively, we employed Gaussian process regression to estimate the gaze intersection directly from the input data. Our evaluation of both methods indicates that a regression approach can deliver results comparable to those of a geometric approach. The regression approach is favored, given that it has the potential for further optimization, provides confidence bounds for its gaze estimates, and offers greater flexibility in its implementation. Open-source software for the methods reported here is also provided for user implementation.
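The geometric approach described in the abstract amounts to rotating the eye-in-head gaze direction into world coordinates using the tracked head pose and intersecting the resulting ray with the display plane. The sketch below illustrates that computation; it is not the open-source software released with the paper, and the function name, frame conventions, and parameters are assumptions for illustration.

```python
import numpy as np

def gaze_on_display(head_pos, head_rot, eye_dir_head, plane_point, plane_normal):
    """Intersect a gaze ray with a planar display (illustrative sketch).

    head_pos     : (3,) head position in world coordinates (body-motion tracker)
    head_rot     : (3, 3) rotation matrix taking head-frame vectors to the world frame
    eye_dir_head : (3,) unit gaze direction in the head frame (head-mounted eyetracker)
    plane_point  : (3,) any point on the display plane
    plane_normal : (3,) unit normal of the display plane
    Returns the 3-D intersection point, or None if no valid intersection exists.
    """
    gaze_dir_world = head_rot @ eye_dir_head      # eye-in-head direction rotated into the world frame
    denom = np.dot(plane_normal, gaze_dir_world)
    if abs(denom) < 1e-9:                         # gaze ray is (nearly) parallel to the display
        return None
    t = np.dot(plane_normal, plane_point - head_pos) / denom
    if t < 0:                                     # display lies behind the observer
        return None
    return head_pos + t * gaze_dir_world
```

The regression alternative maps raw tracker readings to on-screen gaze coordinates directly. The following minimal sketch uses scikit-learn's Gaussian process regressor as a stand-in for the method evaluated in the paper; the feature layout and file names are hypothetical.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Hypothetical calibration data: each row of X holds tracker features
# (e.g., pupil position, head position, head orientation); each row of Y
# is the corresponding 2-D gaze point on the display.
X = np.load("calibration_features.npy")
Y = np.load("calibration_targets.npy")

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1.0)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, Y)

# Predict screen coordinates for new tracker samples; the returned standard
# deviation provides the per-estimate confidence bounds noted in the abstract.
X_new = np.load("test_features.npy")
mean, std = gpr.predict(X_new, return_std=True)
```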

Original language: English
Article number: 200
Journal: Frontiers in Human Neuroscience
Volume: 8
Issue number: 1 APR
DOI: 10.3389/fnhum.2014.00200
Publication status: Published - 2014 Apr 8

Keywords

  • Active vision
  • Calibration method
  • Eye movement
  • Eye tracking
  • Gaussian processes
  • Gaze measurement

ASJC Scopus subject areas

  • Psychiatry and Mental health
  • Neurology
  • Biological Psychiatry
  • Behavioral Neuroscience
  • Neuropsychology and Physiological Psychology

Cite this

A comparison of geometric- and regression-based mobile gaze-tracking. / Browatzki, Björn; Bülthoff, Heinrich; Chuang, Lewis L.

In: Frontiers in Human Neuroscience, Vol. 8, No. 1 APR, 200, 08.04.2014.

Research output: Contribution to journal › Article

@article{d3476c2853874a598162ac3307a896b3,
title = "A comparison of geometric- and regression-based mobile gaze-tracking",
abstract = "Video-based gaze-tracking systems are typically restricted in terms of their effective tracking space. This constraint limits the use of eyetrackers in studying mobile human behavior. Here, we compare two possible approaches for estimating the gaze of participants who are free to walk in a large space whilst looking at different regions of a large display. Geometrically, we linearly combined eye-in-head rotations and head-in-world coordinates to derive a gaze vector and its intersection with a planar display, by relying on the use of a head-mounted eyetrackerand body-motion tracker. Alternatively, we employed Gaussian process regression to estimate the gaze intersection directly from the input data itself. Our evaluation of both methods indicates that a regression approach can deliver comparable results to a geometric approach. The regression approach is favored, given that it has the potential for further optimization, provides confidence bounds for its gaze estimates and offers greater flexibility in its implementation. Open-source software for the methods reported here is also provided for user implementation.",
keywords = "Active vision, Calibration method, Eye movement, Eye tracking, Gaussian processes, Gaze measurement",
author = "Bj{\"o}rn Browatzki and Heinrich Bulthoff and Chuang, {Lewis L.}",
year = "2014",
month = "4",
day = "8",
doi = "10.3389/fnhum.2014.00200",
language = "English",
volume = "8",
journal = "Frontiers in Human Neuroscience",
issn = "1662-5161",
publisher = "Frontiers Research Foundation",
number = "1 APR",

}
