Bayesian motion estimation accounts for a surprising bias in 3D vision

Andrew E. Welchman, Judith M. Lam, Heinrich Bülthoff

Research output: Contribution to journal › Article

30 Citations (Scopus)

Abstract

Determining the approach of a moving object is a vital survival skill that depends on the brain combining information about lateral translation and motion-in-depth. Given the importance of sensing motion for obstacle avoidance, it is surprising that humans make errors, reporting an object will miss them when it is on a collision course with their head. Here we provide evidence that biases observed when participants estimate movement in depth result from the brain's use of a "prior" favoring slow velocity. We formulate a Bayesian model for computing 3D motion using independently estimated parameters for the shape of the visual system's slow velocity prior. We demonstrate the success of this model in accounting for human behavior in separate experiments that assess both sensitivity and bias in 3D motion estimation. Our results show that a surprising perceptual error in 3D motion perception reflects the importance of prior probabilities when estimating environmental properties.
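The abstract's central idea, that a zero-centred "slow velocity" prior pulls noisy velocity estimates toward slower speeds, can be illustrated with a minimal sketch. The code below is not the paper's fitted model (whose prior shape was estimated independently per observer); it simply assumes Gaussian likelihoods and a Gaussian prior, with illustrative noise values, to show how stronger shrinkage of the less reliably measured motion-in-depth component tilts the estimated trajectory away from a head-on course.

import numpy as np

# Minimal sketch (not the paper's fitted model): MAP estimation of a 2D
# velocity (lateral vx, motion-in-depth vz) with a Gaussian likelihood
# around the measured velocity and a zero-mean Gaussian "slow velocity"
# prior. All numbers below are illustrative assumptions, not values from
# the study.

def map_velocity(measured, sigma_like, sigma_prior):
    """Posterior mode per component for Gaussian likelihood and prior.

    measured    : measured velocity components
    sigma_like  : per-component measurement noise (standard deviation)
    sigma_prior : standard deviation of the zero-mean slow-velocity prior
    """
    measured = np.asarray(measured, dtype=float)
    sigma_like = np.asarray(sigma_like, dtype=float)
    # Posterior mode = measurement * sigma_prior^2 / (sigma_prior^2 + sigma_like^2):
    # noisier components are pulled more strongly toward zero.
    shrink = sigma_prior**2 / (sigma_prior**2 + sigma_like**2)
    return shrink * measured

# Example: an object approaching the head (vz is motion toward the observer).
# If motion-in-depth is measured less reliably than lateral motion (an
# assumption here), the prior shrinks vz more strongly than vx, so the
# estimated trajectory angle tilts away from head-on.
true_v = np.array([0.5, 2.0])                              # [vx, vz], arbitrary units
estimate = map_velocity(true_v, sigma_like=[0.2, 1.0], sigma_prior=1.0)

true_angle = np.degrees(np.arctan2(true_v[0], true_v[1]))
est_angle = np.degrees(np.arctan2(estimate[0], estimate[1]))
print(f"true trajectory angle:      {true_angle:.1f} deg from head-on")
print(f"estimated trajectory angle: {est_angle:.1f} deg from head-on")

With these assumed numbers the true trajectory lies about 14 degrees from head-on, while the MAP estimate lies about 26 degrees from head-on: an approaching object is judged more likely to pass by the head, which is the direction of the bias the abstract describes.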

Original language: English
Pages (from-to): 12087-12092
Number of pages: 6
Journal: Proceedings of the National Academy of Sciences of the United States of America
Volume: 105
Issue number: 33
DOIs: 10.1073/pnas.0804378105
Publication status: Published - 2008 Aug 19
Externally published: Yes

Fingerprint

  • Motion Perception
  • Brain
  • Head
  • Survival

Keywords

  • Bayes
  • Binocular disparity
  • Motion perception
  • Stereopsis

ASJC Scopus subject areas

  • Genetics
  • General

Cite this

Bayesian motion estimation accounts for a surprising bias in 3D vision. / Welchman, Andrew E.; Lam, Judith M.; Bülthoff, Heinrich.

In: Proceedings of the National Academy of Sciences of the United States of America, Vol. 105, No. 33, 19.08.2008, p. 12087-12092.

Research output: Contribution to journal › Article

@article{0da81d7bfbc84c5188e8804c0839b64e,
title = "Bayesian motion estimation accounts for a surprising bias in 3D vision",
abstract = "Determining the approach of a moving object is a vital survival skill that depends on the brain combining information about lateral translation and motion-in-depth. Given the importance of sensing motion for obstacle avoidance, it is surprising that humans make errors, reporting an object will miss them when it is on a collision course with their head. Here we provide evidence that biases observed when participants estimate movement in depth result from the brain's use of a {"}prior{"} favoring slow velocity. We formulate a Bayesian model for computing 3D motion using independently estimated parameters for the shape of the visual system's slow velocity prior. We demonstrate the success of this model in accounting for human behavior in separate experiments that assess both sensitivity and bias in 3D motion estimation. Our results show that a surprising perceptual error in 3D motion perception reflects the importance of prior probabilities when estimating environmental properties.",
keywords = "Bayes, Binocular disparity, Motion perception, Stereopsis",
author = "Welchman, {Andrew E.} and Lam, {Judith M.} and Heinrich B{\"u}lthoff",
year = "2008",
month = "8",
day = "19",
doi = "10.1073/pnas.0804378105",
language = "English",
volume = "105",
pages = "12087--12092",
journal = "Proceedings of the National Academy of Sciences of the United States of America",
issn = "0027-8424",
number = "33",

}

TY - JOUR

T1 - Bayesian motion estimation accounts for a surprising bias in 3D vision

AU - Welchman, Andrew E.

AU - Lam, Judith M.

AU - Bülthoff, Heinrich

PY - 2008/8/19

Y1 - 2008/8/19

N2 - Determining the approach of a moving object is a vital survival skill that depends on the brain combining information about lateral translation and motion-in-depth. Given the importance of sensing motion for obstacle avoidance, it is surprising that humans make errors, reporting an object will miss them when it is on a collision course with their head. Here we provide evidence that biases observed when participants estimate movement in depth result from the brain's use of a "prior" favoring slow velocity. We formulate a Bayesian model for computing 3D motion using independently estimated parameters for the shape of the visual system's slow velocity prior. We demonstrate the success of this model in accounting for human behavior in separate experiments that assess both sensitivity and bias in 3D motion estimation. Our results show that a surprising perceptual error in 3D motion perception reflects the importance of prior probabilities when estimating environmental properties.

AB - Determining the approach of a moving object is a vital survival skill that depends on the brain combining information about lateral translation and motion-in-depth. Given the importance of sensing motion for obstacle avoidance, it is surprising that humans make errors, reporting an object will miss them when it is on a collision course with their head. Here we provide evidence that biases observed when participants estimate movement in depth result from the brain's use of a "prior" favoring slow velocity. We formulate a Bayesian model for computing 3D motion using independently estimated parameters for the shape of the visual system's slow velocity prior. We demonstrate the success of this model in accounting for human behavior in separate experiments that assess both sensitivity and bias in 3D motion estimation. Our results show that a surprising perceptual error in 3D motion perception reflects the importance of prior probabilities when estimating environmental properties.

KW - Bayes

KW - Binocular disparity

KW - Motion perception

KW - Stereopsis

UR - http://www.scopus.com/inward/record.url?scp=50149095742&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=50149095742&partnerID=8YFLogxK

U2 - 10.1073/pnas.0804378105

DO - 10.1073/pnas.0804378105

M3 - Article

VL - 105

SP - 12087

EP - 12092

JO - Proceedings of the National Academy of Sciences of the United States of America

JF - Proceedings of the National Academy of Sciences of the United States of America

SN - 0027-8424

IS - 33

ER -