Multisensory integration in the estimation of walked distances

Jennifer L. Campos, John S. Butler, Heinrich Bülthoff

Research output: Contribution to journal › Article

36 Citations (Scopus)

Abstract

When walking through space, both dynamic visual information (optic flow) and body-based information (proprioceptive and vestibular) jointly specify the magnitude of distance travelled. While recent evidence has demonstrated the extent to which each of these cues can be used independently, less is known about how they are integrated when simultaneously present. Many studies have shown that sensory information is integrated using a weighted linear sum, yet little is known about whether this holds true for the integration of visual and body-based cues for travelled distance perception. In this study using Virtual Reality technologies, participants first travelled a predefined distance and subsequently matched this distance by adjusting an egocentric, in-depth target. The visual stimulus consisted of a long hallway and was presented in stereo via a head-mounted display. Body-based cues were provided either by walking in a fully tracked free-walking space (Exp. 1) or by being passively moved in a wheelchair (Exp. 2). Travelled distances were provided either through optic flow alone, body-based cues alone or through both cues combined. In the combined condition, visually specified distances were either congruent (1.0×) or incongruent (0.7× or 1.4×) with distances specified by body-based cues. Responses reflect a consistent combined effect of both visual and body-based information, with an overall higher influence of body-based cues when walking and a higher influence of visual cues during passive movement. When comparing the results of Experiments 1 and 2, it is clear that both proprioceptive and vestibular cues contribute to travelled distance estimates during walking. These observed results were effectively described using a basic linear weighting model.
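The basic linear weighting model referred to in the abstract can be sketched as a weighted sum of the two distance estimates. The weights and distances below are illustrative assumptions for intuition only, not the paper's fitted values.

```python
# Sketch of a basic linear cue-weighting model for travelled-distance
# estimates. All numeric values are hypothetical, not fitted parameters.

def combined_estimate(d_visual, d_body, w_body):
    """Weighted linear sum of visual and body-based distance estimates.

    w_body is the weight on body-based cues; the visual weight is
    (1 - w_body), so the two weights sum to one.
    """
    return w_body * d_body + (1 - w_body) * d_visual

# Incongruent trial: optic flow specifies 1.4x the physically moved distance.
walked = 10.0        # metres specified by body-based cues (hypothetical)
seen = 1.4 * walked  # metres specified by optic flow

# Walking condition: body-based cues weighted more heavily (hypothetical).
print(combined_estimate(seen, walked, w_body=0.7))  # estimate pulled toward 10 m
# Passive (wheelchair) condition: visual cues weighted more heavily.
print(combined_estimate(seen, walked, w_body=0.3))  # estimate pulled toward 14 m
```

Because the weights sum to one, the combined estimate always lies between the two single-cue distances, shifted toward whichever cue carries the larger weight in that movement condition.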

Original language: English
Pages (from-to): 551-565
Number of pages: 15
Journal: Experimental Brain Research
Volume: 218
Issue number: 4
DOIs: 10.1007/s00221-012-3048-1
Publication status: Published - 2012 May 1

Keywords

  • Distance estimation
  • Multisensory integration
  • Optic flow
  • Proprioception
  • Self-motion
  • Vestibular

ASJC Scopus subject areas

  • Neuroscience (all)

Cite this

Multisensory integration in the estimation of walked distances. / Campos, Jennifer L.; Butler, John S.; Bülthoff, Heinrich.

In: Experimental Brain Research, Vol. 218, No. 4, 01.05.2012, p. 551-565.

Campos, Jennifer L. ; Butler, John S. ; Bülthoff, Heinrich. / Multisensory integration in the estimation of walked distances. In: Experimental Brain Research. 2012 ; Vol. 218, No. 4. pp. 551-565.
@article{4abb5a7fde1149b9955cea84b941d75c,
title = "Multisensory integration in the estimation of walked distances",
abstract = "When walking through space, both dynamic visual information (optic flow) and body-based information (proprioceptive and vestibular) jointly specify the magnitude of distance travelled. While recent evidence has demonstrated the extent to which each of these cues can be used independently, less is known about how they are integrated when simultaneously present. Many studies have shown that sensory information is integrated using a weighted linear sum, yet little is known about whether this holds true for the integration of visual and body-based cues for travelled distance perception. In this study using Virtual Reality technologies, participants first travelled a predefined distance and subsequently matched this distance by adjusting an egocentric, in-depth target. The visual stimulus consisted of a long hallway and was presented in stereo via a head-mounted display. Body-based cues were provided either by walking in a fully tracked free-walking space (Exp. 1) or by being passively moved in a wheelchair (Exp. 2). Travelled distances were provided either through optic flow alone, body-based cues alone or through both cues combined. In the combined condition, visually specified distances were either congruent (1.0×) or incongruent (0.7× or 1.4×) with distances specified by body-based cues. Responses reflect a consistent combined effect of both visual and body-based information, with an overall higher influence of body-based cues when walking and a higher influence of visual cues during passive movement. When comparing the results of Experiments 1 and 2, it is clear that both proprioceptive and vestibular cues contribute to travelled distance estimates during walking. These observed results were effectively described using a basic linear weighting model.",
keywords = "Distance estimation, Multisensory integration, Optic flow, Proprioception, Self-motion, Vestibular",
author = "Campos, {Jennifer L.} and Butler, {John S.} and Heinrich Bülthoff",
year = "2012",
month = "5",
day = "1",
doi = "10.1007/s00221-012-3048-1",
language = "English",
volume = "218",
pages = "551--565",
journal = "Experimental Brain Research",
issn = "0014-4819",
publisher = "Springer Verlag",
number = "4",

}

TY - JOUR

T1 - Multisensory integration in the estimation of walked distances

AU - Campos, Jennifer L.

AU - Butler, John S.

AU - Bülthoff, Heinrich

PY - 2012/5/1

Y1 - 2012/5/1

AB - When walking through space, both dynamic visual information (optic flow) and body-based information (proprioceptive and vestibular) jointly specify the magnitude of distance travelled. While recent evidence has demonstrated the extent to which each of these cues can be used independently, less is known about how they are integrated when simultaneously present. Many studies have shown that sensory information is integrated using a weighted linear sum, yet little is known about whether this holds true for the integration of visual and body-based cues for travelled distance perception. In this study using Virtual Reality technologies, participants first travelled a predefined distance and subsequently matched this distance by adjusting an egocentric, in-depth target. The visual stimulus consisted of a long hallway and was presented in stereo via a head-mounted display. Body-based cues were provided either by walking in a fully tracked free-walking space (Exp. 1) or by being passively moved in a wheelchair (Exp. 2). Travelled distances were provided either through optic flow alone, body-based cues alone or through both cues combined. In the combined condition, visually specified distances were either congruent (1.0×) or incongruent (0.7× or 1.4×) with distances specified by body-based cues. Responses reflect a consistent combined effect of both visual and body-based information, with an overall higher influence of body-based cues when walking and a higher influence of visual cues during passive movement. When comparing the results of Experiments 1 and 2, it is clear that both proprioceptive and vestibular cues contribute to travelled distance estimates during walking. These observed results were effectively described using a basic linear weighting model.

KW - Distance estimation

KW - Multisensory integration

KW - Optic flow

KW - Proprioception

KW - Self-motion

KW - Vestibular

UR - http://www.scopus.com/inward/record.url?scp=84862839623&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84862839623&partnerID=8YFLogxK

U2 - 10.1007/s00221-012-3048-1

DO - 10.1007/s00221-012-3048-1

M3 - Article

C2 - 22411581

AN - SCOPUS:84862839623

VL - 218

SP - 551

EP - 565

JO - Experimental Brain Research

JF - Experimental Brain Research

SN - 0014-4819

IS - 4

ER -