Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies

Jennifer L. Campos, John S. Butler, Heinrich Bülthoff

Research output: Contribution to journal › Article

16 Citations (Scopus)

Abstract

Recent research has provided evidence that visual and body-based cues (vestibular, proprioceptive and efference copy) are integrated using a weighted linear sum during walking and passive transport. However, little is known about the specific weighting of visual information when combined with proprioceptive inputs alone, in the absence of vestibular information about forward self-motion. Therefore, in this study, participants walked in place on a stationary treadmill while dynamic visual information was updated in real time via a head-mounted display. The task required participants to travel a predefined distance and subsequently match this distance by adjusting an egocentric, in-depth target using a game controller. Travelled distance information was provided either through visual cues alone, proprioceptive cues alone or both cues combined. In the combined cue condition, the relationship between the two cues was manipulated by either changing the visual gain across trials (0.7×, 1.0×, 1.4×; Exp. 1) or the proprioceptive gain across trials (0.7×, 1.0×, 1.4×; Exp. 2). Results demonstrated an overall higher weighting of proprioception over vision. These weights were scaled, however, as a function of which sensory input provided more stable information across trials. Specifically, when visual gain was constantly manipulated, proprioceptive weights were higher than when proprioceptive gain was constantly manipulated. These results therefore reveal interesting characteristics of cue-weighting within the context of unfolding spatio-temporal cue dynamics.
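The "weighted linear sum" integration scheme referenced in the abstract can be illustrated with a minimal sketch of reliability-based cue combination, in which each cue's weight is inversely proportional to its variance. The function name and all numeric values below are illustrative assumptions, not data from the paper.

```python
# Minimal sketch of weighted-linear-sum cue combination
# (reliability-weighted averaging). Illustrative values only;
# not the authors' implementation or their data.

def combined_estimate(d_vis, var_vis, d_prop, var_prop):
    """Combine visual and proprioceptive distance estimates.

    Each cue's weight is proportional to its reliability
    (inverse variance), so the more stable cue dominates.
    """
    w_vis = (1.0 / var_vis) / (1.0 / var_vis + 1.0 / var_prop)
    w_prop = 1.0 - w_vis
    estimate = w_vis * d_vis + w_prop * d_prop
    return estimate, w_vis, w_prop

# Example: proprioception assumed more reliable (lower variance),
# so it receives the higher weight, as in the paper's overall finding.
est, w_v, w_p = combined_estimate(d_vis=10.0, var_vis=4.0,
                                  d_prop=8.0, var_prop=1.0)
# w_v = 0.2, w_p = 0.8, est = 8.4
```

Under this scheme, repeatedly perturbing one cue's gain across trials would (on this account) lower that cue's effective reliability and shift weight toward the other cue, consistent with the scaling of weights the abstract reports.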

Original language: English
Pages (from-to): 3277-3289
Number of pages: 13
Journal: Experimental Brain Research
Volume: 232
Issue number: 10
DOIs: 10.1007/s00221-014-4011-0
Publication status: Published - 2014 Jan 1


ASJC Scopus subject areas

  • Neuroscience (all)

Cite this

Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies. / Campos, Jennifer L.; Butler, John S.; Bülthoff, Heinrich.

In: Experimental Brain Research, Vol. 232, No. 10, 01.01.2014, p. 3277-3289.


@article{a1b1fd622bc446cd82b9990e1c1cd241,
title = "Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies",
abstract = "Recent research has provided evidence that visual and body-based cues (vestibular, proprioceptive and efference copy) are integrated using a weighted linear sum during walking and passive transport. However, little is known about the specific weighting of visual information when combined with proprioceptive inputs alone, in the absence of vestibular information about forward self-motion. Therefore, in this study, participants walked in place on a stationary treadmill while dynamic visual information was updated in real time via a head-mounted display. The task required participants to travel a predefined distance and subsequently match this distance by adjusting an egocentric, in-depth target using a game controller. Travelled distance information was provided either through visual cues alone, proprioceptive cues alone or both cues combined. In the combined cue condition, the relationship between the two cues was manipulated by either changing the visual gain across trials (0.7×, 1.0×, 1.4×; Exp. 1) or the proprioceptive gain across trials (0.7×, 1.0×, 1.4×; Exp. 2). Results demonstrated an overall higher weighting of proprioception over vision. These weights were scaled, however, as a function of which sensory input provided more stable information across trials. Specifically, when visual gain was constantly manipulated, proprioceptive weights were higher than when proprioceptive gain was constantly manipulated. These results therefore reveal interesting characteristics of cue-weighting within the context of unfolding spatio-temporal cue dynamics.",
keywords = "Cue conflict, Distance estimation, Multisensory integration, Optic flow, Proprioception, Self-motion",
author = "Campos, {Jennifer L.} and Butler, {John S.} and B{\"u}lthoff, {Heinrich}",
year = "2014",
month = "1",
day = "1",
doi = "10.1007/s00221-014-4011-0",
language = "English",
volume = "232",
pages = "3277--3289",
journal = "Experimental Brain Research",
issn = "0014-4819",
publisher = "Springer Verlag",
number = "10",
}

TY - JOUR

T1 - Contributions of visual and proprioceptive information to travelled distance estimation during changing sensory congruencies

AU - Campos, Jennifer L.

AU - Butler, John S.

AU - Bülthoff, Heinrich

PY - 2014/1/1

Y1 - 2014/1/1

N2 - Recent research has provided evidence that visual and body-based cues (vestibular, proprioceptive and efference copy) are integrated using a weighted linear sum during walking and passive transport. However, little is known about the specific weighting of visual information when combined with proprioceptive inputs alone, in the absence of vestibular information about forward self-motion. Therefore, in this study, participants walked in place on a stationary treadmill while dynamic visual information was updated in real time via a head-mounted display. The task required participants to travel a predefined distance and subsequently match this distance by adjusting an egocentric, in-depth target using a game controller. Travelled distance information was provided either through visual cues alone, proprioceptive cues alone or both cues combined. In the combined cue condition, the relationship between the two cues was manipulated by either changing the visual gain across trials (0.7×, 1.0×, 1.4×; Exp. 1) or the proprioceptive gain across trials (0.7×, 1.0×, 1.4×; Exp. 2). Results demonstrated an overall higher weighting of proprioception over vision. These weights were scaled, however, as a function of which sensory input provided more stable information across trials. Specifically, when visual gain was constantly manipulated, proprioceptive weights were higher than when proprioceptive gain was constantly manipulated. These results therefore reveal interesting characteristics of cue-weighting within the context of unfolding spatio-temporal cue dynamics.

AB - Recent research has provided evidence that visual and body-based cues (vestibular, proprioceptive and efference copy) are integrated using a weighted linear sum during walking and passive transport. However, little is known about the specific weighting of visual information when combined with proprioceptive inputs alone, in the absence of vestibular information about forward self-motion. Therefore, in this study, participants walked in place on a stationary treadmill while dynamic visual information was updated in real time via a head-mounted display. The task required participants to travel a predefined distance and subsequently match this distance by adjusting an egocentric, in-depth target using a game controller. Travelled distance information was provided either through visual cues alone, proprioceptive cues alone or both cues combined. In the combined cue condition, the relationship between the two cues was manipulated by either changing the visual gain across trials (0.7×, 1.0×, 1.4×; Exp. 1) or the proprioceptive gain across trials (0.7×, 1.0×, 1.4×; Exp. 2). Results demonstrated an overall higher weighting of proprioception over vision. These weights were scaled, however, as a function of which sensory input provided more stable information across trials. Specifically, when visual gain was constantly manipulated, proprioceptive weights were higher than when proprioceptive gain was constantly manipulated. These results therefore reveal interesting characteristics of cue-weighting within the context of unfolding spatio-temporal cue dynamics.

KW - Cue conflict

KW - Distance estimation

KW - Multisensory integration

KW - Optic flow

KW - Proprioception

KW - Self-motion

UR - http://www.scopus.com/inward/record.url?scp=84912026012&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84912026012&partnerID=8YFLogxK

U2 - 10.1007/s00221-014-4011-0

DO - 10.1007/s00221-014-4011-0

M3 - Article

C2 - 24961739

AN - SCOPUS:84912026012

VL - 232

SP - 3277

EP - 3289

JO - Experimental Brain Research

JF - Experimental Brain Research

SN - 0014-4819

IS - 10

ER -