MPI cybermotion simulator: Implementation of a novel motion simulator to investigate multisensory path integration in three dimensions

Michael Barnett-Cowan, Tobias Meilinger, Manuel Vidal, Harald Teufel, Heinrich H. Bülthoff

Research output: Contribution to journal › Article

18 Citations (Scopus)

Abstract

Path integration is a process in which self-motion is integrated over time to obtain an estimate of one's current position relative to a starting point [1]. Humans can perform path integration based exclusively on visual [2,3], auditory [4], or inertial cues [5]. However, when multiple cues are present, inertial cues - particularly kinaesthetic - seem to dominate [6,7]. In the absence of vision, humans tend to overestimate short distances (<5 m) and turning angles (<30°), but to underestimate longer ones [5]. Movement through physical space therefore does not seem to be accurately represented by the brain. Extensive work has been done on evaluating path integration in the horizontal plane, but little is known about vertical movement (see [3] for virtual movement from vision alone). One reason for this is that traditional motion simulators have a small range of motion, restricted mainly to the horizontal plane. Here we take advantage of a motion simulator [8,9] with a large range of motion to assess whether path integration is similar between the horizontal and vertical planes. The relative contributions of inertial and visual cues to path navigation were also assessed. Sixteen observers sat upright in a seat mounted to the flange of a modified KUKA anthropomorphic robot arm. Sensory information was manipulated by providing visual (optic flow, limited-lifetime star field), vestibular-kinaesthetic (passive self-motion with eyes closed), or combined visual and vestibular-kinaesthetic motion cues. Movement trajectories in the horizontal, sagittal, and frontal planes consisted of two segments (first: 0.4 m; second: 1 m; ±0.24 m/s² peak acceleration). The angle between the two segments was either 45° or 90°. Observers pointed back to their origin by moving an arrow superimposed on an avatar presented on the screen. Observers were more likely to underestimate angle size for movement in the horizontal plane than in the vertical planes. In the frontal plane observers were more likely to overestimate angle size, while there was no such bias in the sagittal plane. Finally, observers responded more slowly when answering based on vestibular-kinaesthetic information alone. Human path integration based on vestibular-kinaesthetic information alone thus takes longer than when visual information is present. The finding that pointing is consistent with underestimating and overestimating the angle moved through in the horizontal and vertical planes, respectively, suggests that the neural representation of self-motion through space is non-symmetrical, which may relate to the fact that humans experience movement mostly within the horizontal plane.
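Although not part of the original article, the geometry of the two-segment pointing task described above can be sketched in a few lines of code. The sketch below is an illustration only, not the authors' analysis code; the function name and sign conventions are assumptions. It computes the correct egocentric pointing-back angle for the segment lengths and turn angles used in the experiment, treating each trajectory as lying in a single 2D plane:

```python
import math

def homing_direction(seg1, seg2, turn_deg):
    """Correct pointing-back angle (degrees) for a two-segment path:
    travel seg1 metres forward, turn by turn_deg, travel seg2 metres.
    The result is measured relative to the final heading; 0 deg would
    mean the origin lies straight ahead, +/-180 deg directly behind."""
    turn = math.radians(turn_deg)
    # End position, taking the first segment along the +y axis.
    x = seg2 * math.sin(turn)
    y = seg1 + seg2 * math.cos(turn)
    # World bearing of the origin as seen from the end point.
    bearing = math.degrees(math.atan2(-x, -y))
    # Express relative to the final heading and wrap to [-180, 180).
    return (bearing - turn_deg + 180.0) % 360.0 - 180.0

# The paper's segment lengths (0.4 m, 1 m) with a 90 degree turn:
# the origin lies roughly 158 degrees off the final heading.
angle = homing_direction(0.4, 1.0, 90.0)
```

Comparing an observer's reported arrow direction against this ideal angle is one simple way to quantify the over- and underestimation biases the abstract reports for the different movement planes.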

Original language: English
Article number: e3436
Journal: Journal of Visualized Experiments
Issue number: 63
DOIs
Publication status: Published - 2012 May 10

Keywords

  • Cybernetics
  • Issue 63
  • Motion simulator
  • Multisensory integration
  • Neuroscience
  • Path integration
  • Robotics
  • Space perception
  • Vestibular
  • Vision

ASJC Scopus subject areas

  • Neuroscience (all)
  • Chemical Engineering (all)
  • Biochemistry, Genetics and Molecular Biology (all)
  • Immunology and Microbiology (all)
