TY - JOUR
T1 - A multisensory approach to spatial updating
T2 - The case of mental rotations
AU - Vidal, Manuel
AU - Lehmann, Alexandre
AU - Bülthoff, Heinrich H.
N1 - Funding Information:
Acknowledgments: This work was presented at the IMRF 2008 conference. Manuel Vidal received a post-doctoral scholarship from the Max Planck Society and Alexandre Lehmann received a doctoral scholarship from the Centre National de la Recherche Scientifique. We are grateful to the workshop of the Max Planck Institute for the construction of the table set-up.
PY - 2009/7
Y1 - 2009/7
AB - Mental rotation is the capacity to predict the outcome of spatial relationships after a change in viewpoint. These changes arise either from the rotation of the test object array or from the rotation of the observer. Previous studies showed that the cognitive cost of mental rotations is reduced when viewpoint changes result from the observer's motion, an effect explained by the spatial updating mechanism engaged during self-motion. However, little is known about how the various sensory cues available contribute to updating performance. We used a Virtual Reality setup in a series of experiments to investigate table-top mental rotations under different combinations of the visual, body, and auditory modalities. We found that mental rotation performance gradually improved as sensory cues were added for the moving observer (from None to Body or Vision, and then to Body & Audition or Body & Vision), but that processing time dropped to the same level in all sensory contexts. These results are discussed in terms of an additive contribution of co-activated sensory modalities to the spatial updating mechanism involved in self-motion. Interestingly, this multisensory approach can account for differing findings reported in the literature.
KW - Mental rotations
KW - Multisensory
KW - Spatial updating
KW - Virtual reality
UR - http://www.scopus.com/inward/record.url?scp=67749120193&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=67749120193&partnerID=8YFLogxK
U2 - 10.1007/s00221-009-1892-4
DO - 10.1007/s00221-009-1892-4
M3 - Article
C2 - 19544058
AN - SCOPUS:67749120193
VL - 197
SP - 59
EP - 68
JO - Experimental Brain Research
JF - Experimental Brain Research
SN - 0014-4819
IS - 1
ER -