Visual and haptic perceptual spaces from parametrically-defined to natural objects

Nina Gaißert, Kirstin Ulrichs, Christian Wallraven

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

In this study we show that humans form very similar perceptual spaces when they explore parametrically-defined shell-shaped objects visually or haptically. A physical object space was generated by varying three shape parameters. Sighted participants explored pictures of these objects, while blindfolded participants haptically explored 3D printouts of the objects. Similarity ratings were collected and analyzed using multidimensional scaling (MDS) techniques. Visual and haptic similarity ratings were highly correlated and yielded very similar visual and haptic MDS maps, providing evidence for a single shared perceptual space underlying both modalities. To investigate to what degree these results transfer to natural objects, we performed the same visual and haptic similarity ratings and multidimensional scaling analyses using a set of natural sea shells.
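The analysis described in the abstract (pairwise similarity ratings, modality correlation, and MDS embedding) can be illustrated with a minimal sketch. This is not the authors' actual pipeline; the object count, the toy rating generator, and all variable names are illustrative assumptions. It shows how a precomputed dissimilarity matrix derived from similarity ratings could be correlated across modalities and embedded with metric MDS.

```python
# Minimal sketch (illustrative only, not the study's analysis code):
# correlate visual and haptic similarity ratings and embed each
# rating matrix in a low-dimensional "perceptual space" via MDS.
import numpy as np
from scipy.stats import pearsonr
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_objects = 21  # hypothetical number of shell-shaped stimuli


def toy_ratings():
    """Generate a symmetric toy similarity matrix (1 = dissimilar, 7 = similar).
    In the study these values would come from participants' ratings."""
    r = rng.integers(1, 8, size=(n_objects, n_objects)).astype(float)
    r = (r + r.T) / 2.0
    np.fill_diagonal(r, 7.0)
    return r


visual = toy_ratings()
haptic = toy_ratings()

# Correlate the off-diagonal rating vectors of the two modalities.
iu = np.triu_indices(n_objects, k=1)
r, p = pearsonr(visual[iu], haptic[iu])
print(f"visual-haptic correlation: r = {r:.2f}, p = {p:.3f}")


def mds_map(similarity, n_dims=3):
    """Convert similarities to dissimilarities and embed with metric MDS."""
    dissimilarity = similarity.max() - similarity
    mds = MDS(n_components=n_dims, dissimilarity="precomputed", random_state=0)
    return mds.fit_transform(dissimilarity)


visual_map = mds_map(visual)
haptic_map = mds_map(haptic)
```

With real rating data, similar visual and haptic MDS maps (up to rotation and scaling) would be the kind of evidence the abstract describes for a shared perceptual space.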

Original language: English
Title of host publication: AAAI Spring Symposium - Technical Report
Pages: 2-7
Number of pages: 6
Volume: SS-10-02
Publication status: Published - 2010 Oct 21
Externally published: Yes
Event: 2010 AAAI Spring Symposium - Stanford, CA, United States
Duration: 2010 Mar 22 - 2010 Mar 24

Other

Other: 2010 AAAI Spring Symposium
Country: United States
City: Stanford, CA
Period: 10/3/22 - 10/3/24

ASJC Scopus subject areas

  • Artificial Intelligence

Cite this

Gaißert, N., Ulrichs, K., & Wallraven, C. (2010). Visual and haptic perceptual spaces from parametrically-defined to natural objects. In AAAI Spring Symposium - Technical Report (Vol. SS-10-02, pp. 2-7).