TY - JOUR
T1 - Similarity and categorization
T2 - From vision to touch
AU - Gaißert, Nina
AU - Bülthoff, Heinrich H.
AU - Wallraven, Christian
N1 - Funding Information:
This research was supported by a PhD stipend from the Max Planck Society. Part of this research was also supported by the World Class University (WCU) program through the National Research Foundation of Korea funded by the Ministry of Education, Science and Technology (R31-2008-000-10008-0).
PY - 2011/9
Y1 - 2011/9
N2 - Even though human perceptual development relies on combining multiple modalities, most categorization studies so far have focused on the visual modality. To better understand the mechanisms underlying multisensory categorization, we analyzed visual and haptic perceptual spaces and compared them with human categorization behavior. As stimuli we used a three-dimensional object space of complex, parametrically defined objects. First, we gathered similarity ratings for all objects and analyzed the perceptual spaces of both modalities using multidimensional scaling analysis. Next, we performed three different categorization tasks that are representative of everyday learning scenarios: in a fully unconstrained task, objects were freely categorized; in a semi-constrained task, exactly three groups had to be created; and in a constrained task, participants received three prototype objects and had to assign all other objects accordingly. We found that the haptic modality was on par with the visual modality both in recovering the topology of the physical space and in solving the categorization tasks. We also found that within-category similarity was consistently higher than across-category similarity for all categorization tasks, and we thus show how perceptual spaces based on similarity can explain visual and haptic object categorization. Our results suggest that both modalities employ similar processes in forming categories of complex objects.
AB - Even though human perceptual development relies on combining multiple modalities, most categorization studies so far have focused on the visual modality. To better understand the mechanisms underlying multisensory categorization, we analyzed visual and haptic perceptual spaces and compared them with human categorization behavior. As stimuli we used a three-dimensional object space of complex, parametrically defined objects. First, we gathered similarity ratings for all objects and analyzed the perceptual spaces of both modalities using multidimensional scaling analysis. Next, we performed three different categorization tasks that are representative of everyday learning scenarios: in a fully unconstrained task, objects were freely categorized; in a semi-constrained task, exactly three groups had to be created; and in a constrained task, participants received three prototype objects and had to assign all other objects accordingly. We found that the haptic modality was on par with the visual modality both in recovering the topology of the physical space and in solving the categorization tasks. We also found that within-category similarity was consistently higher than across-category similarity for all categorization tasks, and we thus show how perceptual spaces based on similarity can explain visual and haptic object categorization. Our results suggest that both modalities employ similar processes in forming categories of complex objects.
KW - Categorization
KW - Haptics
KW - Multisensory perception
KW - Perceptual spaces
KW - Similarity
KW - Vision
UR - http://www.scopus.com/inward/record.url?scp=80052875119&partnerID=8YFLogxK
U2 - 10.1016/j.actpsy.2011.06.007
DO - 10.1016/j.actpsy.2011.06.007
M3 - Article
C2 - 21752344
AN - SCOPUS:80052875119
SN - 0001-6918
VL - 138
SP - 219
EP - 230
JO - Acta Psychologica
JF - Acta Psychologica
IS - 1
ER -