Do touch and vision yield the same judgments of similarity between objects? We investigated this question using psychophysical experiments in which subjects rated the similarity between objects presented either visually or haptically. The stimuli were a family of novel, three-dimensional objects whose microgeometry ("texture") and macrogeometry ("shape") were parametrically varied. Multidimensional scaling of the similarity data was used to reconstruct haptic and visual perceptual spaces. For both modalities, a two-dimensional perceptual space was found whose dimensions clearly corresponded to shape and texture. Interestingly, shape dominated in visual space, whereas both shape and texture were important in haptic space. Furthermore, stimulus clusters were observed in this space, suggesting the emergence of category structure based on similarity relationships. The same category boundaries were confirmed in a visual free-sorting experiment. This study reveals differences in object processing across modalities and demonstrates an approach for analyzing such differences in multisensory visualizations.
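The multidimensional-scaling step mentioned above can be illustrated with a minimal sketch. This uses classical (Torgerson) MDS, one standard variant; the paper's specific MDS method and data are not given here, so the dissimilarity matrix below is fabricated for illustration only.

```python
import numpy as np

def classical_mds(d, k=2):
    """Classical (Torgerson) MDS: embed n objects in k dimensions from a
    symmetric n x n pairwise dissimilarity matrix d."""
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    b = -0.5 * j @ (d ** 2) @ j                # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(b)             # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:k]           # keep the top-k eigenvalues
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

# Fabricated dissimilarities for four objects (here, points on a line at
# positions 0, 1, 3, 6, so a 2-D embedding can reproduce them exactly).
d = np.array([
    [0.0, 1.0, 3.0, 6.0],
    [1.0, 0.0, 2.0, 5.0],
    [3.0, 2.0, 0.0, 3.0],
    [6.0, 5.0, 3.0, 0.0],
])

coords = classical_mds(d, k=2)                 # one 2-D point per object
# Distances between embedded points should match the input dissimilarities.
recovered = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
print(coords.shape)        # (4, 2)
print(np.allclose(recovered, d))  # True
```

In the study's setting, the rows and columns of `d` would be the parametrically varied objects, with entries derived from the subjects' similarity ratings; the axes of the recovered configuration are then interpreted post hoc (here, as shape and texture).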