Object Feature Validation Using Visual and Haptic Similarity Ratings

Theresa Cooke, Sebastian Kannengiesser, Christian Wallraven, Heinrich Bülthoff

Research output: Contribution to journal › Article

13 Citations (Scopus)

Abstract

The perceived similarity between objects may well vary according to the sensory modality/modalities in which they are experienced, an important consideration for the design of multimodal interfaces. In this study, we present a similarity-based method for comparing the perceptual importance of object properties in touch and in vision and show how the method can also be used to validate computational measures of object properties. Using either vision or touch, human subjects judged the similarity between novel 3D objects that varied parametrically in shape and texture. Similarities were also computed using a set of state-of-the-art 2D and 3D computational measures. Two resolutions of 2D and 3D object data were used for these computations in order to test for scale dependencies. Multidimensional scaling (MDS) was then performed on all similarity data, yielding maps of the stimuli in both perceptual and computational spaces, as well as the relative weight of shape and texture dimensions. For this object set, we found that visual subjects accorded more importance to shape than texture, while haptic subjects weighted them roughly evenly. Fit errors between human and computational maps were then calculated to assess each feature's perceptual validity. Shape-biased features provided good overall fits to the human visual data; however, no single feature yielded a good overall fit to the haptic data, in which we observed large individual differences. This work demonstrates how MDS techniques can be used to evaluate computational object features using the criterion of perceptual similarity. It also demonstrates a way of assessing how the perceptual validity of a feature varies as a function of parameters such as the target modality and the resolution of object data. Potential applications of this method for the design of unimodal and multimodal human–machine interfaces are discussed.
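As an illustration only, the following Python sketch shows one way a similarity-based comparison of this kind could be set up: pairwise dissimilarities are embedded with metric MDS, and a perceptual map is then compared against a computational map via a Procrustes fit error. The random data, the MDS variant, and the disparity-based fit error are assumptions made for the sketch, not the paper's actual implementation.

import numpy as np
from scipy.spatial import procrustes
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
n_objects = 9  # hypothetical object set, e.g., a small shape-by-texture grid

def stand_in_dissimilarity(n):
    # Symmetric dissimilarity matrix with a zero diagonal (placeholder data only).
    d = rng.random((n, n))
    d = (d + d.T) / 2.0
    np.fill_diagonal(d, 0.0)
    return d

human_dissim = stand_in_dissimilarity(n_objects)    # stand-in for averaged similarity ratings
feature_dissim = stand_in_dissimilarity(n_objects)  # stand-in for a computational feature's distances

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
human_map = mds.fit_transform(human_dissim)      # perceptual map of the stimuli
feature_map = mds.fit_transform(feature_dissim)  # computational map of the stimuli

# Procrustes superimposition aligns the two configurations and returns a
# normalized sum-of-squared-differences disparity, used here as the fit error.
_, _, fit_error = procrustes(human_map, feature_map)
print(f"Fit error (Procrustes disparity): {fit_error:.3f}")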

Original language: English
Pages (from-to): 239-261
Number of pages: 23
Journal: ACM Transactions on Applied Perception
Volume: 3
Issue number: 3
DOIs: 10.1145/1166087.1166093
Publication status: Published - 2006
Externally published: Yes

Keywords

  • Experimentation
  • features
  • haptic
  • Human Factors
  • Measurement
  • multidimensional scaling
  • perception
  • shape
  • Similarity
  • texture
  • touch
  • validation
  • vision

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
  • Experimental and Cognitive Psychology

Cite this

Object Feature Validation Using Visual and Haptic Similarity Ratings. / Cooke, Theresa; Kannengiesser, Sebastian; Wallraven, Christian; Bülthoff, Heinrich.

In: ACM Transactions on Applied Perception, Vol. 3, No. 3, 2006, p. 239-261.

Research output: Contribution to journal › Article

@article{c6fcc67974b141eebf3c009cda4a8672,
title = "Object Feature Validation Using Visual and Haptic Similarity Ratings",
abstract = "The perceived similarity between objects may well vary according to the sensory modality/modalities in which they are experienced, an important consideration for the design of multimodal interfaces. In this study, we present a similarity-based method for comparing the perceptual importance of object properties in touch and in vision and show how the method can also be used to validate computational measures of object properties. Using either vision or touch, human subjects judged the similarity between novel, 3D objects which varied parametrically in shape and texture. Similarities were also computed using a set of state-of-the art 2D and 3D computational measures. Two resolutions of 2D and 3D object data were used for these computations in order to test for scale dependencies. Multidimensional scaling (MDS) was then performed on all similarity data, yielding maps of the stimuli in both perceptual and computational spaces, as well as the relative weight of shape and texture dimensions. For this object set, we found that visual subjects accorded more importance to shape than texture, while haptic subjects weighted them roughly evenly. Fit errors between human and computational maps were then calculated to assess each feature's perceptual validity. Shape-biased features provided good overall fits to the human visual data; however, no single feature yielded a good overall fit to the haptic data, in which we observed large individual differences. This work demonstrates how MDS techniques can be used to evaluate computational object features using the criterion of perceptual similarity. It also demonstrates a way of assessing how the perceptual validity of a feature varies as a function of parameters such as the target modality and the resolution of object data. Potential applications of this method for the design of unimodal and multimodal human–machine interfaces are discussed.",
keywords = "Experimentation, features, haptic, Human Factors, Measurement, multidimensional scaling, perception, shape, Similarity, texture, touch, validation, vision",
author = "Theresa Cooke and Sebastian Kannengiesser and Christian Wallraven and Heinrich Bulthoff",
year = "2006",
doi = "10.1145/1166087.1166093",
language = "English",
volume = "3",
pages = "239--261",
journal = "ACM Transactions on Applied Perception",
issn = "1544-3558",
publisher = "Association for Computing Machinery (ACM)",
number = "3",

}
