Categorizing natural objects: a comparison of the visual and the haptic modalities

Exp Brain Res. 2012 Jan;216(1):123-34. doi: 10.1007/s00221-011-2916-4. Epub 2011 Nov 3.

Abstract

Although the hands are the most important tool humans use to manipulate objects, little is known about haptic processing of natural objects. Here, we selected a unique set of natural objects, namely seashells, which vary along a variety of object features while sharing others across all stimuli. To interact correctly with objects, we have to identify or categorize them. For both processes, measuring similarities between objects is crucial. Our goal is to better understand the haptic similarity percept by comparing it to the visual similarity percept. First, direct similarity measures were analyzed using multidimensional scaling techniques to visualize the perceptual spaces of both modalities. We find that the visual and the haptic modality form almost identical perceptual spaces. Next, we performed three different categorization tasks. All tasks show that the haptic modality processes complex shapes with high accuracy. Moreover, we find that objects grouped into the same category form regions within the perceptual space. Hence, in both modalities, perceived similarity constitutes the basis for categorizing objects. Furthermore, both modalities focus on shape to form categories. Taken together, our results suggest that the same cognitive processes link haptic and visual similarity perception and the resulting categorization behavior.
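The multidimensional scaling step described in the abstract can be sketched along the following lines. This is a minimal illustration only, assuming pairwise dissimilarity ratings collected in each modality are available as a symmetric matrix; the data values and the scikit-learn setup below are hypothetical and not taken from the study.

    import numpy as np
    from sklearn.manifold import MDS

    # Hypothetical symmetric dissimilarity matrix for N stimuli
    # (e.g., averaged pairwise dissimilarity ratings); illustrative
    # values only, not data from the paper.
    D = np.array([
        [0.0, 1.2, 3.4, 3.1],
        [1.2, 0.0, 3.0, 2.8],
        [3.4, 3.0, 0.0, 0.9],
        [3.1, 2.8, 0.9, 0.0],
    ])

    # Metric MDS on the precomputed dissimilarities embeds the stimuli
    # in a 2-D "perceptual space" in which inter-point distances
    # approximate the rated dissimilarities.
    mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
    coords = mds.fit_transform(D)

    print(coords)       # one (x, y) coordinate per stimulus
    print(mds.stress_)  # goodness of fit of the embedding

Running the same embedding separately on the visual and the haptic dissimilarity matrices and comparing the resulting configurations (e.g., after a Procrustes alignment) is one way such perceptual spaces can be compared across modalities.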

Publication types

  • Comparative Study
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Animal Shells
  • Animals
  • Concept Formation*
  • Discrimination, Psychological*
  • Humans
  • Pattern Recognition, Visual / physiology*
  • Physical Stimulation
  • Reaction Time
  • Space Perception / physiology*
  • Statistics, Nonparametric
  • Touch / physiology
  • Touch Perception / physiology*