Iperception. 2016 Aug 23;7(4):2041669516664530. doi: 10.1177/2041669516664530. eCollection 2016 Jul-Aug.

Touching and Hearing Unseen Objects: Multisensory Effects on Scene Recognition

Simon J Hazenberg et al.

Abstract

In three experiments, we investigated the influence of object-specific sounds on haptic scene recognition without vision. Blindfolded participants had to recognize, through touch, spatial scenes comprising six objects that were placed on a round platform. Critically, in half of the trials, object-specific sounds were played when objects were touched (bimodal condition), while sounds were turned off in the other half of the trials (unimodal condition). After participants first explored the scene, two objects were swapped, and the task was to report which objects had swapped positions. In Experiment 1, geometrical objects and simple sounds were used, while in Experiment 2, the objects comprised toy animals that were matched with semantically compatible animal sounds. In Experiment 3, we replicated Experiment 1, but now the experiment was preceded by a tactile-auditory object identification task in which participants learned to identify the objects based on tactile and auditory input. In each experiment, the results revealed a significant performance increase only after the switch from bimodal to unimodal blocks. Thus, it appears that the release from bimodal identification, from audio-tactile to tactile-only, produces a benefit that is not achieved in the reversed order, in which sound is added after experience with haptic-only exploration. We conclude that task-related factors other than mere bimodal identification cause the facilitation when switching from bimodal to unimodal conditions.

Keywords: haptics; multisensory integration; scene recognition; spatial updating.


Figures

Figure 1.
Set-up of Experiment 1. In each trial, the start position was the chair on the right. In the scene rotation condition, participants remained seated while the platform rotated 90° counterclockwise. In the observer movement condition, participants walked to the other chair while the platform remained stationary.
Figure 2.
A schematic of the platform viewed from the top. The black disk in the middle represents the hole in the platform through which the cables run. The other, smaller symbols depict the holes in which objects could be placed. Each of the five configurations used to create a scene is depicted by a gray symbol. The white disks are holes that were not used in the experiments.
Figure 3.
Results of Experiment 1. Performance in mean proportion correct for the first two blocks and the last two blocks. The left two bars depict the performance for participants who started with audio-tactile blocks and ended with tactile-only blocks and the right two bars depict the performance of participants who started with tactile-only blocks and ended with audio-tactile blocks. Error bars depict one standard error of the mean.
Figure 4.
Set-up of Experiment 2.
Figure 5.
Results of Experiment 2. Performance in mean proportion correct for the first two blocks (black bars) and the last two blocks (light gray bars). The left two bars depict the performance for participants who started with audio-tactile blocks and ended with tactile-only blocks and the right two bars depict the performance for participants who started with tactile-only blocks and ended with audio-tactile blocks. Error bars depict one standard error of the mean.
Figure 6.
Results of Experiment 3. Performance in mean proportion correct for the first two blocks (black bars) and the last two blocks (light gray bars). The left two bars depict the performance for participants who started with audio-tactile blocks and ended with tactile-only blocks and the right two bars depict the performance for participants who started with tactile-only blocks and ended with audio-tactile blocks. Error bars depict one standard error of the mean.
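A note on the error bars: the captions report them as one standard error of the mean, but the page does not spell out the computation. Under the conventional definition (an assumption here, not a detail stated in the article), the standard error of the mean proportion correct across n participants is

\[ \mathrm{SEM} = \frac{s}{\sqrt{n}}, \qquad s = \sqrt{\frac{1}{n-1}\sum_{i=1}^{n}\left(p_i - \bar{p}\right)^2}, \]

where p_i is participant i's proportion correct in the relevant blocks and \bar{p} is the group mean.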

