Vision is considered the dominant sense for spatial perception, yet how it contributes to the refinement of spatial perception in other modalities remains unclear. We therefore investigated the development of audio-tactile spatial integration, and the influence of visual experience on it, using a localization task in which participants judged the position of auditory, tactile, and audio-tactile stimuli. We tested sighted and blind children at different ages. In sighted children, tactile spatial precision stabilized earlier than auditory precision, and optimal audio-tactile integration was achieved only after 12 years of age. Conversely, blind children showed greater unisensory precision from a younger age, although their multisensory performance improved little with age. Overall, our findings suggest that optimal audio-tactile spatial integration develops late in childhood and that vision may play a pivotal role in this process; that is, the absence of vision prompts earlier development of other sensory modalities for processing bodily stimuli.

SUMMARY: Sighted children achieve optimal audio-tactile spatial integration only after 12 years of age, with bimodal precision aligning with MLE predictions in adolescence. Blind children show greater early unimodal localization precision than sighted peers. Tactile precision stabilizes earlier than auditory precision in sighted children, whereas blind children show the opposite developmental trajectory.
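The "MLE predictions" benchmark mentioned above refers to the standard maximum-likelihood cue-combination equations for optimal multisensory integration; the abstract does not spell them out, so the sketch below uses hypothetical unimodal localization SDs purely for illustration:

```python
import math

def mle_prediction(sigma_a: float, sigma_t: float) -> tuple[float, float]:
    """Predicted bimodal (audio-tactile) SD and auditory weight under
    maximum-likelihood (statistically optimal) cue integration."""
    var_a, var_t = sigma_a ** 2, sigma_t ** 2
    # Optimal bimodal variance is the product of the unimodal variances
    # divided by their sum, so it is always below the better single cue.
    sigma_at = math.sqrt(var_a * var_t / (var_a + var_t))
    # Each cue is weighted by its relative reliability (inverse variance).
    w_a = var_t / (var_a + var_t)
    return sigma_at, w_a

# Hypothetical unimodal SDs (e.g., in degrees of azimuth):
sigma_at, w_a = mle_prediction(sigma_a=6.0, sigma_t=4.0)
print(round(sigma_at, 2), round(w_a, 2))  # → 3.33 0.31
```

Comparing a child's measured bimodal precision against this prediction is the usual test of whether integration is "optimal": adult-like integrators match the predicted SD, while younger children typically do no better than their best single modality.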
Keywords: audio localization; blindness; development; multisensory integration; tactile localization.
© 2025 The Author(s). Developmental Science published by John Wiley & Sons Ltd.