Sound facilitates visual learning
- PMID: 16860741
- DOI: 10.1016/j.cub.2006.05.048
Abstract
Numerous studies show that practice can result in performance improvements on low-level visual perceptual tasks [1-5]. However, such learning is characteristically difficult and slow, requiring many days of training [6-8]. Here, we show that a multisensory audiovisual training procedure facilitates visual learning and results in significantly faster learning than unisensory visual training. We trained one group of subjects with an audiovisual motion-detection task and a second group with a visual motion-detection task, and compared performance on trials containing only visual signals across ten days of training. Whereas observers in both groups showed improvements of visual sensitivity with training, subjects trained with multisensory stimuli showed significantly more learning both within and across training sessions. These benefits of multisensory training are particularly surprising given that the learning of visual motion stimuli is generally thought to be mediated by low-level visual brain areas [6, 9, 10]. Although crossmodal interactions are ubiquitous in human perceptual processing [11-13], the contribution of crossmodal information to perceptual learning has not been studied previously. Our results show that multisensory interactions can be exploited to yield more efficient learning of sensory information and suggest that multisensory training programs would be most effective for the acquisition of new skills.
Similar articles
- Benefits of multisensory learning. Trends Cogn Sci. 2008 Nov;12(11):411-7. doi: 10.1016/j.tics.2008.07.006. PMID: 18805039. Review.
- Irrelevant visual stimuli improve auditory task performance. Neuroreport. 2008 Mar 26;19(5):553-7. doi: 10.1097/WNR.0b013e3282f8b1b6. PMID: 18388737.
- The role of multisensory memories in unisensory object discrimination. Brain Res Cogn Brain Res. 2005 Jul;24(2):326-34. doi: 10.1016/j.cogbrainres.2005.02.005. PMID: 15993770.
- Selective integration of auditory-visual looming cues by humans. Neuropsychologia. 2009 Mar;47(4):1045-52. doi: 10.1016/j.neuropsychologia.2008.11.003. Epub 2008 Nov 12. PMID: 19041883.
- A unified model for perceptual learning. Trends Cogn Sci. 2005 Jul;9(7):329-34. doi: 10.1016/j.tics.2005.05.010. PMID: 15955722. Review.
Cited by
- The emergence of the multisensory brain: From the womb to the first steps. iScience. 2023 Dec 15;27(1):108758. doi: 10.1016/j.isci.2023.108758. eCollection 2024 Jan 19. PMID: 38230260. Free PMC article. Review.
- Stimulus-locked auditory information facilitates real-time visuo-motor sequence learning. Psychon Bull Rev. 2023 Sep 21. doi: 10.3758/s13423-023-02378-z. Online ahead of print. PMID: 37735341.
- Comparative Effectiveness of eConsent: Systematic Review. J Med Internet Res. 2023 Sep 1;25:e43883. doi: 10.2196/43883. PMID: 37656499. Free PMC article. Review.
- Crossmodal interactions in human learning and memory. Front Hum Neurosci. 2023 May 17;17:1181760. doi: 10.3389/fnhum.2023.1181760. eCollection 2023. PMID: 37266327. Free PMC article.
- The role of visual and auditory information in social event segmentation. Q J Exp Psychol (Hove). 2024 Mar;77(3):626-638. doi: 10.1177/17470218231176471. Epub 2023 May 26. PMID: 37154602. Free PMC article.
