Learning to integrate arbitrary signals from vision and touch
- PMID: 18217847
- DOI: 10.1167/7.5.7
Abstract
When different perceptual signals of the same physical property are integrated, for example, an object's size, which can be seen and felt, they form a more reliable sensory estimate (e.g., M. O. Ernst & M. S. Banks, 2002). This, however, implies that the sensory system already knows which signals belong together and how they relate; in other words, the system has to know the mapping between the signals. In a Bayesian model of cue integration, this prior knowledge can be made explicit. Here, we ask whether such a mapping between two arbitrary sensory signals from vision and touch can be learned from their statistical co-occurrence such that they become integrated. In the Bayesian framework, this means changing the belief about the distribution of the stimuli. To this end, we trained subjects with two stimulus dimensions that are usually unrelated in the world: the luminance of an object (visual signal) and its stiffness (haptic signal). In the training phase, we presented subjects with combinations of these two signals that were artificially correlated, thereby introducing a new mapping between them: for example, the stiffer the object, the brighter it was. We measured the influence of learning by comparing discrimination performance before and after training. The prediction is that integration makes discrimination worse for stimuli that are incongruent with the newly learned mapping, because integration would cause this incongruency to disappear perceptually. The more certain subjects are about the new mapping, the stronger its influence on discrimination performance should be. Thus, learning in this context is about acquiring beliefs. Comparing trials with congruent and incongruent stimuli, we found a significant change in discrimination performance between the pre- and post-training measurements: after training, discrimination thresholds for the incongruent stimuli were elevated relative to thresholds for congruent stimuli, suggesting that subjects effectively learned to integrate the two formerly unrelated signals.
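For readers unfamiliar with the integration rule the abstract builds on, the following is a minimal Python sketch of reliability-weighted cue combination in the spirit of Ernst & Banks (2002). The function and all numerical values are illustrative assumptions, not values or code from the study; it is meant only to show why fusing two cues sharpens the joint estimate while perceptually absorbing a conflict between them.

```python
import numpy as np

def integrate(mu_v, sigma_v, mu_h, sigma_h):
    """Fuse a visual and a haptic estimate of the same property.

    Each cue is weighted by its reliability (inverse variance);
    the fused estimate has lower variance than either cue alone.
    Illustrative sketch only, not the study's implementation.
    """
    r_v, r_h = 1.0 / sigma_v**2, 1.0 / sigma_h**2  # cue reliabilities
    w_v = r_v / (r_v + r_h)                        # visual weight
    mu = w_v * mu_v + (1.0 - w_v) * mu_h           # fused estimate
    sigma = np.sqrt(1.0 / (r_v + r_h))             # fused uncertainty
    return mu, sigma

# Congruent cues: the fused estimate is more precise than either cue.
print(integrate(mu_v=10.0, sigma_v=1.0, mu_h=10.0, sigma_h=2.0))

# Incongruent cues: both are pulled toward one fused value, so the
# conflict disappears perceptually -- which is why, as the abstract
# predicts, integration impairs discrimination of incongruent pairs.
print(integrate(mu_v=10.0, sigma_v=1.0, mu_h=14.0, sigma_h=2.0))
```

Note that the fused standard deviation is identical in both calls: integration reduces uncertainty regardless of congruency, which is exactly what makes an incongruent pair harder, not easier, to tell apart once the mapping is believed.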
Similar articles
- Transfer of object category knowledge across visual and haptic modalities: experimental and computational studies. Cognition. 2013 Feb;126(2):135-48. doi: 10.1016/j.cognition.2012.08.005. PMID: 23102553.
- Robust cue integration: a Bayesian model and evidence from cue-conflict studies with stereoscopic and figure cues to slant. J Vis. 2007 May 23;7(7):5.1-24. doi: 10.1167/7.7.5. PMID: 17685801.
- Vision and touch are automatically integrated for the perception of sequences of events. J Vis. 2006 Apr 20;6(5):554-64. doi: 10.1167/6.5.2. PMID: 16881788.
- Tactual perception: a review of experimental variables and procedures. Cogn Process. 2012 Nov;13(4):285-301. doi: 10.1007/s10339-012-0443-2. PMID: 22669262. Review.
- Visual-haptic mapping and the origin of cross-modal identity. Optom Vis Sci. 2009 Jun;86(6):595-8. doi: 10.1097/OPX.0b013e3181a72999. PMID: 19417705. Review.
Cited by
- Learning from vision-to-touch is different than learning from touch-to-vision. Front Integr Neurosci. 2012 Nov 20;6:105. doi: 10.3389/fnint.2012.00105. PMID: 23181012.
- Why we are not all synesthetes (not even weakly so). Psychon Bull Rev. 2013 Aug;20(4):643-64. doi: 10.3758/s13423-013-0387-2. PMID: 23413012. Review.
- On the Role of Interoception in Body and Object Perception: A Multisensory-Integration Account. Perspect Psychol Sci. 2023 Mar;18(2):321-339. doi: 10.1177/17456916221096138. PMID: 35994810.
- How haptic size sensations improve distance perception. PLoS Comput Biol. 2011 Jun;7(6):e1002080. doi: 10.1371/journal.pcbi.1002080. PMID: 21738457.
- Humans use predictive kinematic models to calibrate visual cues to three-dimensional surface slant. J Neurosci. 2014 Jul 30;34(31):10394-401. doi: 10.1523/JNEUROSCI.1000-14.2014. PMID: 25080598.
