Transfer of object category knowledge across visual and haptic modalities: experimental and computational studies
- PMID: 23102553
- DOI: 10.1016/j.cognition.2012.08.005
Abstract
We study people's abilities to transfer object category knowledge across visual and haptic domains. If a person learns to categorize objects based on inputs from one sensory modality, can the person categorize these same objects when the objects are perceived through another modality? Can the person categorize novel objects from the same categories when these objects are, again, perceived through another modality? Our work makes three contributions. First, by fabricating Fribbles (3-D, multi-part objects with a categorical structure), we developed visual-haptic stimuli that are highly complex and realistic, and thus more ecologically valid than objects that are typically used in haptic or visual-haptic experiments. Based on these stimuli, we developed the See and Grasp data set, a data set containing both visual and haptic features of the Fribbles, and are making this data set freely available on the world wide web. Second, complementary to previous research such as studies asking if people transfer knowledge of object identity across visual and haptic domains, we conducted an experiment evaluating whether people transfer object category knowledge across these domains. Our data clearly indicate that we do. Third, we developed a computational model that learns multisensory representations of prototypical 3-D shape. Similar to previous work, the model uses shape primitives to represent parts, and spatial relations among primitives to represent multi-part objects. However, it is distinct in its use of a Bayesian inference algorithm allowing it to acquire multisensory representations, and sensory-specific forward models allowing it to predict visual or haptic features from multisensory representations. The model provides an excellent qualitative account of our experimental data, thereby illustrating the potential importance of multisensory representations and sensory-specific forward models to multisensory perception.
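The abstract's architecture can be illustrated with a minimal sketch: a shared multisensory shape representation per category, two sensory-specific forward models that render it as visual or haptic features, and Bayesian (maximum-posterior) category inference. All names, dimensions, and the linear-Gaussian forward models below are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch of cross-modal category transfer. A category is
# defined by a prototype multisensory shape vector; sensory-specific
# forward models map that shape to visual or haptic features. An object
# from a visually defined category is then classified from haptic input.
import numpy as np

rng = np.random.default_rng(0)

D_SHAPE, D_VIS, D_HAP = 4, 6, 5
W_vis = rng.normal(size=(D_VIS, D_SHAPE))  # visual forward model (assumed linear)
W_hap = rng.normal(size=(D_HAP, D_SHAPE))  # haptic forward model (assumed linear)
SIGMA = 0.1                                # sensory noise (assumed Gaussian)

# Two categories, each defined by a prototype multisensory shape.
prototypes = {"cat_A": rng.normal(size=D_SHAPE),
              "cat_B": rng.normal(size=D_SHAPE)}

def render(shape, W):
    """Sensory-specific forward model: shape -> noisy sensory features."""
    return W @ shape + rng.normal(scale=SIGMA, size=W.shape[0])

def log_likelihood(features, shape, W):
    """Gaussian log p(features | shape) under a forward model."""
    resid = features - W @ shape
    return -0.5 * np.sum(resid ** 2) / SIGMA ** 2

def classify(features, W):
    """Posterior-maximizing category (uniform prior over categories)."""
    return max(prototypes, key=lambda c: log_likelihood(features, prototypes[c], W))

# Categories are defined over the shared shape space (e.g. learned
# visually); a held-out haptic observation is classified correctly
# because the representation itself is modality-independent.
haptic_obs = render(prototypes["cat_A"], W_hap)
print(classify(haptic_obs, W_hap))
```

The key design point mirrored here is that category knowledge lives in the shared shape space, while each modality contributes only a forward model, so knowledge acquired through one modality transfers to the other without retraining.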
Copyright © 2012 Elsevier B.V. All rights reserved.
Similar articles
- Similarity and categorization: from vision to touch. Acta Psychol (Amst). 2011 Sep;138(1):219-30. doi: 10.1016/j.actpsy.2011.06.007. PMID: 21752344
- Vision holds a greater share in visuo-haptic object recognition than touch. Neuroimage. 2013 Jan 15;65:59-68. doi: 10.1016/j.neuroimage.2012.09.054. PMID: 23032487
- The left fusiform gyrus hosts trisensory representations of manipulable objects. Neuroimage. 2011 Jun 1;56(3):1566-77. doi: 10.1016/j.neuroimage.2011.02.032. PMID: 21334444
- Multisensory object representation: insights from studies of vision and touch. Prog Brain Res. 2011;191:165-76. doi: 10.1016/B978-0-444-53752-2.00006-0. PMID: 21741551. Review.
- The experience of force: the role of haptic experience of forces in visual perception of object motion and interactions, mental simulation, and motion-related judgments. Psychol Bull. 2012 Jul;138(4):589-615. doi: 10.1037/a0025587. PMID: 22730922. Review.
Cited by
- Learning multisensory representations for auditory-visual transfer of sequence category knowledge: a probabilistic language of thought approach. Psychon Bull Rev. 2015 Jun;22(3):673-86. doi: 10.3758/s13423-014-0734-y. PMID: 25338656
- The computational origin of representation. Minds Mach (Dordr). 2021 Mar;31:1-58. doi: 10.1007/s11023-020-09540-9. PMID: 34305318. Free PMC article.
- One model for the learning of language. Proc Natl Acad Sci U S A. 2022 Feb 1;119(5):e2021865119. doi: 10.1073/pnas.2021865119. PMID: 35074868. Free PMC article.
- An integrative computational architecture for object-driven cortex. Curr Opin Neurobiol. 2019 Apr;55:73-81. doi: 10.1016/j.conb.2019.01.010. PMID: 30825704. Free PMC article. Review.
- The look and feel of soft are similar across different softness dimensions. J Vis. 2021 Sep 1;21(10):20. doi: 10.1167/jov.21.10.20. PMID: 34581768. Free PMC article.