Vision Res. 2021 May;182:58-68.
doi: 10.1016/j.visres.2021.01.008. Epub 2021 Feb 17.

Multisensory perception in Argus II retinal prosthesis patients: Leveraging auditory-visual mappings to enhance prosthesis outcomes


Noelle R B Stiles et al. Vision Res. 2021 May.

Abstract

Crossmodal mappings associate features (such as spatial location) between audition and vision, thereby aiding sensory binding and perceptual accuracy. Previously, it has been unclear whether patients with artificial vision will develop crossmodal mappings despite the low spatial and temporal resolution of their visual perception (particularly in light of the remodeling of the retina and visual cortex that takes place during decades of vision loss). To address this question, we studied crossmodal mappings psychophysically in Retinitis Pigmentosa patients with partial visual restoration by means of Argus II retinal prostheses, which incorporate an electrode array implanted on the retinal surface that stimulates still-viable ganglion cells with a video stream from a head-mounted camera. We found that Argus II patients (N = 10) exhibit significant crossmodal mappings between auditory location and visual location, and between auditory pitch and visual elevation, equivalent to those of age-matched sighted controls (N = 10). Furthermore, Argus II patients (N = 6) were able to use crossmodal mappings to locate a visual target more quickly with auditory cueing than without. Overall, restored artificial vision was shown to interact with audition via crossmodal mappings, which implies that the reorganization during blindness and the limitations of artificial vision did not prevent the relearning of crossmodal mappings. In particular, cueing based on crossmodal mappings was shown to improve visual search with a retinal prosthesis. This result represents a key first step toward leveraging crossmodal interactions for improved patient visual functionality.

Keywords: Auditory-visual integration; Crossmodal mappings; Retinal prostheses; Vision restoration and blindness.


Conflict of interest statement

The authors declare no competing interests.

Figures

Fig. 1. Argus II Retinal Prosthesis System Components
The external components of the Argus II Retinal Prosthesis System. Visual information is recorded by the glasses-mounted camera and fed to a Visual Processing Unit (VPU), which translates it into electrical stimulation parameters. The signal is then sent via wire to a coil on the glasses, which communicates wirelessly through an implanted coil to a microelectrode stimulator array proximity-coupled to the retina. Additional details on the Argus II device are provided in the Methods section.
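The caption above describes a camera-to-electrode processing chain: camera frame in, per-electrode stimulation parameters out. The following is a minimal, hypothetical sketch of such a mapping; the grid shape matches the Argus II's 6 × 10 electrode layout, but the block-averaging and amplitude scaling here are illustrative stand-ins, not the VPU's actual (proprietary) processing.

```python
import numpy as np

def frame_to_stimulation(frame, grid_shape=(6, 10), max_amp_ua=100.0):
    """Downsample a grayscale camera frame to a coarse electrode grid
    and map mean block brightness to a stimulation amplitude.

    Hypothetical illustration only: grid_shape follows the Argus II's
    6 x 10 electrode layout; the averaging and linear amplitude scaling
    are assumptions, not the device's documented algorithm.
    """
    frame = np.asarray(frame, dtype=float)
    h, w = frame.shape
    gh, gw = grid_shape
    # Crop so the frame tiles evenly, then average each electrode's block.
    blocks = frame[: h - h % gh, : w - w % gw]
    blocks = blocks.reshape(gh, h // gh, gw, w // gw).mean(axis=(1, 3))
    # Normalize 8-bit intensity to [0, 1] and scale to a current amplitude.
    return (blocks / 255.0) * max_amp_ua

# Example: a synthetic 480 x 640 frame with one bright region.
frame = np.zeros((480, 640))
frame[100:200, 300:400] = 255
amps = frame_to_stimulation(frame)
print(amps.shape)  # (6, 10)
```

Only the electrodes whose image blocks overlap the bright region receive a nonzero amplitude, which is the intuition behind the coarse "phosphene" percepts the prosthesis produces.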
Fig. 2. Argus II Patient Timelines
Timeline for natural visual perception, vision loss, and vision restoration with the Argus II retinal prosthesis in each of the patients tested. See also Table 1.
Fig. 3. Crossmodal Correspondences Results in Argus II and Sighted Subjects (Experiment 1)
The experimental setup at the University of Southern California for both crossmodal correspondence experiments is shown in Panel A. The fraction correct for matching auditory to visual stimuli is shown in Panel B for both Argus II patients (N = 10) and age-matched sighted controls (N = 10). The fraction correct for matching auditory to visual stimuli in Argus II patients (N = 10) is shown in Panels C and D (Panel C for the mapping of auditory location to visual location, and Panel D for the mapping of auditory pitch to visual elevation). The individual fraction-correct results in Panels C and D lack error bars because each represents a single data point per patient; the averages in Panels C and D do have error bars, as they are computed across patients. Error bars represent ± standard error of the mean across participants. Dashed lines represent chance. See also Figs. S4 and S5, and Table S2.
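The averages and error bars described in this caption are a standard mean ± SEM computation across participants, compared against a chance line. A generic sketch follows; the per-patient fractions and the 0.5 chance level are invented placeholders, not the study's data.

```python
import math

# Hypothetical per-patient fraction-correct values (N = 10) for one
# crossmodal-matching condition; NOT the study's actual measurements.
fractions = [0.9, 0.8, 0.85, 0.7, 0.95, 0.75, 0.8, 0.9, 0.65, 0.85]

n = len(fractions)
mean = sum(fractions) / n
# Standard error of the mean across participants: sample SD / sqrt(n).
sd = math.sqrt(sum((x - mean) ** 2 for x in fractions) / (n - 1))
sem = sd / math.sqrt(n)

chance = 0.5  # assumed two-alternative chance level for illustration
print(f"mean = {mean:.3f} +/- {sem:.3f} (chance = {chance})")
```

Performance is then judged by whether the mean minus its error margin stays above the chance line, which is what the dashed lines in Panels B–D let the reader check visually.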
Fig. 4. Visual Search Task Setup and Results with Argus II Patients (Experiment 3)
A representative image of the visual stimuli in the visual search task is shown in Panel A, with the visual target and visual distractors labeled. The time-to-detection results for the visual search task in Argus II patients (N = 6) are shown in Panel B. The full length of each error bar represents ± standard error of the mean across trials for individual results, and across participants for average results. See also Fig. S6.
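Panel B's comparison of cued versus uncued search times is a within-subject contrast: each patient is measured under both conditions. A generic sketch of such a paired analysis follows, with invented timing values rather than the study's measurements.

```python
import math

# Hypothetical per-patient mean detection times in seconds (N = 6),
# NOT the study's data: search with vs. without auditory cueing.
cued = [4.1, 5.3, 3.8, 6.0, 4.7, 5.1]
uncued = [5.9, 6.8, 4.5, 7.4, 6.0, 6.6]

diffs = [u - c for u, c in zip(uncued, cued)]
n = len(diffs)
mean_d = sum(diffs) / n
sd = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
# Paired t statistic: mean within-subject difference over its standard error.
t = mean_d / (sd / math.sqrt(n))
print(f"mean speedup = {mean_d:.2f} s, paired t({n - 1}) = {t:.2f}")
```

A positive mean difference means cueing shortened search, mirroring the paper's finding that patients located the target more quickly with auditory cueing than without.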
