Comparative Study
J Neurosci. 2005 May 4;25(18):4616-4625. doi: 10.1523/JNEUROSCI.0455-05.2005.

Multisensory space representations in the macaque ventral intraparietal area

Anja Schlack et al. J Neurosci.

Abstract

Animals can use different sensory signals to localize objects in the environment. Depending on the situation, the brain either integrates information from multiple sensory sources or it chooses the modality conveying the most reliable information to direct behavior. This suggests that, somehow, the brain has access to a modality-invariant representation of external space. Accordingly, neural structures encoding signals from more than one sensory modality are best suited for spatial information processing. In primates, the posterior parietal cortex (PPC) is a key structure for spatial representations. One substructure within human and macaque PPC is the ventral intraparietal area (VIP), known to represent visual, vestibular, and tactile signals. In the present study, we show for the first time that macaque area VIP neurons also respond to auditory stimulation. Interestingly, the strength of the responses to the acoustic stimuli greatly depended on the spatial location of the stimuli [i.e., most of the auditory-responsive neurons had surprisingly small, spatially restricted auditory receptive fields (RFs)]. Given this finding, we compared the auditory RF locations with the respective visual RF locations of individual area VIP neurons. In the vast majority of neurons, the auditory and visual RFs largely overlapped. Additionally, neurons with well-aligned visual and auditory receptive fields tended to encode multisensory space in a common reference frame. This suggests that area VIP constitutes a part of a neuronal circuit involved in the computation of a modality-invariant representation of external space.


Figures

Figure 1.
Auditory and visual RF mapping. In both sensory modalities, our mapping range (A, B) covered the central 60 × 60° of frontal extrapersonal space. This mapping area was divided into a virtual square grid of 36 patches, each 10 × 10° (dashed lines in A, B). In each trial, either four auditory (A) or six visual (B) stimuli appeared in a pseudorandomized order at the different grid positions. The numbers in A and B represent two example sequences for an auditory and a visual trial, respectively. C depicts the time courses of auditory and visual trials. The numbers within the auditory and visual stimulus traces link the time courses of the sensory stimulations to the stimulus locations in A and B. The first stimulus in each trial appeared 400 ms after the monkey had achieved central fixation, as indicated by the schematic traces of the horizontal and vertical eye position signals. Visual stimulation lasted for 200 ms, followed by a 200 ms interval without stimulation. Auditory stimulation consisted of 80 ms white-noise bursts, each followed by 410 ms without stimulation. See Materials and Methods for details.
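The grid geometry and trial timing described in this caption can be sketched in Python. The timing constants (400 ms fixation lead-in, 200/200 ms visual on/off, 80/410 ms auditory burst/gap, 4 vs. 6 stimuli per trial) come from the caption; the function names and the use of `random.sample` for pseudorandomization are illustrative assumptions:

```python
import random

GRID_DEG = 60   # mapped region: central 60 x 60 deg of frontal space
PATCH_DEG = 10  # each patch is 10 x 10 deg -> a 6 x 6 grid of 36 patches

def make_patches():
    """Return the 36 patch centers (azimuth, elevation) in degrees."""
    half = GRID_DEG / 2
    n = GRID_DEG // PATCH_DEG
    centers = []
    for row in range(n):
        for col in range(n):
            az = -half + PATCH_DEG / 2 + col * PATCH_DEG
            el = half - PATCH_DEG / 2 - row * PATCH_DEG
            centers.append((az, el))
    return centers

def make_trial(modality, rng=random):
    """Draw a pseudorandom stimulus sequence for one trial.

    Auditory trials present 4 stimuli (80 ms noise burst + 410 ms gap);
    visual trials present 6 stimuli (200 ms on + 200 ms off). The first
    stimulus starts 400 ms after central fixation is achieved.
    Returns a list of (onset_ms, (azimuth, elevation)) pairs.
    """
    n, on_ms, off_ms = (4, 80, 410) if modality == "auditory" else (6, 200, 200)
    positions = rng.sample(make_patches(), n)  # no patch repeated in a trial
    onsets = [400 + i * (on_ms + off_ms) for i in range(n)]
    return list(zip(onsets, positions))
```

Note that in both modalities one stimulus cycle lasts 490 ms (auditory) or 400 ms (visual), so successive onsets fall at those intervals after the 400 ms fixation period.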
Figure 7.
Example of the spatially congruent visual (left) and auditory (middle) RFs of an individual VIP neuron. Same conventions as for the middle panel in Figure 3. The data were recorded while the monkey fixated a central target. The red and yellow sectors surrounded by the black outlines correspond to the RF locations (discharge >0.5 × maximum discharge). The crosses indicate the hotspots (i.e., the locations of the highest discharge within the RFs). The right panel shows a superposition of the outlines and hotspots of the RFs of both modalities. The two RFs largely overlapped, and the hotspots were almost identical. deg, Degrees.
Figure 10.
Example of three neurons with head-centered, intermediate, and eye-centered encoding of auditory spatial information. The left column shows the RF locations determined while the monkey fixated either 10° to the left (black RF; fixation position indicated by the black cross) or 10° to the right (white RF; fixation position indicated by the white cross). The RFs are plotted in a head-centered reference frame. In the right column, the very same RFs are plotted in eye-centered coordinates (fixation position indicated by the gray cross). The first cell (first row) was best described as encoding space in a head-centered coordinate system. The second cell (second row) was best fit by an intermediate encoding scheme, whereas the third cell (third row) encoded auditory information in an eye-centered reference frame (see Results for details).
Figure 3.
Auditory receptive field and variability map of a VIP neuron. In the left two panels, horizontal and vertical axes indicate the mapping range. The mean SE (left) or mean spike rate (middle) evoked by the stimulation of a given location is color coded. The color bar (bottom) shows which colors correspond to which spike rates. The level of spontaneous activity is indicated by the white line in the color bar. The red sector in the middle panel corresponds to the RF (see Materials and Methods). The discharge of this neuron was strongest for stimulations in the top central and left part of the mapping range. Stimulations in the region around the receptive field led to inhibition of the neuron (deep blue surround of the receptive field in the figure). The three panels on the right show peristimulus time histograms (PSTHs) and raster plots of the responses of this neuron to stimulation at three different locations (indicated by the black arrows). The vertical lines indicate the start of the auditory stimulus, and the horizontal line indicates the level of baseline activity (3.5 spikes/s). The top panel shows the response to stimulation at the preferred location. The middle panel depicts responses to a stimulation slightly below and to the right of the preferred stimulus location, leading to a response not different from spontaneous activity. The bottom panel shows responses to a stimulation 20° below the preferred stimulus location leading to an inhibitory response. The response latency of this neuron was 35 ms. The response of the neuron was significantly modulated by the stimulus for 150 ms (response duration, gray shaded area in the raster and PSTH plots). The receptive field diameter at a half-maximal response threshold was 30°. The VI for this neuron [i.e., the SE at the preferred location (see left panel) divided by the mean response at the same location (see middle panel)] was 0.36. deg, Degrees.
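The three quantities this caption defines from the 6 × 6 rate map (the hotspot, the RF as the region above half-maximal discharge, and the variability index VI = SE at the preferred location divided by the mean response there) can be sketched as follows. The array names and the exact thresholding convention are assumptions, not the paper's code:

```python
import numpy as np

def analyze_rf(mean_rate, se_rate):
    """Extract RF statistics from a grid map of responses.

    mean_rate, se_rate: 2-D arrays (one entry per grid patch) holding the
    mean spike rate and its standard error at each stimulus location.
    Returns:
      hotspot -- (row, col) of the patch with the highest discharge,
      rf_mask -- boolean map of patches above half the maximal discharge
                 (the red sector in the figure),
      vi      -- SE at the hotspot divided by the mean response there.
    """
    mean_rate = np.asarray(mean_rate, dtype=float)
    se_rate = np.asarray(se_rate, dtype=float)
    hotspot = np.unravel_index(np.argmax(mean_rate), mean_rate.shape)
    rf_mask = mean_rate > 0.5 * mean_rate.max()
    vi = se_rate[hotspot] / mean_rate[hotspot]
    return hotspot, rf_mask, vi
```

For the neuron in the figure this yields VI = 0.36, i.e., the trial-to-trial SE at the preferred location was roughly a third of the mean response there.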
Figure 4.
Auditory receptive field examples. Same conventions as for the middle panel of Figure 3. The figure shows a variety of auditory receptive fields with different shapes, positions, and sizes. Each panel corresponds to the auditory RF plot of one area VIP neuron. deg, Degrees.
Figure 2.
Distribution of auditory latencies across VIP cells. Auditory responses occurred over a broad range of latencies, but there was a bias toward short latencies. More than one-half of the neurons (57%) had latencies shorter than 100 ms; 33% even had latencies shorter than 50 ms.
Figure 5.
Comparison of the mean activity evoked by stimulation in the hotspot of the auditory (vertical axis) and visual (horizontal axis) receptive fields for individual neurons (n = 81). Each circle represents the firing rates of a single neuron. The firing rate to visual stimulation tended to be higher than that to auditory stimulation, as indicated by the larger number of circles located below the bisector line.
Figure 6.
Comparison of the auditory (vertical axis) and visual (horizontal axis) VI for individual neurons (n = 81). Each circle represents the VI values of a single cell. The dashed line represents the bisector line. Circles on this line indicate cells with identical variability of response to visual and auditory stimulation. The values for visual and auditory VIs cover essentially the same range. However, there are more data points located above the bisector line than below. For such neurons, the auditory VI is higher than the visual VI (see Results for details).
Figure 8.
Distribution of intermodal RF overlap normalized to RF sizes (n = 81). The figure shows the distribution of spatial overlap of the visual and auditory RFs of the individual cells in a population scheme. For the majority of neurons (72.8%), the RFs from the two sensory modalities overlapped by ≥50%.
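A simple way to quantify "RF overlap normalized to RF size" on the mapping grid is the shared area of the two RF masks divided by the area of the smaller RF. This normalization choice is an assumption on our part; the paper's exact definition is given in its Materials and Methods:

```python
import numpy as np

def rf_overlap(rf_a, rf_b):
    """Overlap of two boolean RF masks defined on the same grid.

    Returns the shared area divided by the smaller of the two RF areas,
    so 1.0 means the smaller RF lies entirely within the larger one and
    0.0 means the RFs are disjoint. Normalizing to the smaller RF is an
    assumption; the paper's definition may differ.
    """
    rf_a = np.asarray(rf_a, dtype=bool)
    rf_b = np.asarray(rf_b, dtype=bool)
    shared = np.logical_and(rf_a, rf_b).sum()
    return shared / min(rf_a.sum(), rf_b.sum())
```

Under this reading, the ≥50% criterion in the caption corresponds to `rf_overlap(...) >= 0.5`.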
Figure 9.
Distribution of optimal intermodal RF offsets normalized to RF sizes (n = 80). Each open circle corresponds to the offset value of one neuron (see Materials and Methods). The gray area indicates the limit of 50% of the RF size. Symbols inside this area refer to neurons with visual RFs that had to be displaced less than one-half of their respective RF size to obtain the best possible intermodal RF match.
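The "optimal offset" of this figure can be read as the displacement of the visual RF that best matches the auditory RF, reported relative to RF size. A sketch of that search on the mapping grid follows; the exhaustive shift search, the `np.roll` wrap-around, and normalizing by the square root of the RF area are all assumptions for illustration:

```python
import numpy as np

def best_offset(rf_vis, rf_aud, max_shift=3):
    """Find the grid shift of the visual RF that best matches the auditory RF.

    Tries every displacement up to max_shift patches in each direction,
    scores each shifted visual RF by its shared area with the auditory RF,
    and returns the best (row, col) displacement together with that
    displacement's length normalized to the visual RF size (square root
    of its area, in patches). The normalization is an assumption.
    """
    rf_vis = np.asarray(rf_vis, dtype=bool)
    rf_aud = np.asarray(rf_aud, dtype=bool)
    best, best_score = (0, 0), -1
    for dr in range(-max_shift, max_shift + 1):
        for dc in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(rf_vis, dr, axis=0), dc, axis=1)
            score = np.logical_and(shifted, rf_aud).sum()
            if score > best_score:
                best, best_score = (dr, dc), score
    size = np.sqrt(rf_vis.sum())
    return best, np.hypot(*best) / size
```

With this reading, the gray area in the figure contains neurons whose normalized offset is below 0.5, i.e., the visual RF had to be displaced by less than half its own size for the best intermodal match.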
Figure 11.
Visual and auditory latencies for neurons encoding space in head-centered, intermediate, or eye-centered coordinates. The top shows the mean latencies (±SE) of neurons that encoded auditory space in head-centered, intermediate, or eye-centered reference frames. The bottom shows the same for visual latencies. In both sensory modalities, latencies tended to be shortest for neurons using the native reference frame of the respective sensory system (eye-centered for the visual, head-centered for the auditory stimulus domain). Latencies were longest for neurons that used reference frames that required coordinate transformations to take place (see Results for details).
Figure 12.
Distribution of reference frames within the auditory and visual RF population. The panel on the left shows the proportion of neurons encoding auditory space in the respective reference frames (n = 91). The panel on the right shows the respective data for visual space (n = 124).
Figure 13.
Comparison between the respective reference frames in the visual and auditory domain for single neurons (n = 81). The auditory shift index is plotted against the visual shift index. Each data point indicates the value for a single neuron. The squares represent the limits of our classification into the three subclasses of reference frames (eye centered, intermediate, and head centered). Dots in the lighter squares (i.e., squares centered on the oblique line) indicate neurons with a similar visual and auditory reference frame. For neurons represented by dots below these sectors, the reference frame for visual space is shifted toward an eye-centered representation compared with the respective reference frame for auditory space. Dots above these sectors indicate neurons with a more eye-centered encoding of space in the auditory than in the visual domain. Filled dots represent neurons with spatially coinciding visual and auditory RFs during central fixation. Of these, the ones filled in gray mark cells with different reference frames, and the ones filled in black mark cells with similar reference frames.
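The shift index underlying Figures 10 and 13 can be sketched as follows. With the two fixation positions 10° left and 10° right (20° apart), an RF that moves fully with the eyes shifts by the whole 20° between fixations (index ≈ 1, eye-centered), while an RF anchored to the head does not shift at all (index ≈ 0, head-centered). The 1/3 and 2/3 class cutoffs below are illustrative assumptions, not the paper's criteria:

```python
def shift_index(rf_center_left, rf_center_right, gaze_shift_deg=20.0):
    """Shift index: RF displacement divided by the gaze displacement.

    rf_center_left / rf_center_right: horizontal RF-center positions (deg,
    head-centered coordinates) measured at the left and right fixation
    positions. Returns ~0 for head-centered and ~1 for eye-centered coding.
    """
    return (rf_center_right - rf_center_left) / gaze_shift_deg

def classify(index, lo=1 / 3, hi=2 / 3):
    """Assign one of the three reference-frame classes.

    The lo/hi cutoffs are illustrative; the paper's classification limits
    are the squares shown in Figure 13.
    """
    if index < lo:
        return "head-centered"
    if index > hi:
        return "eye-centered"
    return "intermediate"
```

Plotting `shift_index` for the auditory against the visual domain of each neuron reproduces the layout of Figure 13, where dots on the oblique line mark neurons using the same reference frame in both modalities.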
