Nat Neurosci. 2014 Nov;17(11):1598-606. doi: 10.1038/nn.3834. Epub 2014 Oct 5.

Anchoring the neural compass: coding of local spatial reference frames in human medial parietal lobe

Steven A Marchette et al. Nat Neurosci. 2014 Nov.

Abstract

The neural systems that code for location and facing direction during spatial navigation have been investigated extensively; however, the mechanisms by which these quantities are referenced to external features of the world are not well understood. To address this issue, we examined behavioral priming and functional magnetic resonance imaging activity patterns while human subjects recalled spatial views from a recently learned virtual environment. Behavioral results indicated that imagined location and facing direction were represented during this task, and multivoxel pattern analyses indicated that the retrosplenial complex (RSC) was the anatomical locus of these spatial codes. Critically, in both cases, location and direction were defined on the basis of fixed elements of the local environment and generalized across geometrically similar local environments. These results suggest that RSC anchors internal spatial representations to local topographical features, thus allowing us to stay oriented while we navigate and retrieve from memory the experience of being in a particular place.

Figures

Figure 1. Map and images of the virtual environment
A. Map of the virtual park and the four museums. Each museum was oriented in a unique direction with respect to the surrounding park. Objects were displayed within alcoves, which are indicated by grey squares. Each alcove could only be viewed from one direction. B. Images of the exteriors, interiors and alcoves of each museum. C. Example screen shots from the fMRI version of the judgment of relative direction (JRD) task. Participants imagined themselves facing the bicycle and responded to indicate whether the lamp would be to their left or right from this view.
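
At bottom, each JRD trial is a small geometry problem: given an imagined standing point, an imagined heading, and a target object, decide whether the target falls to the left or right of the heading. The sketch below illustrates that computation with hypothetical 2-D coordinates and a cross-product sign test; none of the coordinates, object positions, or the specific test come from the paper.

```python
import numpy as np

# Hypothetical museum-frame coordinates; these values are purely
# illustrative and are not taken from the study's environment.
viewer_pos = np.array([1.0, 1.0])   # imagined standing position
facing_pos = np.array([1.0, 3.0])   # object being faced (e.g. the bicycle)
target_pos = np.array([3.0, 2.0])   # object being judged (e.g. the lamp)

facing_vec = facing_pos - viewer_pos    # imagined heading
target_vec = target_pos - viewer_pos    # direction to the judged object

# The sign of the 2-D cross product gives the side of the heading on
# which the target lies: positive -> left, negative -> right.
cross = facing_vec[0] * target_vec[1] - facing_vec[1] * target_vec[0]
print("left" if cross > 0 else "right")   # -> "right" for these values
```
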
Figure 2. Summary of analysis scheme
A. Contrasts used to test for coding of facing direction. Top panel shows all views for two of the four museums; views that face the same direction as defined by the local museum frame are colored the same. Middle panel shows comparisons between view 1 and other views that face the same or different local direction, within and across museums. To partially control for location, view 1 is never compared to views in the same corner (i.e. views 1, 8, 9 and 16 are excluded). Bottom panel shows a test for direction coding that completely controls for location: in this case, the same-direction comparison view is located in the same corner as the different-direction comparison view. B. Contrasts used to test for coding of location. Top panel shows all views for two of the four museums; views located in the same corner of the environment (defined by the local museum frame) are colored the same. Middle panel shows comparisons between view 1 and other views located in the same or different corner, within and across museums. To partially control for direction, view 1 is never compared to views facing the same local direction (i.e. views 1, 2, 9 and 10 are excluded). Bottom panel shows a test for location coding that completely controls for direction: in this case, the same-location comparison view faces the same direction as the different-location comparison view.
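
The contrast cells in panel A can be enumerated mechanically once each view carries a museum label, a local facing direction and a local corner. The sketch below does this for a hypothetical layout (two museums with eight views each, two views per corner); the label assignment is an assumption for illustration and only loosely follows the view numbering in the figure.

```python
import itertools

# Hypothetical view labels: two museums, eight views each, with a
# museum-anchored ("local") facing direction and corner for every view.
views = [
    {"id": i,
     "museum": i // 8,
     "local_dir": i % 4,             # four local facing directions
     "local_corner": (i % 8) // 2}   # four local corners, two views per corner
    for i in range(16)
]

cells = {"same_dir_within": [], "diff_dir_within": [],
         "same_dir_across": [], "diff_dir_across": []}

for a, b in itertools.combinations(views, 2):
    # Partial control for location: never compare views sharing a local corner.
    if a["local_corner"] == b["local_corner"]:
        continue
    scope = "within" if a["museum"] == b["museum"] else "across"
    kind = "same_dir" if a["local_dir"] == b["local_dir"] else "diff_dir"
    cells[f"{kind}_{scope}"].append((a["id"], b["id"]))

for name, pairs in cells.items():
    print(name, len(pairs))
```
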
Figure 3. Behavioral priming for facing direction and location in Experiment 1
A. Priming for facing direction in the local (museum-anchored) reference frame. Left panel shows that reaction times were faster when local direction (e.g. facing the back wall) was repeated across successive trials compared to trials in which local direction was not repeated, irrespective of whether the repetition was within or across museums. Right panel shows breakdown of reaction times for different-museum/different local direction trials, based on whether global direction was repeated or not. Results indicate an absence of residual coding of direction in the global frame. B. Priming for location defined in the local reference frame. Left panel shows that reaction times were faster when locally defined location (e.g. back right corner) was repeated across successive trials compared to trials in which location was not repeated, irrespective of whether the repetition was within or across museums. Right panel shows breakdown of reaction times for different-museum/different local direction trials, based on whether globally defined location was repeated or not. Results indicate an absence of residual coding of location in the global frame. Error bars in both panels indicate standard error of the mean.
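
Conceptually, the priming effect in panel A is the difference between mean reaction times on trials where the imagined local direction repeated from the previous trial and trials where it changed, with the group-level test run on per-subject means. The sketch below uses simulated reaction times and a paired t-test; the trial counts, values and choice of test are assumptions, not the authors' analysis.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated reaction times (ms) for one subject: trials on which the local
# facing direction repeated from the previous trial vs. trials on which it
# changed. All values are made up for illustration.
rt_repeat = rng.normal(loc=1450, scale=200, size=60)
rt_change = rng.normal(loc=1550, scale=200, size=60)
print("priming effect (ms):", rt_change.mean() - rt_repeat.mean())

# At the group level, one would compare per-subject condition means, for
# example with a paired t-test across (here, simulated) subjects.
subj_repeat = rng.normal(1450, 100, size=20)
subj_change = rng.normal(1550, 100, size=20)
t, p = stats.ttest_rel(subj_change, subj_repeat)
print(f"paired t-test: t = {t:.2f}, p = {p:.3g}")
```
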
Figure 4. Coding of facing direction and location in RSC activation patterns in Experiment 2
A. Coding of facing direction in the local (museum-referenced) frame in RSC. Left panel shows that pattern similarity between views that face the same direction in local space was greater than pattern similarity between views that face different local directions, both within and across museums. Right panel shows breakdown of pattern similarity for different-museum/different local direction trials, based on whether global direction was repeated or not. Results indicate an absence of residual coding of direction in the global frame. B. Coding of location defined in the local reference frame in RSC. Left panel shows that pattern similarity between views in the same or geometrically equivalent corners was greater than pattern similarity between views in different corners, both within and across museums. Right panel shows breakdown of pattern similarity for different-museum/different local direction trials, based on whether views were in the same location in the global reference frame. Results indicate an absence of residual coding of location in the global frame. Error bars in both panels indicate standard error of the mean.
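
Pattern-similarity results of this kind are conventionally obtained by correlating the multivoxel response pattern estimated for each view and then averaging correlations within the contrast cells of Figure 2. The sketch below runs that comparison on simulated ROI patterns; the correlation metric, label layout and cell definitions are generic MVPA assumptions rather than a reproduction of the authors' pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)

n_views, n_voxels = 16, 300
# Simulated ROI patterns, one row per view; in a real analysis these would
# be per-view response estimates (e.g. GLM betas) from the JRD task.
patterns = rng.standard_normal((n_views, n_voxels))

# Hypothetical museum-anchored labels (same illustrative layout as above).
local_dir = np.array([i % 4 for i in range(n_views)])
local_corner = np.array([(i % 8) // 2 for i in range(n_views)])

sim = np.corrcoef(patterns)   # view-by-view pattern similarity matrix

same_dir, diff_dir = [], []
for i in range(n_views):
    for j in range(i + 1, n_views):
        if local_corner[i] == local_corner[j]:
            continue   # partial control for location
        (same_dir if local_dir[i] == local_dir[j] else diff_dir).append(sim[i, j])

print("same minus different local direction:",
      np.mean(same_dir) - np.mean(diff_dir))
```
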
Figure 5. Visualization of pattern similarities in RSC
This diagram summarizes all 256 pairwise relationships between views in terms of 16 possible spatial relationships. To create this diagram, we first created a pattern similarity map for each of the 16 views by calculating the similarity between that view (the starting view) and the other 15 views (comparison views). These maps were then averaged over spatially equivalent starting views while maintaining the spatial relationships between the starting view and the comparison views. For example, the value for the view directly to the right of the starting view (indicated by the square) represents the average pattern similarity between all views that face the same local direction and are located within the same museum. Pattern similarities were rescaled to a range from 0 to 1 and colored according to this value. The highest level of similarity was between patterns corresponding to the same view, as indicated by the value of 1.0 for the starting view. Pattern similarities were also high for views facing the same direction or located in the same corner, and there was an additional effect of distance between views.
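
The averaging-and-rescaling step described in the caption can be written compactly: pool the pairwise similarities by the spatial relationship between comparison view and starting view, average within each relationship type, and min-max rescale so that the same-view cell equals 1. The sketch below assumes a relationship label is available for every ordered view pair; the labeling function here is only a stand-in for the 16 spatial relationships used in the figure.

```python
from collections import defaultdict

import numpy as np

rng = np.random.default_rng(2)
n_views = 16
sim = np.corrcoef(rng.standard_normal((n_views, 300)))   # simulated similarities

def relation(i, j):
    """Stand-in for the spatial relationship of comparison view j to starting
    view i (e.g. 'same view', 'one alcove to the right', ...). Here the
    circular offset between view indices is used as a placeholder."""
    return (j - i) % n_views

# Average similarity for each relationship type, pooling over all
# spatially equivalent starting views.
buckets = defaultdict(list)
for i in range(n_views):
    for j in range(n_views):
        buckets[relation(i, j)].append(sim[i, j])
avg = {rel: np.mean(vals) for rel, vals in sorted(buckets.items())}

# Min-max rescale to [0, 1]; the same-view cell (offset 0) ends up at 1.0.
vals = np.array(list(avg.values()))
scaled = (vals - vals.min()) / (vals.max() - vals.min())
print(dict(zip(avg.keys(), np.round(scaled, 2))))
```
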
Figure 6. Multivoxel patterns in RSC contain sufficient information about the spatial relations between views to reconstruct the spatial organization of the environment
A. Within-museum view reconstruction. Left panel shows the average confusion matrix between views located within the same museum. Right panel shows the reconstruction of view locations from multidimensional scaling and Procrustes alignment. The estimated locations (colored diamonds) are close to the real locations (numbers in black outline). B. Across-museum view reconstruction. Left panel shows the average confusion matrix between views located in different museums. Right panel shows the reconstruction of view locations from multidimensional scaling and Procrustes alignment. Although somewhat noisier than the within-museum reconstruction, the estimated locations were still more accurate than would be expected by chance.
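
Reconstructing view positions from a pattern dissimilarity matrix via multidimensional scaling followed by Procrustes alignment can be sketched with standard scientific-Python tools. Everything below is simulated: the "true" coordinates and the noisy dissimilarities are made up, and scikit-learn's MDS plus SciPy's procrustes are stand-ins for whatever implementation was actually used.

```python
import numpy as np
from scipy.spatial import procrustes
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(3)

# Simulated "true" 2-D locations of eight views and a noisy, symmetric
# dissimilarity matrix derived from them (illustration only).
true_xy = rng.uniform(0, 10, size=(8, 2))
dissim = squareform(pdist(true_xy)) + rng.normal(0, 0.5, size=(8, 8))
dissim = np.clip((dissim + dissim.T) / 2, 0, None)   # symmetrize, no negatives
np.fill_diagonal(dissim, 0)

# Metric MDS on the precomputed dissimilarities gives 2-D coordinates.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
embedded = mds.fit_transform(dissim)

# Procrustes alignment rotates/scales/reflects the MDS solution onto the
# true layout and reports the residual disparity (lower = better fit).
aligned_true, aligned_est, disparity = procrustes(true_xy, embedded)
print(f"Procrustes disparity: {disparity:.3f}")
```
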
Figure 7. Whole-brain searchlight analysis of multivoxel coding of local direction
Voxels in yellow are significant (p < 0.05) after correcting for multiple comparisons across the entire brain. Consistent with the results of the ROI analyses, imagined facing direction could be decoded in right RSC, at the juncture of the calcarine sulcus and parieto-occipital sulcus and just posterior to retrosplenial cortex proper (Brodmann Area (BA) 29/30), and in a slightly more posterior locus in the left hemisphere. Direction coding was also observed in the left superior parietal lobule (SPL). The outline of RSC was created by transforming individual subjects’ ROIs to standard space and computing a group t-statistic thresholded at p < 0.001. The outline of BA 29/30 was based on templates provided in MRIcron (http://www.mricro.com/mricron/install.html).
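
A searchlight of this kind slides a small neighbourhood over the brain and recomputes the direction contrast within each neighbourhood. The toy, NumPy-only sketch below conveys the idea on a simulated 4-D array; the grid size, cubic neighbourhood, contrast statistic and labels are all assumptions, and the group statistics and multiple-comparison correction used in the actual analysis are omitted.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy data: one beta map per view on a small 3-D grid (x, y, z, view).
shape, n_views, radius = (10, 10, 10), 16, 2
betas = rng.standard_normal(shape + (n_views,))
local_dir = np.array([i % 4 for i in range(n_views)])   # hypothetical labels

same = local_dir[:, None] == local_dir[None, :]
offdiag = ~np.eye(n_views, dtype=bool)

contrast = np.zeros(shape)
for x in range(shape[0]):
    for y in range(shape[1]):
        for z in range(shape[2]):
            # Cubic neighbourhood around the centre voxel (a stand-in for
            # the spherical searchlight typically used).
            sl = (slice(max(x - radius, 0), x + radius + 1),
                  slice(max(y - radius, 0), y + radius + 1),
                  slice(max(z - radius, 0), z + radius + 1))
            patch = betas[sl].reshape(-1, n_views).T   # views x voxels
            sim = np.corrcoef(patch)
            contrast[x, y, z] = (sim[same & offdiag].mean()
                                 - sim[~same & offdiag].mean())

print("peak same-vs-different direction contrast:", contrast.max())
```
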
