Neuron. 2022 Dec 7;110(23):3897-3906.e5. doi: 10.1016/j.neuron.2022.08.029. Epub 2022 Sep 21.

Joint coding of visual input and eye/head position in V1 of freely moving mice

Philip R L Parker et al.

Abstract

Visual input during natural behavior is highly dependent on movements of the eyes and head, but how information about eye and head position is integrated with visual processing during free movement is unknown, as visual physiology is generally performed under head fixation. To address this, we performed single-unit electrophysiology in V1 of freely moving mice while simultaneously measuring the mouse's eye position, head orientation, and the visual scene from the mouse's perspective. From these measures, we mapped spatiotemporal receptive fields during free movement based on the gaze-corrected visual input. Furthermore, we found a significant fraction of neurons tuned for eye and head position, and these signals were integrated with visual responses through a multiplicative mechanism in the majority of modulated neurons. These results provide new insight into coding in mouse V1 and, more generally, provide a paradigm for investigating visual physiology under natural conditions, including active sensing and ethological behavior.

Keywords: ecological perception; gain modulation; receptive fields; visual cortex; visual physiology.


Conflict of interest statement

Declaration of interests The authors declare no competing interests.

Figures

Figure 1: Visual physiology in freely moving mice.
A) Schematic of recording preparation including 128-channel linear silicon probe for electrophysiological recording in V1 (yellow), miniature cameras for recording the mouse’s eye position (magenta) and visual scene (blue), and inertial measurement unit for measuring head orientation (green). B) Experimental design: controlled visual stimuli were first presented to the animal while head-fixed, then the same neurons were recorded under conditions of free movement. C) Sample data from a fifteen-second period during free movement showing (from top) visual scene, horizontal and vertical eye position, head pitch and roll, and a raster plot of over 100 units. Note that the animal began moving at ~4 s, accompanied by a shift in the dynamics of neural activity.
Figure 2: A generalized linear model accurately estimates spatiotemporal receptive fields during free movement.
A) Schematic of processing pipeline. Visual and positional information is used as input into the shifter network, which outputs parameters for an affine transformation of the world-camera image. The transformed image frame is then used as the input to the GLM network to predict neural activity. B) Four example freely moving spatiotemporal visual receptive fields. Scale bar for RFs represents 10 degrees. C) Example actual and predicted smoothed (2 s window) firing rates for unit 3 in B. D) Histogram of correlation coefficients (cc) for the population of units recorded. Average cc shown as gray dashed line. E) Example of a freely moving RF with the shifter network off (left) and on (right) at time lag 0 ms. Colormap same as B. F) Scatter plot showing cc of predicted versus actual firing rate for all units with the shifter network off vs on. Red point is the unit shown in E. G) Example receptive field calculated via STA (left) versus GLM (right). H) Scatter plot showing cc of predicted vs actual firing rate for all units, as calculated from STA or GLM. Red point is the unit shown in G.
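The contrast drawn in panels G-H, between a spike-triggered average (STA) and a regression-based receptive-field estimate, can be illustrated with a minimal sketch. The toy data, the single time lag, and the ridge-regularized linear fit are assumptions for illustration; the paper's actual pipeline is a neural-network GLM fed gaze-corrected frames from the shifter network.

```python
import numpy as np

rng = np.random.default_rng(0)
n_frames, n_pix = 5000, 100                   # stimulus frames, pixels per frame
X = rng.standard_normal((n_frames, n_pix))    # stand-in for gaze-corrected visual input
w_true = np.zeros(n_pix)
w_true[40:60] = 1.0                           # toy receptive field
rate = np.exp(0.1 * X @ w_true)               # toy firing rate
y = rng.poisson(rate)                         # spike counts per frame

# Spike-triggered average: stimulus frames weighted by spike count
sta = (X.T @ y) / y.sum()

# Ridge-regularized linear fit (one lag shown; the paper fits multiple lags)
lam = 10.0
w_glm = np.linalg.solve(X.T @ X + lam * np.eye(n_pix), X.T @ y)

# Model quality as in panel D: correlation of predicted and actual rate
cc = np.corrcoef(X @ w_glm, y)[0, 1]
```

For white-noise stimuli the two estimators agree; the regression approach matters for the correlated statistics of natural scenes, which is why the paper's GLM outperforms the STA in panel H.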
Figure 3: Comparison of receptive fields measured under freely moving versus head-fixed conditions.
A) Fraction of units that were active (>1 Hz firing rate) and that had significant fits for predicting firing rate, in head-fixed and freely moving conditions. B) Example spatial receptive fields measured during free movement (top) and using a white noise mapping stimulus while head-fixed (bottom) at time lag 0 ms. Scale bar in top left is 10 deg. C) Histogram of correlation coefficients between freely moving and head-fixed RFs. Black color indicates units that fall outside two standard deviations of the shuffle distribution. Arrows indicate locations in the distribution for example units in A.
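The shuffle test described in panel C can be sketched as follows. The array shapes, the toy relationship between conditions, and the mismatched-pair shuffle scheme are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
n_units, n_pix = 50, 400                      # units, pixels per flattened RF map
rf_free = rng.standard_normal((n_units, n_pix))
rf_fixed = rf_free + 0.5 * rng.standard_normal((n_units, n_pix))  # toy: related RFs

def cc(a, b):
    """Pixelwise correlation between two RF maps."""
    return np.corrcoef(a, b)[0, 1]

# Observed: each unit's freely-moving RF vs its own head-fixed RF
observed = np.array([cc(rf_free[i], rf_fixed[i]) for i in range(n_units)])

# Null distribution: correlations between mismatched unit pairs
null = np.array([cc(rf_free[i], rf_fixed[j])
                 for i in range(n_units) for j in range(n_units) if i != j])

# A unit's RFs are called similar if the observed correlation exceeds
# two standard deviations of the shuffle distribution
threshold = null.mean() + 2 * null.std()
significant = observed > threshold
```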
Figure 4: V1 neurons integrate visual and position signals.
A) Overlay of vertical eye angle (phi; gray) and the smoothed firing rate of an example unit (black). B) Example tuning curve for head pitch. Colored points denote the quartiles of phi corresponding to panel F. C) Scatter of the modulation indices for eye position and head orientation (N=268 units, 4 animals). Numbers at top of the plot represent the fraction of units with significant tuning. D) Same unit as A. Example trace of smoothed firing rates from neural recordings and predictions from position-only and visual-only fits. E) Scatter plot of cc for position-only and visual-only fits for all units. F) Gain curve for the same unit in A and C. Modulation of the actual firing rates based on phi indicated by color. G) Schematic of joint visual and position input training. H) Same unit as A, C, and E. Smoothed traces of the firing rates from the data, additive and multiplicative fits. I) Correlation coefficient for visual-only versus joint fits. Each point is one unit, color coded for the joint fit that performed best. J) Comparison of additive and multiplicative fits for each unit. Units characterized as multiplicative are to the right of the vertical dashed line, while additive ones are to the left. Horizontal dashed line represents threshold set for the visual fit, since in the absence of a predictive visual fit, a multiplicative modulation will be similar to an additive modulation. K) Histogram of the difference in cc between additive and multiplicative models. The visual threshold from I was applied to the data. L) Explained variance (r2) for position only (pos), speed and pupil only (sp), visual only (vis), multiplicative with eye/head position (mul_pos), multiplicative with speed and pupil (mul_sp), and multiplicative with eye/head position, speed and pupil (mul_all). M) The fraction of contribution of the weights for multiplicative fits with eye/head position, speed (spd) and pupil (pup). N) Same as M but summing together the contribution for eye/head position.
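The additive-versus-multiplicative comparison in panels H-K can be sketched in a few lines. The synthetic signals and the closed-form least-squares fits below are assumptions for illustration; the paper fits these terms jointly within its GLM network.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 3000
vis = rng.random(T)                    # visual drive over time
pos = rng.random(T)                    # eye/head position signal
rate = vis * (1.5 * pos + 0.5)         # toy ground truth: multiplicative gain

# Additive model: rate ~ a*vis + b*pos + c
A_add = np.column_stack([vis, pos, np.ones(T)])
pred_add = A_add @ np.linalg.lstsq(A_add, rate, rcond=None)[0]

# Multiplicative model: rate ~ a*vis*pos + b*vis + c
A_mul = np.column_stack([vis * pos, vis, np.ones(T)])
pred_mul = A_mul @ np.linalg.lstsq(A_mul, rate, rcond=None)[0]

# As in panel J, a unit is classed as multiplicative when the
# multiplicative fit correlates better with the measured rate
cc_add = np.corrcoef(pred_add, rate)[0, 1]
cc_mul = np.corrcoef(pred_mul, rate)[0, 1]
```

Note the caveat stated for panel J: when the visual fit itself has no predictive power, scaling it (multiplicative) and offsetting it (additive) produce similar predictions, so the comparison is only meaningful above a visual-fit threshold.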
