Neuron. 2020 May 20;106(4):662-674.e5. doi: 10.1016/j.neuron.2020.02.023. Epub 2020 Mar 13.

Tracking the Mind's Eye: Primate Gaze Behavior during Virtual Visuomotor Navigation Reflects Belief Dynamics


Kaushik J Lakshminarasimhan et al. Neuron. 2020.

Abstract

To take the best actions, we often need to maintain and update beliefs about variables that cannot be directly observed. To understand the principles underlying such belief updates, we need tools to uncover subjects' belief dynamics from natural behavior. We tested whether eye movements could be used to infer subjects' beliefs about latent variables using a naturalistic navigation task. Humans and monkeys navigated to a remembered goal location in a virtual environment that provided optic flow but lacked explicit position cues. We observed eye movements that appeared to continuously track the goal location even when no visible target was present there. Accurate goal tracking was associated with improved task performance, and inhibiting eye movements in humans impaired navigation precision. These results suggest that gaze dynamics play a key role in action selection during challenging visuomotor behaviors and may possibly serve as a window into the subject's dynamically evolving internal beliefs.


Conflict of interest statement

Declaration of Interests: The authors declare no competing interests.

Figures

Figure 1. Primates can navigate by integrating optic flow.
A. Monkeys and human subjects use a joystick to navigate to a cued target (yellow disc) using optic flow cues generated by ground plane elements (brown triangles). B. The time-course of linear (top) and angular (bottom) velocities during one example trial. Yellow shaded region corresponds to the time period when the target was visible on the screen. Time is also coded by color. C. Example trials showing incorrect (left) and correct (right) responses of a monkey. D. Left: Overhead view of the spatial distribution of target positions across trials. Middle: Movement trajectories of one monkey during a representative subset of trials. Blue dot denotes starting location. Right: First-person view of the trajectories of eye movements during the same trials. Abscissa and ordinate show horizontal version and elevation of the eyes. Blue dots represent the initial eye position (when the target was turned OFF) on each trial. E. Left: Comparison of the radial distance of the monkey’s response (stopping location) against the radial distance of the target across trials. Right: Angular eccentricity of the response vs. target angle. Black dashed lines have unity slope. The subject’s starting location was taken as the origin. F. Subjects’ accuracy in radial distance (top) and angular eccentricity (bottom), quantified as the slopes of the corresponding linear regressions, plotted for individual monkeys and human subjects. Horizontal dashed lines denote the value of the slope that corresponds to unbiased behavior. Error bars denote ±1 SEM across trials. G. Left: The proportion of correct trials of one monkey for various values of a hypothetical reward window size (black). Shuffled estimates are shown in gray. Right: ROC curves for all subjects, obtained by plotting the true proportion of correct trials (from unshuffled data) against the corresponding chance-level proportions (from shuffled data) for a range of reward windows. Shaded area denotes standard deviation across subjects. Inset shows the average area under the curve (AUC) for monkeys and human subjects. See also Figure S1.
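The accuracy slopes in panels E-F and the ROC analysis in panel G reduce to standard computations. A minimal sketch in Python, assuming per-trial arrays of target and stopping positions (all names are illustrative and not taken from the paper's code):

    import numpy as np

    def accuracy_slope(target, response):
        """Slope of the regression of response on target (cf. Fig. 1E-F).
        A slope of 1 corresponds to unbiased behavior."""
        slope, _intercept = np.polyfit(np.asarray(target, float),
                                       np.asarray(response, float), deg=1)
        return slope

    def roc_auc(errors, shuffled_errors, windows):
        """ROC analysis over hypothetical reward-window sizes (cf. Fig. 1G).
        errors: per-trial stopping errors; shuffled_errors: errors after
        shuffling target-response pairings across trials (chance level).
        windows is assumed sorted in increasing order."""
        errors = np.asarray(errors, float)
        shuffled_errors = np.asarray(shuffled_errors, float)
        true_rate = np.array([(errors <= w).mean() for w in windows])
        chance_rate = np.array([(shuffled_errors <= w).mean() for w in windows])
        auc = np.trapz(true_rate, chance_rate)  # area under the true-vs-chance curve
        return auc, chance_rate, true_rate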
Figure 2. Eye movement dynamics during the task.
A. Time-course of vertical (top) and horizontal (bottom) positions of the left and right eyes of a monkey during one example trial. Yellow region shows the period when a target was visible on the screen. Red dashed line corresponds to the end of steering in this trial. B. The time-course of the rate of saccades during the trial, averaged across all trials separately for each monkey. Trial-averaging was done by aligning trials relative to target onset (yellow region, before the break on the x-axis) and end of steering (red dashed line, following the break). Gray line denotes mean saccade rate across monkeys during the period between trials. C. Joint probability density of the distribution over horizontal and vertical eye velocities, averaged across monkeys, while they steered towards the target. Marginals are shown in black. D. Comparison of the predicted and true eye positions in a subset of trials for all monkeys at the moment when the target was just turned OFF. E. Time-course of the eye position during a random subset of trials taken from one monkey. Blue and red dots denote the times at which the target was turned OFF and the end of steering, respectively. F. Target-tracking index when the target turned OFF for individual monkeys and humans. Error bars denote ±1 SEM obtained either by averaging across recording sessions (for monkeys) or by bootstrapping (for humans). G. Time-course of the target-tracking index, averaged across monkeys and humans. Gray arrow denotes the chance-level tracking index, verified by a shuffling procedure. Shaded region denotes ±1 SEM across datasets. See also Figures S2–S4A.
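The target-tracking index in panels F-G is defined in the paper's Methods, which are not reproduced here. One plausible construction, shown only as an illustrative assumption and not as the paper's exact equations, scores how much of the across-trial variance in the observed eye position is captured by the eye position predicted from the target location:

    import numpy as np

    def tracking_index(eye_pred, eye_obs):
        """Illustrative target-tracking index (assumed form, not the paper's
        definition). eye_pred, eye_obs: arrays of shape (n_trials, 2) holding
        predicted and observed (horizontal, vertical) eye positions at one
        time point. Returns the square root of the fraction of variance
        explained, clipped at zero."""
        eye_pred = np.asarray(eye_pred, float)
        eye_obs = np.asarray(eye_obs, float)
        resid_var = np.var(eye_obs - eye_pred, axis=0).sum()
        total_var = np.var(eye_obs, axis=0).sum()
        return np.sqrt(max(1.0 - resid_var / total_var, 0.0))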
Figure 3. Saccadic eye movements aid target-tracking.
A. Time-course of observed (black) and predicted (gray) vertical position of the eyes of a monkey. Black arrows indicate saccades made during three different task epochs (inter-trial, target presentation, and steering periods). Yellow region shows the period when a target was visible on the screen. B. Empirical cumulative distribution function of saccade amplitudes conditioned on the task epoch, averaged across monkeys. Inset shows amplitudes of individual saccades as a function of their timing. C. Average saccade-triggered target-tracking error during a time window around saccades made during steering. D. The time-course of coefficients obtained by linearly regressing the amplitudes of the two components of saccades (blue – vertical, red – horizontal) against the corresponding components of the target-tracking error (Methods). E. Similar to D, but showing coefficients for regressions done separately for the first, second, and third saccades made during steering. See also Figure S4B–C.
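The regression in panels D-E relates each saccade's amplitude to the target-tracking error just before that saccade, separately for the horizontal and vertical components. A minimal sketch (variable names are illustrative):

    import numpy as np

    def saccade_error_regression(saccade_amp, tracking_err):
        """Regress saccade amplitude on pre-saccadic tracking error (cf. Fig. 3D).
        saccade_amp, tracking_err: arrays of shape (n_saccades, 2), columns =
        (horizontal, vertical) components. A coefficient near 1 means saccades
        fully correct the tracking error along that component."""
        saccade_amp = np.asarray(saccade_amp, float)
        tracking_err = np.asarray(tracking_err, float)
        coefs = []
        for k in range(2):  # 0: horizontal, 1: vertical
            slope, _intercept = np.polyfit(tracking_err[:, k], saccade_amp[:, k], deg=1)
            coefs.append(slope)
        return coefs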
Figure 4. Accurate target-tracking is associated with increased task performance.
A. Time-course of the target-tracking index for one session computed using a monkey’s actual eye movements (black solid) and its theoretical upper bound (black dashed) determined using variability in stopping positions (Methods, equations 3–4). B. Left: Overhead view of the spatial maps showing the standard deviation of stopping positions as a function of target location for individual monkeys and the average human subject. Each wedge corresponds to the map of one subject, calculated by binning target locations (see Fig. 1D) and smoothed using a Gaussian filter. The maps of monkeys S and Q, and of the humans, have been rotated for compactness. Right: Comparison of the observed target-tracking index against the theoretical upper bound (averaged over the last 500 ms of the trials) across all individual datasets. Dashed line has unity slope, and error bars denote ±1 SEM obtained by bootstrapping. C. Top: Time-course of the target-tracking index for one example monkey shown separately for trials in which he stopped within the reward zone (blue) or stopped outside it (red). Shaded regions denote ±1 standard error estimated by bootstrapping. Bottom: The difference between tracking coefficients for the two sets of trials for all subjects. For human subjects, trials in which the subject’s final position was within 0.6 m of the center of the target were considered ‘rewarded’. D. Top: We divided trials into five groups based on the magnitude of behavioral error. Time-courses of the target-tracking index for the five trial groups from one monkey (dark blue: most accurate; dark red: least accurate). Bottom: Average value of the target-tracking index just before the end of steering (brown region in the top panel) as a function of percentile accuracy for individual subjects. Solid lines show the average across subjects. Across subjects (humans and monkeys), there was a significant correlation between accuracy and tracking coefficient (Pearson’s r = 0.68, p = 3.1 × 10^−5). E. Top: Joint distribution of the behavioral error and the target-tracking error across trials of one session from one monkey. Bottom: The mean correlation between behavioral and target-tracking errors of individual subjects. Error bar denotes ±1 SEM obtained by bootstrapping. See also Figure S4D–E.
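Two analyses in this figure, the accuracy grouping in panel D and the error correlation in panel E, can be sketched from per-trial behavioral errors and tracking measures (an assumed data layout, not the paper's code):

    import numpy as np
    from scipy import stats

    def tracking_by_accuracy_group(behav_err, track_index, n_groups=5):
        """Mean tracking index within behavioral-error quantile groups (cf. Fig. 4D)."""
        behav_err = np.asarray(behav_err, float)
        track_index = np.asarray(track_index, float)
        edges = np.quantile(behav_err, np.linspace(0, 1, n_groups + 1))
        groups = np.digitize(behav_err, edges[1:-1])  # group 0 = most accurate trials
        return [track_index[groups == g].mean() for g in range(n_groups)]

    def error_correlation(behav_err, track_err):
        """Pearson correlation between behavioral and tracking errors (cf. Fig. 4E)."""
        return stats.pearsonr(behav_err, track_err)  # returns (r, p-value)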
Figure 5. Steering-induced eye movements are not reflexive.
A. Top: Time-course of the target-tracking index for one monkey during trials in which he performed the task (blue) or passively viewed a stimulus identical to the one generated when performing the task (red). Black trace shows the tracking index of the OFR model. Tracking indices at time points with negative variance explained were clipped to zero. Bottom: The time-course of the cumulative difference between the target-tracking index on active trials and that of the OFR model for individual monkeys. B. Top: Time-course of the tracking index of one monkey during trials in which the ground plane density was either high (blue) or low (red). Bottom: The difference between the target-tracking index under high and low density conditions for individual monkeys. Brown shaded regions in the bottom panels correspond to the time window considered for statistical testing. See also Figure S5.
Figure 6. Fixation affects task performance.
A. Trial-averaged temporal variability of subjects’ eye position, quantified by standard deviation (see Methods), during ‘Eyes-moving’ (blue) and ‘Eyes-fixed’ (red) trials. Error bars denote standard deviation across subjects (** p = 1.2 × 10^−3, paired t-test). B. ROC curves averaged across subjects, for trials in the ‘Eyes-moving’ (blue) and ‘Eyes-fixed’ (red) conditions. Inset shows the area under the two curves. Error bars denote standard deviation across subjects (* p = 2.5 × 10^−3, paired t-test). C. Top: Comparison of the radial distances of the response and the target on trials under the two conditions. Different symbols denote different human subjects. Bottom: Comparison of the (absolute) angular eccentricity of the response and target. D. Top: Pearson’s correlation coefficient between the radial distance of subjects’ response and the target for all individual subjects. Bottom: Similar comparison for the absolute angular eccentricity of target and response under the two conditions. See also Figure S6.
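The statistical comparisons in panels A-B are paired tests across subjects, with one value per subject and condition. A minimal sketch (function and variable names are illustrative):

    import numpy as np
    from scipy import stats

    def compare_conditions(eyes_moving, eyes_fixed):
        """Paired comparison of a per-subject measure across the two fixation
        conditions, e.g., eye-position variability (cf. Fig. 6A) or ROC area
        (cf. Fig. 6B). Inputs hold one value per subject, in matching order."""
        t, p = stats.ttest_rel(np.asarray(eyes_moving, float),
                               np.asarray(eyes_fixed, float))
        return t, p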
