J Neurosci Methods. 2015 Dec 30;256:91-105. doi: 10.1016/j.jneumeth.2015.08.007. Epub 2015 Aug 14.

A System to Measure the Optokinetic and Optomotor Response in Mice


Friedrich Kretschmer et al. J Neurosci Methods.

Abstract

Background: Visually evoked compensatory head movements (Optomotor responses) or eye movements (Optokinetic responses) are extensively used in experimental mouse models for developmental defects, pathological conditions, and testing the efficacy of therapeutic manipulations.

New method: We present an automated system to measure Optomotor and Optokinetic responses under identical stimulation conditions, enabling a direct comparison of the two reflexes. A semi-automated calibration procedure and a commercial eye tracker are used to record angular eye velocity in the restrained animal. Novel video tracking algorithms determine the location of the mouse head in real time and allow repositioning of the stimulus relative to the mouse head.

Results: Optomotor and Optokinetic responses yield comparable results with respect to determining visual acuity in mice. Our new head tracking algorithms enable a far more accurate determination of head angle, and reveal individual head retractions, analogous to saccadic eye movements observed during Optokinetic Nystagmus.

Comparison with existing methods: To our knowledge, this is the first apparatus allowing the direct comparison of Optomotor and Optokinetic responses in mice. Our tracking algorithms, which allow an objective determination of head movements, are a significant improvement over existing systems, which rely on subjective human observation. The increased accuracy of the novel algorithms increases the robustness of automated Optomotor response determinations and reveals novel aspects of this reflex.

Conclusions: We provide blueprints for the inexpensive hardware, release open source software for our system, and describe an accurate and accessible method for Optomotor and Optokinetic response determination in mice.

Keywords: Behavior; Eye movements; Head movements; Mouse; Optokinetic reflex; Optomotor response; Retina; Video tracking; Visual field; Vision.

Figures

Figure 1
The setup can easily be converted between condition (a) to measure the OMR in the freely behaving animal and condition (b) to measure the OKR in the head-fixed animal. In both cases the animal is located inside a virtual sphere (c) that is presented on the four computer screens surrounding the animal. This sphere is adjusted in such a way that the animal is always located in its center (inset in (c)). (d) The lid, mirror and platforms can easily be removed.
Figure 2
(a) Infrared illumination is provided by infrared LED strips attached along all four inner edges of the setup. (b) Masks to cover the left binocular field (green), right binocular field (magenta), left monocular field (gray) and right monocular field (blue). The dashed lines represent the visual field that is covered by the four screens. Areas based on [25].
Figure 3
Software architecture of the system. The stimulus is presented on four screens by the program okr arena. This program receives images that are used as textures on the virtual sphere and protocols that describe the rotation of the sphere. These files were previously generated through the MATLAB routine patternGen. Head movements (OMR) are recorded with a camera at 25 Hz, which feeds the program omr monitor. This program records the video and determines the animal's head position. The head position is stored and communicated back to okr arena, which readjusts the position of the virtual sphere accordingly. To record eye movements (OKR) the ETL200 Eye tracker is remotely triggered by the stimulation software. The location of the pupil and cornea are read out at 120 Hz and are continuously buffered for storage. The experiment is controlled with a centralized graphical user interface.
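The handoff described in this caption, where omr monitor tracks the head at 25 Hz and okr arena recenters the virtual sphere on the most recent position, can be sketched as a producer/consumer loop. The actual inter-process mechanism used by the published software is not specified in this excerpt, so the queue, class, and function names below are purely illustrative.

```python
from queue import Queue

class Sphere:
    """Stand-in for the virtual sphere rendered by the stimulus program."""
    def __init__(self):
        self.center = (0.0, 0.0)

def monitor_step(head_position, head_queue):
    """Tracker side: store one head position (computed at 25 Hz in the
    real system) and communicate it to the stimulus program."""
    head_queue.put(head_position)

def arena_step(head_queue, sphere):
    """Stimulus side: drain pending head positions and recenter the
    virtual sphere on the most recent one, keeping the animal at the
    sphere's center."""
    while not head_queue.empty():
        sphere.center = head_queue.get()
```

Decoupling the two sides through a buffer lets the 25 Hz tracker and the rendering loop run at independent rates, as the architecture describes.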
Figure 4
The graphical user interface. New stimuli can be generated in the right panel of the window. A stimulus consists of an image that is used as a texture on the virtual sphere, a mask that can cover areas of the sphere and a protocol that describes the movement of the sphere and the mask over time. Stimuli can be added to a list, which can then be run without human interaction. The left panel shows a live view of the inside of the arena with the platform and a set of user interface elements to set the parameters of the tracking algorithm. This is also where the user can set the recording path and start and stop an experiment.
Figure 5
The new tracking algorithm allows us to determine the head orientation of animals of different coat colors under various lighting conditions. To keep the lighting conditions constant regardless of stimulus type, an additional 25 mm infrared bandpass filter can be placed in front of the camera. (a) A mouse with agouti coat color at photopic light conditions without the additional infrared filter. (b) A black-coated animal at photopic light conditions with infrared filter. (c) An albino animal at scotopic light conditions with infrared filter. (d) A black-coated animal at scotopic light conditions with infrared filter. Colors in (a-d): green line: detected contour of the mouse; green dot: center of gravity; yellow dot: corrected center of mass; magenta dot: “tail” location; blue dot: “snout” location; blue circles: two radii around the snout location; cyan and yellow: the contour within the two radii; red dots: the two points that determine the head location. (e) Geometrical constraints to reliably separate the contour of the mouse from the background: 1. The detected contour must be within area limits. 2. The contour must cross three or more quadrants of the image (green area). 3. The contour must be the closest contour to the center of the image. (f) Illustration of the “snout stabilization” method. 1. The snout (green dot) is detected as the point farthest away from the center of gravity (yellow dot). 2. Incorrect snout detection: when the mouse changes its posture, the center of gravity shifts (blue) and a different location is detected as the snout (green dot), since it is farther from the center of mass (new green segment); in the example, the location jumps to the other side of the head. 2′. Corrected snout detection: we rotate the center of gravity (blue dots) around the snout location from frame 1 (black dot). The point farthest away from this rotated center of gravity lies on the same side as the location determined in 1 (green dot).
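The farthest-point rule in panel (f) is straightforward to sketch. The stabilization below uses a simple distance penalty toward the previous snout position to keep the detection on the same side of the head; this is a simplified continuity heuristic standing in for the rotation-based correction the caption describes, and the function and parameter names are our own.

```python
import numpy as np

def detect_snout(contour, prev_snout=None, continuity_weight=0.5):
    """Locate the snout on a mouse body contour.

    contour: (N, 2) array of (x, y) contour points.
    prev_snout: snout position from the previous frame, or None.

    Naive rule (frame 1): the snout is the contour point farthest from
    the body's center of gravity. With a previous estimate available,
    a distance penalty discourages jumps to the other side of the head
    when posture changes shift the center of gravity (a simplification
    of the rotation-based correction in the figure).
    """
    contour = np.asarray(contour, dtype=float)
    cog = contour.mean(axis=0)                      # center of gravity
    d_cog = np.linalg.norm(contour - cog, axis=1)   # distance from COG
    if prev_snout is None:
        return contour[np.argmax(d_cog)]
    d_prev = np.linalg.norm(contour - np.asarray(prev_snout), axis=1)
    score = d_cog - continuity_weight * d_prev      # penalize jumps
    return contour[np.argmax(score)]
```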
Figure 6
Quantifying head movements during OMR experiments. (a) Illustration of OMR quantification. The histogram plots the number of frames for which the animal's head was moving at a given velocity either in the stimulus direction (“correct”, positive angle values) or against the stimulus direction (“incorrect”, negative values). Windows of head velocities with amplitudes and orientations similar to the stimulus velocity (“correct”, green) are compared to windows of identical amplitudes but reversed orientation (“incorrect”, magenta). The ratio between the sums in the correct and incorrect windows is defined as the Optomotor response. The histograms were calculated for all trials of one animal at the control condition (Left, 0.2 cyc/°, stimulus not moving), 0.05 cyc/° (Center) and 0.2 cyc/° (Right). (b) The effect of varying the thresholds for the windows defined in (a) on the determined value of the OMR. The x and y axes represent the slow and fast motion criteria, expressed in °/s. The heatmap color represents the amount of visually driven responses for each threshold pair, expressed as the number of tracking events in the correct vs. incorrect direction, illustrating the influence the thresholds have on the outcome of the analysis. The stippled lines mark the lower and upper bounds of the head velocity windows that were used for subsequent analysis. Data were collected from 10 animals, measured 10 times at each condition.
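The window ratio defined in panel (a) reduces to counting frames inside two mirrored velocity windows. A minimal sketch, with argument names of our own choosing:

```python
import numpy as np

def omr_index(head_velocity, v_slow, v_fast):
    """Optomotor response index from a head angular-velocity trace.

    head_velocity: per-frame head angular velocity in deg/s; positive
    values follow the stimulus direction ("correct"), negative values
    oppose it ("incorrect").
    v_slow, v_fast: lower and upper bounds of the velocity window (deg/s).

    Returns the ratio of frames in the correct window to frames in the
    mirrored incorrect window, as defined in the figure caption.
    """
    v = np.asarray(head_velocity, dtype=float)
    correct = np.sum((v >= v_slow) & (v <= v_fast))
    incorrect = np.sum((v <= -v_slow) & (v >= -v_fast))
    return correct / incorrect if incorrect else float("inf")
```

Under the Nullcondition, random head movements populate both windows roughly equally, so this index centers around 1, consistent with Figure 10.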
Figure 7
(a) A mouse restrained in the acrylic holder. To derive the azimuth and elevation of the eye, the system needs to be calibrated. (b) Two callipers enable a precise translation of the camera. A laser pointer facilitates alignment of the camera and the hot mirror. (c) By translating the camera and rotating the mirror we can take multiple images of the eye from different virtual angles to calibrate the system. (d,e) Validation of the calibration procedure. Adjusted and measured angles across a range of 0-10° in the horizontal (d) and vertical (e) plane. The mean error was calculated as the mean difference between the actual angle and the measured angle.
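The validation statistic in (d, e) is a simple aggregate over the calibration sweep. A minimal sketch, assuming the "mean difference" is taken as a mean absolute difference (the caption does not state whether signs are retained):

```python
import numpy as np

def mean_calibration_error(adjusted_deg, measured_deg):
    """Mean absolute difference between the angles set on the apparatus
    and the angles recovered by the calibrated eye tracker, in degrees.
    """
    adjusted = np.asarray(adjusted_deg, dtype=float)
    measured = np.asarray(measured_deg, dtype=float)
    return float(np.mean(np.abs(adjusted - measured)))
```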
Figure 8
Exemplary traces of recorded head (OMR) and eye movements (OKR) at three different stimulus conditions: the Nullcondition (0.2 cyc/°, stimulus not moving) and two spatial frequencies, 0.05 cyc/° and 0.2 cyc/°, presented with a stimulus rotating at 12°/s. Note the different scaling on the y-axes in the insets.
Figure 9
The mouse head does not perform a homogeneous rotation. (a) The movement of the head is instead composed of a rotation and a translation. To illustrate this phenomenon, the x and y pixel coordinates of the two tracked locations on the forehead are plotted separately. (b) The animal performs head saccades that consist of a fast retraction of the head rather than a rotation opposite to the stimulus direction, as described for the OKR. This retraction can be visualized by calculating the translation of the mouse head along the axis of gaze (bottom trace in (a)). (c) The time course of six individual slow phases (green), followed by fast phases (magenta). These periods occurred consecutively at time points t0 to t5. Traces are shown for both markers (Top: Marker 1, Bottom: Marker 2).
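The translation along the axis of gaze in (b) can be computed by projecting the head's frame-to-frame displacement onto a per-frame gaze direction. In this sketch we approximate the gaze axis as the unit vector perpendicular to the line joining the two forehead markers; that geometric assumption, and the function name, are ours rather than the paper's.

```python
import numpy as np

def gaze_axis_translation(marker1, marker2):
    """Per-frame head translation along an approximate gaze axis.

    marker1, marker2: (T, 2) pixel trajectories of the two tracked
    forehead points. The gaze axis is taken, per frame, as the unit
    vector perpendicular to the inter-marker line (an assumption of
    this sketch). The midpoint's displacement is projected onto that
    axis; fast deflections correspond to head retractions.
    """
    m1 = np.asarray(marker1, dtype=float)
    m2 = np.asarray(marker2, dtype=float)
    mid = (m1 + m2) / 2.0
    baseline = m2 - m1
    # 90-degree rotation of the inter-marker vector, per frame
    gaze = np.stack([-baseline[:, 1], baseline[:, 0]], axis=1)
    gaze /= np.linalg.norm(gaze, axis=1, keepdims=True)
    disp = np.diff(mid, axis=0)            # frame-to-frame displacement
    return np.sum(disp * gaze[:-1], axis=1)
```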
Figure 10
Comparison of OKR and OMR in five C57BL/6 mice at three stimulus conditions: (i) the Nullcondition, a stationary grating of 0.2 cyc/°; (ii) a very low spatial frequency of 0.05 cyc/° rotating at 12°/s; and (iii) the reported optimal spatial frequency of 0.2 cyc/° rotating at 12°/s. OMR was quantified as described in section 2.2.1. OKR is represented by the gain (eye velocity / stimulus velocity). Three trials were measured for each animal. In both OMR and OKR experiments, the individual trials for each of the five mice are represented in separate columns at each stimulus condition. The exemplary trials depicted for mouse 3 in figures 8a and 8b are marked with a cross. Note that random head movements under the Nullcondition result in an Optomotor response index centered around 1. No eye movements are detected under the Nullcondition (OKR gain = 0). Both OMR and OKR increase at the optimal spatial frequency (0.2 cyc/°) compared to a lower spatial frequency (0.05 cyc/°).
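The OKR gain used here is a direct ratio; a minimal sketch, averaging over a slow-phase velocity trace, with the 12°/s drum speed from the caption as a default:

```python
import numpy as np

def okr_gain(eye_velocity, stimulus_velocity=12.0):
    """OKR gain = mean slow-phase eye velocity / stimulus velocity.

    eye_velocity: slow-phase eye angular velocities in deg/s;
    stimulus_velocity: rotation speed of the stimulus in deg/s
    (12 deg/s in the experiments shown). A gain of 0, i.e. no eye
    movement, corresponds to the Nullcondition.
    """
    return float(np.mean(eye_velocity)) / stimulus_velocity
```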
