In the early visual system, a contrast gain control mechanism sets the gain of responses based on the locally prevalent contrast. The measure of contrast used by this adaptation mechanism is commonly assumed to be the standard deviation of light intensities relative to the mean (root-mean-square contrast). A number of alternatives, however, are possible. For example, the measure of contrast might depend on the absolute deviations relative to the mean, or on the prevalence of the darkest or lightest intensities. We investigated the statistical computation underlying this measure of contrast in the cat's lateral geniculate nucleus, which relays signals from retina to cortex. Borrowing a method from psychophysics, we recorded responses to white noise stimuli whose distribution of intensities was precisely varied. We varied the standard deviation, skewness, and kurtosis of the distribution of intensities while keeping the mean luminance constant. We found that gain depends strongly on the standard deviation of the distribution. At constant standard deviation, moreover, gain is invariant to changes in skewness or kurtosis. These findings held for both ON and OFF cells, indicating that the measure of contrast is independent of the range of stimulus intensities signaled by the cells. These results confirm the long-held assumption that contrast gain control computes root-mean-square contrast. They also show that contrast gain control senses the full distribution of intensities and leaves the relative responses of the different cell types unchanged. The advantages of this remarkably specific computation for visual processing are not entirely known.
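The three statistics manipulated in the experiment are standard moments of the intensity distribution. As a minimal sketch (the stimulus here is a hypothetical synthetic white-noise sample, not the stimuli used in the study), the quantities can be computed as follows: RMS contrast is the standard deviation of intensities divided by the mean, and skewness and kurtosis are the third and fourth standardized moments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical white-noise stimulus: intensities drawn around a fixed
# mean luminance (arbitrary units). Gaussian noise has zero skewness
# and kurtosis 3; the study additionally used non-Gaussian distributions
# to vary these moments at fixed standard deviation.
mean_luminance = 0.5
intensities = rng.normal(loc=mean_luminance, scale=0.1, size=100_000)

def rms_contrast(x):
    """Root-mean-square contrast: std of intensities relative to the mean."""
    return np.std(x) / np.mean(x)

def skewness(x):
    """Third standardized moment: asymmetry of the intensity distribution."""
    z = (x - np.mean(x)) / np.std(x)
    return np.mean(z**3)

def kurtosis(x):
    """Fourth standardized moment (3 for a Gaussian): tail weight."""
    z = (x - np.mean(x)) / np.std(x)
    return np.mean(z**4)

print(rms_contrast(intensities))  # ~0.2 for this sample
print(skewness(intensities))      # ~0 (symmetric distribution)
print(kurtosis(intensities))      # ~3 (Gaussian)
```

The experimental manipulation corresponds to changing the shape of the sampling distribution (e.g. drawing intensities from skewed or heavy-tailed distributions) while holding `np.mean(x)` and `np.std(x)` fixed; the reported finding is that gain tracks only the RMS contrast term.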