It has been suggested that the first steps in visual processing strive to compress as much information as possible about the outside world into the limited dynamic range of the visual channels. Here I compare measured neural images with theoretical calculations based on maximizing information, taking into account the statistical structure of natural images. Neural images were obtained by scanning an image while recording from a second-order neuron in the fly visual system. Over a 5.5-log-units-wide range of mean intensities, experiment and theory correspond well. At high mean intensities, redundancy in the image is reduced by spatial and temporal antagonism. At low mean intensities, spatial and temporal low-pass filtering combats noise and increases signal reliability.
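The qualitative shift described above — antagonistic (band-pass) filtering at high mean intensities, low-pass filtering at low mean intensities — can be sketched with a simple model. The sketch below is illustrative and not the paper's actual calculation: it assumes a 1/f² power spectrum for natural images and flat (white) photoreceptor noise, and takes the filter to be whitening combined with Wiener-style noise suppression, F(f) = √S(f) / (S(f) + N). As the noise level N rises (i.e., as mean intensity falls), the filter's peak moves from high to low spatial frequency.

```python
import numpy as np

# Illustrative sketch (assumed model, not the paper's calculation):
# natural-image power spectrum S(f) ~ 1/f^2, flat photoreceptor
# noise N, and a filter of the form
#   F(f) = sqrt(S(f)) / (S(f) + N)
# i.e. a whitening stage tempered by Wiener-style noise suppression.

def optimal_filter(f, noise):
    """Amplitude of the sketched filter at spatial frequency f."""
    S = 1.0 / f**2                  # assumed 1/f^2 image spectrum
    return np.sqrt(S) / (S + noise)

f = np.linspace(0.01, 10.0, 1000)   # spatial frequency axis (arbitrary units)

bright = optimal_filter(f, noise=0.01)  # high mean intensity: low relative noise
dim    = optimal_filter(f, noise=10.0)  # low mean intensity: high relative noise

# High intensity: the filter peaks at high frequency (band-pass,
# i.e. spatial antagonism that reduces redundancy).
# Low intensity: the peak moves to low frequency (low-pass,
# averaging over space to combat noise).
peak_bright = f[np.argmax(bright)]
peak_dim    = f[np.argmax(dim)]
print(peak_bright > peak_dim)  # True for this model
```

For this particular model the peak sits at f = 1/√N, which makes the intensity dependence explicit: a 100-fold increase in noise power shifts the filter's preferred frequency down by a factor of 10.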