The retinal images that people experience depend on the visual scene and where within the scene they look. It
has been argued that many properties of the visual system conform to the statistics of this stream of images.
But to test this, the statistics must be measured. We did this by developing an eye-and-scene tracker that
measures gaze direction and scene distances. Because it is mobile, participants could perform natural tasks
while we collected the data. In binocular vision, a fundamental problem is matching image points in
the left eye with the corresponding points in the right eye. We know that the visual system restricts
this search to a small range of binocular disparities. We found that the distribution of disparities encountered in the
environment is matched to this smaller search range. We also found that the natural distribution explains the
perception of some ambiguous stereograms and that the disparity preferences of cortical neurons in primates
are consistent with the distribution. The retinal image is blurred when the viewer focuses at one distance
while scene points lie at another. We examined the statistics of blur in natural viewing and found that blur
magnitude increases with increasing distance on the retina from the fovea. Interestingly, blur below the fovea is
generally due to scene points nearer than fixation; blur above the fovea is mostly due to points farther than
fixation. Additionally, large blurs are generally caused by points farther than fixation rather than nearer. These
data predict how ambiguously blurred stimuli should appear, and the predictions were borne out by experiment.
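The dependence of blur on the mismatch between fixation distance and scene-point distance can be sketched with the standard thin-lens approximation; this is an illustrative computation, not the study's method, and the 4 mm pupil diameter is an assumed value:

```python
import math

def defocus_diopters(fixation_m, point_m):
    """Defocus magnitude (diopters) when the eye focuses at fixation_m
    but the scene point lies at point_m (both in meters)."""
    return abs(1.0 / fixation_m - 1.0 / point_m)

def blur_circle_arcmin(fixation_m, point_m, pupil_mm=4.0):
    """Approximate angular diameter of the retinal blur circle (arcmin).

    Thin-lens approximation: angle (radians) ~ pupil diameter (meters)
    times defocus (diopters). The 4 mm pupil is an illustrative assumption.
    """
    beta_rad = (pupil_mm / 1000.0) * defocus_diopters(fixation_m, point_m)
    return math.degrees(beta_rad) * 60.0

# A point at the fixation distance is in focus:
print(blur_circle_arcmin(1.0, 1.0))  # → 0.0
# A point well beyond fixation produces substantial blur:
print(blur_circle_arcmin(1.0, 4.0) > blur_circle_arcmin(1.0, 2.0))  # → True
```

Note that defocus in diopters, not metric distance, is the natural unit here: a fixed metric offset produces far more blur at near fixation distances than at far ones.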
Our data confirm that properties of the binocular visual system conform to the statistics of retinal images
encountered in everyday activities and that the interpretation of blur is predicted by regularities in the statistics
of retinal images.
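The disparity-matching problem described above can be illustrated with the standard small-angle approximation for the horizontal disparity of a point on the midline; the 62 mm interocular distance and the sign convention are illustrative assumptions, not values from the study:

```python
import math

def horizontal_disparity_deg(fixation_m, point_m, iod_m=0.062):
    """Approximate horizontal binocular disparity (degrees) of a midline
    point at point_m for a viewer fixating a midline point at fixation_m.

    Small-angle approximation: disparity ~ IOD * (1/z_point - 1/z_fix).
    Convention here: positive = crossed (nearer than fixation),
    negative = uncrossed (farther). The 62 mm interocular distance is
    an illustrative assumption.
    """
    disparity_rad = iod_m * (1.0 / point_m - 1.0 / fixation_m)
    return math.degrees(disparity_rad)

# A point at the fixation distance has zero disparity:
print(horizontal_disparity_deg(1.0, 1.0))  # → 0.0
# Nearer points are crossed (positive), farther points uncrossed (negative):
print(horizontal_disparity_deg(1.0, 0.5) > 0)  # → True
print(horizontal_disparity_deg(1.0, 2.0) < 0)  # → True
```

Because disparity falls off with the reciprocal of distance, most scene points in natural viewing yield only small disparities, which is consistent with the narrow search range the abstract describes.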