Stereoacuity thresholds, measured with bar targets, rise as the absolute disparity of the bars is increased. One explanation for this rise is that, as the bars are moved away from the fixation plane, the stereo system uses coarser mechanisms to encode the bars' disparity; coarse mechanisms are insensitive to small changes in target disparity, resulting in higher thresholds. To test this explanation, we measured stereoacuity with a 6-degree-wide, 3-cpd grating presented in a rectangular envelope. We varied the disparity of the grating and its edges (envelope) parametrically from 0 to 20 arcmin (i.e., through one full period). To force observers to make judgments based on carrier disparity, we then varied the interocular phase incrementally from trial to trial while keeping edge disparity fixed for a given block of trials. The pedestal phase disparity of the grating necessarily cycles through 360 degrees, back to zero disparity, as the edge disparity increases monotonically from 0 to 20 arcmin. Unlike mechanisms that respond to bars, the mechanism that responds to the interocular phase disparity of the grating should have the same sensitivity at 20 arcmin of disparity (360 degrees of phase) as it has at zero disparity. So, if stereoacuity were determined by the most sensitive mechanism, thresholds should oscillate with the pedestal phase disparity. However, these gratings are perceived in depth at the disparity of their edges. If stereoacuity were instead determined by the stereo matching operations that generate perceived depth, thresholds should rise monotonically with increasing edge disparity. We found that the rise in grating thresholds with increasing edge disparity was monotonic and virtually identical to the rise in thresholds observed for bars. Stereoacuity is contingent on stereo matching.
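The period arithmetic underlying the design can be sketched as follows. This is a minimal illustration, not code from the study; the helper name is ours. It shows why, for a 3-cpd carrier (period = 60/3 = 20 arcmin), the interocular phase disparity wraps back to zero as edge disparity reaches one full period:

```python
# Hypothetical helper (not from the study): maps edge disparity to the
# carrier's interocular phase disparity for a grating of given spatial
# frequency. A 3-cpd grating has a 60/3 = 20 arcmin period, so phase
# cycles through 360 degrees (back to 0) as edge disparity goes 0 -> 20.
def phase_disparity_deg(edge_disparity_arcmin, spatial_freq_cpd=3.0):
    period_arcmin = 60.0 / spatial_freq_cpd  # period of the carrier, in arcmin
    # Phase is defined modulo one full period of the grating.
    return (edge_disparity_arcmin % period_arcmin) / period_arcmin * 360.0

# Edge disparity rises monotonically; carrier phase disparity wraps around.
for d in (0, 5, 10, 15, 20):
    print(f"edge disparity {d:2d} arcmin -> phase disparity {phase_disparity_deg(d):5.1f} deg")
```

This wrap-around is what dissociates the two predictions: a phase-sensitive mechanism sees identical stimuli at 0 and 20 arcmin of edge disparity, while matching-based depth tracks the edges monotonically.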