An architecture of visual processing revealed by eye movements
Speaker: Jeff Mulligan, NASA Ames, Mountain View
Meeting room: Room 204 - Main Conference Room
Eye movements provide a high-bandwidth, multidimensional stream of data, a veritable firehose compared to traditional psychophysics. In particular, eye movements (and motor responses in general) provide information about the fine-scale time course of visual processing. For example, the perceived direction of motion in the classic "barber-pole illusion" is determined by both the carrier grating and the window shape, but these two parameters influence eye movements with different latencies.

Neglecting ocular torsion, there are four dimensions of eye movement: horizontal and vertical version (in which the two eyes move together), and horizontal and vertical vergence (in which they move in opposite directions). I will present a novel technique for assessing latency by cross-correlating eye and stimulus velocities while a subject tracks a randomly moving target. Of the four types of motion, only vertical vergence cannot be performed under voluntary control, and certain stimuli (all having relatively long latencies) are incapable of evoking it. In another experiment, we instructed observers to track one of two targets, and measured weak but reliable responses to the unattended target, in which the long-latency component of the response is abolished.

Our results are consistent with a system containing two distinct processes: a fast reflexive process that responds to a restricted class of stimuli, and a slower voluntary process capable of following anything that can be seen, but incapable of controlling vertical vergence.
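The cross-correlation technique mentioned in the abstract can be illustrated with a short simulation. This is a minimal sketch, not the speaker's actual analysis code: the sample rate, noise level, and simulated latency are assumptions chosen for illustration. The idea is that if the eye velocity trace is roughly a delayed, noisy copy of the stimulus velocity, the lag at which their cross-correlation peaks estimates the tracking latency.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 240            # assumed eye-tracker sample rate (Hz)
latency_ms = 150    # simulated oculomotor latency (hypothetical value)
lag_samples = int(latency_ms / 1000 * fs)

# Stimulus velocity for a randomly moving target.
stim_vel = rng.standard_normal(fs * 20)

# Eye velocity: the stimulus velocity delayed by the latency, plus noise.
eye_vel = np.roll(stim_vel, lag_samples) + 0.3 * rng.standard_normal(stim_vel.size)
eye_vel[:lag_samples] = 0.0  # no response before the stimulus begins

def latency_by_xcorr(stim, eye, fs, max_lag_s=0.5):
    """Return the lag (seconds) at which eye velocity best matches stimulus velocity."""
    max_lag = int(max_lag_s * fs)
    lags = np.arange(0, max_lag + 1)
    # Correlate eye(t) with stim(t - lag) for each candidate lag.
    corrs = [np.dot(eye[lag:], stim[:stim.size - lag]) for lag in lags]
    return lags[int(np.argmax(corrs))] / fs

est = latency_by_xcorr(stim_vel, eye_vel, fs)
print(f"estimated latency: {est * 1000:.0f} ms")
```

In practice each eye-movement channel (horizontal/vertical version and vergence) would be correlated separately against the corresponding stimulus dimension, yielding per-channel latency estimates like those the talk compares.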