Integration and Segregation

Traditionally, smooth pursuit research has examined how eye movements are generated to follow small, isolated targets that fit within the fovea. Objects in natural scenes, however, are often larger and extend into the peripheral retina, and they have components that move in different directions or at different speeds (e.g., wings, legs). To generate a single velocity command for smooth pursuit, motion information from these components must be integrated. At the same time, it may be necessary to attend to features of the object while pursuing it. Our goal is to understand how attention is allocated during pursuit of natural objects.

In some experiments, we use random dot cinematograms (RDCs) as pursuit stimuli. Our RDCs consist of a pattern of randomly spaced dots that usually move at a single velocity and in the same direction. We find that whereas pursuit of a foveal target demands attention and hinders performance on simultaneous attention-demanding tasks, pursuing an RDC improves pursuit quality, reduces saccades, and also improves performance on secondary tasks. This indicates that large stimuli release attention from pursuit, facilitating inspection of the object's features.

In other experiments, observers pursue a multiple object tracking (MOT) cloud, which consists of a number of dots that move in random directions relative to one another within a virtual window that translates across the screen. When observers attentively track the targets within the MOT cloud, simultaneous pursuit of the cloud does not hinder performance on the attentive tracking task, suggesting that spatio-temporal integration of the individual dot velocities for pursuit also proceeds without attention, leaving attention free for the segregation process.
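
To make the two stimulus classes concrete, the sketch below generates dot positions for both: an RDC in which all dots share one velocity, and an MOT cloud whose dots drift in random directions inside a virtual window that itself translates across the screen. This is a minimal illustration under assumed parameters, not the stimulus code used in these experiments; all numeric values (dot counts, field and window sizes, speeds, frame rate) are arbitrary placeholders.

```python
import numpy as np

def rdc_frames(n_dots=100, n_frames=120, field=(20.0, 20.0),
               velocity=(5.0, 0.0), dt=1 / 60, seed=None):
    """Random dot cinematogram: randomly spaced dots that all move
    at one velocity in one direction (illustrative parameters only)."""
    rng = np.random.default_rng(seed)
    half = np.asarray(field) / 2
    pos = rng.uniform(-1.0, 1.0, size=(n_dots, 2)) * half   # positions in deg
    v = np.asarray(velocity)                                 # deg/s, shared by all dots
    frames = []
    for _ in range(n_frames):
        frames.append(pos.copy())
        pos = pos + v * dt
        # wrap dots that leave the field so dot density stays constant
        pos = (pos + half) % np.asarray(field) - half
    return frames

def mot_cloud_frames(n_dots=8, n_frames=120, window=(10.0, 10.0),
                     window_velocity=(5.0, 0.0), dot_speed=3.0,
                     dt=1 / 60, seed=None):
    """MOT cloud: dots move in random directions relative to one another
    inside a virtual window that translates across the screen."""
    rng = np.random.default_rng(seed)
    half = np.asarray(window) / 2
    center = np.zeros(2)                                     # window center (deg)
    rel_pos = rng.uniform(-1.0, 1.0, size=(n_dots, 2)) * half
    theta = rng.uniform(0, 2 * np.pi, size=n_dots)           # random headings
    rel_v = dot_speed * np.stack([np.cos(theta), np.sin(theta)], axis=1)
    frames = []
    for _ in range(n_frames):
        frames.append(center + rel_pos)                      # screen coordinates
        center = center + np.asarray(window_velocity) * dt   # window translation
        rel_pos = rel_pos + rel_v * dt                       # independent dot motion
        # bounce dots off the window edges so they stay inside the aperture
        out = np.abs(rel_pos) > half
        rel_v[out] *= -1
        rel_pos = np.clip(rel_pos, -half, half)
    return frames
```

Wrapping RDC dots at the field edges keeps dot density constant, and bouncing MOT dots off the window keeps them within the translating aperture; other implementations might instead regenerate dots or give them limited lifetimes.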