Zoom Brown Bag: Eye Movement During Object Search and Its Comparison to Free Viewing

Abstract: Eye movement is an observable behavior related to visual attention, which is commonly divided into two types: a bottom-up process driven solely by the visual input, and a top-down process influenced by the behavioral goal. These two types of attention are largely considered to correspond to the eye movements made during free viewing and visual search tasks, respectively. Recent developments in deep learning provide the opportunity to train models of fixation prediction and compare their performance. However, most visual search studies that have recorded eye movements have been small-scale efforts limited to dozens or a few hundred unique search images. There is no image dataset labeled with search fixations that is large and general enough for training deep network models, nor are there parallel datasets of search and free-viewing behavior that allow a direct comparison between the two tasks on the same images. To fill this gap, we created COCO-Search18 and COCO-FreeView, large-scale datasets of eye fixations from people either searching for a target object or freely viewing the same images. We characterized eye movement behaviors in both datasets and trained deep network models to predict fixations on a disjoint test set. Additionally, we collected COCO-CursorSearch, a third parallel dataset using the same images and 18 target categories as COCO-Search18, but with people using a “foveated” mouse-deblurring paradigm to manually search for targets. We validated our mouse-movement approximation of search fixations and will discuss the potential of online data collection for modeling attention. https://www.ski.org/users/yupei-chen

 
Improving Zoom accessibility for people with hearing impairments

People with hearing impairments often use lipreading and speechreading to improve speech comprehension. This approach is helpful, but only if the speaker’s face and mouth are clearly visible. For the benefit of people with hearing impairments on Zoom calls, please enable your device’s camera whenever you are speaking, and face the camera while you speak. (Feel free to disable your camera when you aren’t speaking.)