March 2022

Zoom Brown Bag: The development of visual temporal processing
Event Date:

Abstract - The visual system must organize dynamic input into meaningful percepts across time, balancing stability against sensitivity to change. The Temporal Integration Window (TIW) has been hypothesized to underlie this balance: if two or more stimuli fall within the same TIW, they are integrated into a single percept; those that fall in different windows are segmented. Visual TIWs have mainly been studied in adults, who show average windows of about 65 ms; however, it is unclear how temporal windows develop throughout early childhood. Differences in TIWs can influence high-level cognitive and perceptual processes that require well-adapted timing, such as object individuation, apparent motion, action sequence perception, language processing, action planning, and pragmatic aspects of communication such as interactional synchrony. Because of the fundamental role temporal processing plays in visual perception, it is important to understand how TIWs change not only over typical development (TD) but also in neurodevelopmental disorders such as autism spectrum disorder (ASD). My work uncovered the developmental trajectory of visual temporal processing in young children with and without autism and mapped the development of peak alpha frequency, a potential neural correlate of visual temporal processing. https://www.ski.org/users/julie-freschl

Improving Zoom accessibility for people with hearing impairments

People with hearing impairments often use lipreading and speechreading to improve speech comprehension. This approach is helpful but only works if the speaker’s face and mouth are clearly visible. For the benefit of people with hearing impairments on Zoom calls, please enable your device’s camera whenever you are speaking on Zoom, and face the camera while you speak. (Feel free to disable your camera when you aren’t speaking.)

Zoom Brown Bag: "Solving Perplexing PowerPoint Puzzles: A presentation within a presentation."
Event Date:

Abstract - This seminar will cover many fun, sometimes perplexing, features of PowerPoint. Topics include cropping videos, animating text, sharpening shapes, taming icons, punching holes in shapes (!), working with layers, and managing morphs. All of these will be demonstrated with reference to a PowerPoint presentation about what parents of children with CVI have reported regarding their children’s ability to read – research coming out of Dr. Arvind Chandna’s lab.


Zoom Brown Bag: Eye Movement During Object Search and Its Comparison to Free Viewing
Event Date:

Abstract - Eye movement is an observable behavior related to visual attention, which can be divided into two types: a bottom-up process based solely on the visual input, and a top-down process influenced by the behavioral goal. These two types of attention are widely considered to correspond to the eye movements made during free viewing and visual search tasks, respectively. Recent developments in deep learning make it possible to train fixation-prediction models and compare their performance. However, most visual search studies that have recorded eye movement have been small-scale efforts limited to dozens or a few hundred unique search images. There is no image dataset labeled with search fixations that is large and general enough for training deep network models, nor are there parallel datasets of search and free-viewing behavior that allow a direct comparison between these two tasks on the same images. To fill this gap, we created COCO-Search18 and COCO-FreeView, large-scale datasets of eye fixations from people either searching for a target object or freely viewing the same images. We characterized eye movement behaviors in both datasets and trained deep network models to predict fixations on a disjoint test dataset. We also collected COCO-CursorSearch, a third parallel dataset using the same images and 18 target categories as COCO-Search18, but with people using a “foveated” mouse-deblurring paradigm to manually search for targets. We validated our mouse-movement approximation of search fixations and will discuss the potential of online data collection for modeling attention. https://www.ski.org/users/yupei-chen


Zoom Meeting: Career Paths Outside of Academia
Event Date:

Abstract - Vision Research at Apple

12:00 – 12:35 pm

Speakers:

Andrew Watson - Distinguished Chief Vision Scientist, Apple

Laura Walker - Sr. Engineer, Visual Health, Apple

Apple brings together the smartest and most talented people to make incredible products that impact lives around the globe. The role of vision science is critical to the visual experience our displays provide. Our team works with almost every team across Apple to ensure our displays and algorithms are delighting the human visual system. In this lunchtime chat, we will discuss ways in which vision and human perception experts impact the products we use every day.

Panel: Transitioning to Industry

12:35 – 1:15 pm

Panelists:

Zheng Ma - Research Scientist, Meta (Facebook)

Natalie Stepien-Bernabe - Human Factors Senior Scientist, Exponent

Panelists with PhDs in vision science and psychology will discuss the paths they took to develop their careers in the field of technology and scientific consulting. After sharing their experience, panelists will take questions from the audience.


Zoom Brown Bag: Modeling the impairment of smooth pursuit eye movements in macular degeneration
Event Date:

Abstract - Age-related macular degeneration (AMD) is the most prevalent cause of central visual field loss. Since the fovea (oculomotor locus) is often impaired, individuals with AMD typically have difficulties with saccadic and smooth pursuit eye movements (Verghese, Vullings, & Shanidze, 2021). We propose that smooth pursuit eye movements are impaired in macular degeneration due to two factors: 1) the transient disappearance of the target into the scotoma, and 2) noise that depends on the eccentricity of the oculomotor locus from the target. To assess this claim, we measured performance in a perceptual baseball task in which observers had to determine whether a target would cross or miss a rectangular region (the plate) after being extinguished (Kim, Badler, & Heinen, 2005), when instructed to either fixate a marker or smoothly track the target. We recorded eye movements of 4 AMD eyes and 6 control eyes with simulated scotomata (matching those of individual AMD participants) during the task. We found that controls with simulated scotomata could discriminate strikes from balls better than AMD participants could, particularly in the smooth pursuit condition. We also developed a model that predicted performance on the task using the visible portions of the target trajectory given the scotoma, and position uncertainty given the eccentricity of the eye from the target. The model showed a similar trend to the participant results, with better discrimination for simulations using control eye position data (foveal oculomotor locus) than for AMD data (peripheral oculomotor loci). However, the model's discrimination performance was generally better than actual participant performance. These findings suggest that while the disappearance of the target into the scotoma and the noise due to the eccentricity of the peripheral oculomotor locus from the target both affect perceptual discrimination in AMD, these factors only partially account for the impairments. https://www.ski.org/users/jason-rubinstein
