Zoom Brown Bag: Neural representation of physical and perceived environmental acoustics

Speaker: Haydée G. García-Lázaro, Postdoctoral Fellow

Abstract:

Most real-world hearing occurs in acoustically cluttered, reverberant environments, making perceptual segregation of sound sources critical. The reverberation signal carries spatial information about the environment that is of potential use to blind and visually impaired persons. Understanding the neural mechanisms of auditory scene analysis can help identify points of failure in high-level hearing loss and guide behavioral or technological therapeutic interventions. In this talk, I will present preliminary results from an EEG experiment characterizing the neural representation of the statistical regularities of real-world reverberant environments, using Multivariate Pattern Analysis (MVPA).

https://www.ski.org/users/haydee-garcia-lazaro
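For readers unfamiliar with MVPA, the general idea is to decode an experimental condition from the multi-channel pattern of neural activity at each time point, rather than analyzing channels one at a time. The following is a minimal illustrative sketch only, using simulated EEG-like data and scikit-learn; it is not the speaker's analysis pipeline, and all array shapes and parameters are assumptions.

```python
# Illustrative time-resolved MVPA decoding on simulated EEG-like data
# (hypothetical example; not the speaker's actual pipeline).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 80, 32, 50

# Two simulated "environment" conditions; a condition-specific
# pattern appears across channels in the late time window.
X = rng.normal(size=(n_trials, n_channels, n_times))
y = np.repeat([0, 1], n_trials // 2)
X[y == 1, :, 25:] += 0.5

# Decode the condition from the channel pattern at each time point.
scores = []
for t in range(n_times):
    clf = LogisticRegression(max_iter=1000)
    acc = cross_val_score(clf, X[:, :, t], y, cv=5).mean()
    scores.append(acc)

early = float(np.mean(scores[:25]))  # before pattern onset: near chance (0.5)
late = float(np.mean(scores[25:]))   # after onset: above chance
print(round(early, 2), round(late, 2))
```

In a real EEG study the trials, channels, and time points would come from epoched recordings, and the decoding time course would indicate when the neural representation of the stimulus property emerges.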

Improving Zoom accessibility for people with hearing impairments 
People with hearing impairments often use lipreading and speechreading to improve speech comprehension. This approach is helpful but only works if the speaker’s face and mouth are clearly visible. For the benefit of people with hearing impairments on Zoom calls, please enable your device’s camera whenever you are speaking on Zoom, and face the camera while you speak. (Feel free to disable your camera when you aren’t speaking.)