Abstract
Most real-world hearing occurs in acoustically cluttered, reverberant environments, making perceptual segregation of sound sources critical. The reverberant signal also carries spatial information about the environment that is of potential use to blind and visually impaired persons. Understanding the neural mechanisms of auditory scene analysis can help identify points of failure in higher-level hearing loss and guide behavioral or technological therapeutic interventions. In this talk, I will present preliminary results of an EEG experiment characterizing the neural representation of the statistical regularities of real-world reverberant environments using multivariate pattern analysis (MVPA). https://www.ski.org/users/haydee-garcia-lazaro