We aim to better understand how people perceive, interact with, and move through the world, especially when vision is unavailable. To this end, the lab studies perception and sensory processing in multiple sensory modalities, with particular interests in echolocation and braille reading in blind persons. We are also interested in mobility and navigation, including assistive technology using nonvisual cues. These are wide-ranging topics, which we approach using a combination of psychophysical, neurophysiological, engineering, and computational tools.
We aim to investigate the nature of auditory perception and how the brain learns rules for interpreting sounds.
The world is rich in sounds and their echoes from reflecting surfaces, making acoustic reverberation a ubiquitous part of everyday life. We usually think of reverberation as a nuisance to overcome (it makes understanding speech and locating sound sources harder), but it also carries useful...
In this project we measured the neural correlates of grouping and perception in different types of amblyopia. We found that strabismus generates significant abnormalities at both early and later stages of cortical processing and, importantly, that these abnormalities are independent of visual-acuity deficits.