Optimizing Echoacoustics: An online perceptual study (Under construction)
For blind people, producing sounds from sources like footsteps, cane taps, or tongue clicks and listening to the echoes can be a useful source of information about the world around them. Like bats or dolphins, the most skilled of these echolocators gain remarkably precise information from these echoes. We are exploring the potential of a wearable device to produce and modify echoes, with the goal of making echolocation easier and more useful for millions more people.
Goals of the study
We hope to learn more about how we can transform ultrasound echoes while still preserving the information they carry that makes them useful for navigating the world. This may point us in the direction of a more useful echolocation device, and help us understand more about how humans perceive sound in general.
What will you do?
In our online study, we will play you echoes generated from recordings or computer simulations and ask you to make a judgment about each one, for example, which direction it seems to be coming from. For this, you will need a pair of stereo headphones. A session will take about 20 minutes to complete, and you will be paid $5 for your participation.
How to participate
The experiment is hosted on Prolific.co, so you will need a Prolific account.
Email the lead researcher at firstname.lastname@example.org.
About the lab
We aim to better understand how people perceive, interact with, and move through the world, especially when vision is unavailable. To this end, the lab studies perception and sensory processing in multiple sensory modalities, with particular interests in echolocation and braille reading in blind persons. We are also interested in mobility and navigation, including assistive technology using nonvisual cues. These are wide-ranging topics, which we approach using a combination of psychophysical, neurophysiological, engineering, and computational tools.