Abstract
Echolocation is an active-sensing strategy in which bats, dolphins, and some blind humans listen to self-generated echoes to build mental representations of their surroundings. Bats and dolphins echolocate at ultrasonic frequencies and can thus access higher-resolution spatial information than humans. When slowed to human-audible frequencies, ultrasonic echoes can be more informative to humans than unprocessed audible echoes. Here we explore how ultrasonic echoes can be optimized for human perception, using “Robin,” a user-customizable echolocation device that emits ultrasonic signals, records their echoes, and slows them for playback. In this study, we investigated how the “slowdown factor” (a linear time stretch with a corresponding downward frequency shift) applied to ultrasonic signals affects echoacoustic object perception in humans. In Experiment 1 (E1), 25 sighted novice adults performed a two-alternative forced-choice (2-AFC) auditory match-to-reference task, distinguishing Robin-processed echoes from four furniture objects, presented pairwise at three slowdown factors (10, 20, and 30). In Experiment 2 (E2), 15 sighted novice adults performed a cross-modal 2-AFC match-to-sample task, listening to the same object echoes from E1 and matching them to images of the corresponding objects. Participants performed above chance in all conditions of E1, but only in the highest-slowdown condition of E2. Greater slowdown improved performance on both the unimodal and the cross-modal task. These findings suggest that time-stretching the temporal structure of ultrasonic echoes enhances behaviorally relevant cues for object perception. Ongoing work aims to leverage Robin’s customizable signal parameter space to identify the acoustic cues best suited to different tasks and environmental features, paving the way for future echolocation-based aids for blind and visually impaired (BVI) individuals.
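For reference, a linear time stretch by a slowdown factor $k$ can be formalized as the mapping below; this is a minimal sketch of the general technique under ideal resampling, not necessarily the exact processing implemented in Robin. A recorded echo $s(t)$ played back slowed by factor $k$ becomes
\[
  s_k(t) = s\!\left(\tfrac{t}{k}\right), \qquad f \mapsto \tfrac{f}{k}, \qquad T \mapsto kT,
\]
so every frequency component $f$ is shifted down by a factor of $k$ while every duration $T$ is stretched by the same factor. As an illustrative (hypothetical) example, at $k = 20$ a 40 kHz echo component would land at 2 kHz, well within the human-audible range, with its fine temporal structure stretched twenty-fold.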
