A Kalman Filter model of echo-guided head movement

Presentation

Abstract

Human echolocation relies on the dynamic integration of auditory feedback and motor behaviour to localise sound-reflecting targets. Inspired by behavioural paradigms in which blind participants use echoacoustic clicks to localise objects, we developed a computational model to explore predictive updating during echo-guided target localisation. Specifically, we implemented a Kalman Filter (KF) as a control policy that estimates horizontal target azimuth from echo measurements and adaptively adjusts head orientation in one-dimensional space. Although not a direct model of neural computation, the KF serves as a dynamic state estimator simulating how noisy external cues can reduce spatial uncertainty through action. Measurement reliability was modulated by the angle between head direction and target azimuth, reflecting the directionality of echolocation emissions. This was modelled as a scaled cardioid function of azimuthal eccentricity, where larger head-target angles yield noisier echo signals.
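To make the control loop concrete, the sketch below implements a minimal scalar Kalman Filter in Python with measurement noise scaled by a cardioid function of head-target eccentricity. The function names (cardioid_noise_var, kf_echo_trial), the noise parameters, and the prior values are illustrative assumptions, not the model's actual implementation.

```python
import numpy as np

def cardioid_noise_var(head_angle, target_azimuth, base_var=1.0, scale=4.0):
    """Assumed noise model: echo reliability falls off with head-target
    eccentricity via a scaled cardioid (gain 1 on-axis, 0 at 180 degrees)."""
    ecc = target_azimuth - head_angle
    gain = 0.5 * (1.0 + np.cos(ecc))          # cardioid directivity
    return base_var + scale * (1.0 - gain)    # noisier echoes off-axis

def kf_echo_trial(target_azimuth, n_clicks=20, q=0.01, seed=0):
    """One simulated trial: a scalar KF estimates target azimuth from echo
    measurements, and the head is re-oriented to the current estimate."""
    rng = np.random.default_rng(seed)
    x_hat, p = 0.0, np.pi**2        # prior estimate (straight ahead) and variance
    head = 0.0                      # current head orientation (rad)
    errors = []
    for _ in range(n_clicks):
        # Predict: static target, so only process noise inflates uncertainty
        p += q
        # Measure: echo returns a noisy azimuth, noisier at large eccentricity
        r = cardioid_noise_var(head, target_azimuth)
        z = target_azimuth + rng.normal(0.0, np.sqrt(r))
        # Update the estimate with the Kalman gain
        k = p / (p + r)
        x_hat += k * (z - x_hat)
        p *= (1.0 - k)
        # Act: turn the head toward the current estimate
        head = x_hat
        errors.append(abs(target_azimuth - head))
    return np.array(errors)
```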

We tested the KF-guided model under two conditions: a test condition with clicks and a control condition without clicks. Simulated data from multiple participants revealed that the KF model achieved localisation errors and convergence rates comparable to real-world human data, whereas the control condition failed to converge. Learning dynamics showed consistent improvement across trials in the KF condition that was absent in the control model. Curve fits to trial error profiles revealed reliable convergence dynamics.
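A minimal sketch of how the two conditions and the convergence fits could be reproduced, building on the kf_echo_trial sketch above; the exponential decay form, the 40-degree target, and all parameter values are assumptions for illustration, not the fits reported here.

```python
import numpy as np
from scipy.optimize import curve_fit

def exp_decay(trial, a, tau, c):
    """Assumed convergence profile: error decays exponentially toward an asymptote."""
    return a * np.exp(-trial / tau) + c

target = np.deg2rad(40)

# Test condition: clicks available, so the KF receives echo measurements each step.
err_click = kf_echo_trial(target_azimuth=target, n_clicks=20)

# Control condition: no clicks, hence no measurement updates; the head stays at the prior.
err_no_click = np.full(20, abs(target - 0.0))

# Fit the assumed exponential form to the per-trial error profile of the test condition.
trials = np.arange(20)
params, _ = curve_fit(exp_decay, trials, err_click, p0=[err_click[0], 5.0, 0.0])
print("fitted amplitude, time constant, asymptote:", params)
print("control mean error (rad):", err_no_click.mean())  # stays flat without clicks
```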

These findings suggest that simple predictive computational approaches can reproduce key aspects of echo-guided sensorimotor learning, offering a computational foundation for interrogating the mechanisms of echolocation ability in humans, and a framework for developing more complex biologically grounded models.

Conference Name

International Multisensory Research Forum (IMRF)
Durham, UK

Year of Publication

2025