Wearable omnidirectional vision system for personal localization and guidance

Conference Paper


Autonomous navigation and recognition of the environment are fundamental human abilities, extensively studied in the computer vision and robotics fields. The expansion of low-cost wearable sensing provides interesting opportunities for assistance systems that augment people's navigation and recognition capabilities. This work presents our wearable omnidirectional vision system and a novel two-phase localization approach running on it. It runs state-of-the-art real-time visual odometry adapted to catadioptric images, augmented with topological-semantic information. The presented approach benefits from wearable sensors to improve the visual odometry results with a true-scaled solution. The wide field of view of the catadioptric vision system makes features last longer in the field of view and allows a more compact location representation, which facilitates topological place recognition. The experiments in this paper show promising ego-localization results in realistic settings, providing good true-scaled visual odometry estimation and recognition of indoor regions.

Conference Name

Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)

Year of Publication