Abstract
Indoor navigation is a significant challenge for people with visual impairments, who often lack access to visual cues such as informational signs, landmarks, and structural features that people with normal vision rely on for wayfinding. I will describe several approaches to recognizing and analyzing informational signs, such as Exit and restroom signs, in a building. These approaches will be incorporated into iNavigate, a smartphone app we are developing that provides indoor navigation assistance for blind individuals. The app combines a digital map of the environment with computer vision and inertial sensing to estimate the user’s location on the map in real time. The approaches can learn to recognize and analyze any sign from a small number of training images, and multiple types of signs can be processed simultaneously in each video frame. When a sign is recognized in the environment, we can estimate its distance from the camera, which provides useful information to help iNavigate estimate the user’s location on the map.
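As a rough illustration of how a recognized sign's distance might inform localization, the sketch below uses the standard pinhole camera model: if a sign's physical size and the camera's focal length (in pixels) are known, its apparent size in the image yields a distance estimate. This is a minimal, hypothetical example; the sign dimensions, focal length, and function names are placeholders and are not taken from the paper.

```python
# Hypothetical sketch: estimating a recognized sign's distance from the camera
# via the pinhole camera model. Values below are illustrative placeholders.

def estimate_sign_distance(sign_height_m: float,
                           sign_height_px: float,
                           focal_length_px: float) -> float:
    """Distance (meters) to a sign of known physical height, given its
    apparent height in pixels and the camera focal length in pixels."""
    return focal_length_px * sign_height_m / sign_height_px

# Example: a 0.20 m tall Exit sign appearing 40 px tall through a lens with a
# 1500 px focal length is roughly 7.5 m away.
distance = estimate_sign_distance(0.20, 40.0, 1500.0)
print(f"Estimated distance to sign: {distance:.1f} m")
```

An estimate like this, combined with the sign's known position on the digital map, can constrain where the user could be standing when the sign is detected.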