Zoom Brown Bag: An Indoor Navigation System using Computer Vision and Sign Recognition

Ali Cheraghi, Postdoctoral Fellow

Abstract - Indoor navigation is a significant challenge for people with visual impairments, who often lack access to visual cues such as informational signs, landmarks, and structural features that people with normal vision rely on for wayfinding. I will describe various approaches to recognizing and analyzing informational signs, such as Exit and restroom signs, in a building. These approaches will be incorporated in iNavigate, a smartphone app we are developing that provides accessible indoor navigation assistance. The app combines a digital map of the environment with computer vision and inertial sensing to estimate the user's location on the map in real time. These approaches can learn to recognize and analyze any sign from a small number of training images, and multiple types of signs can be processed simultaneously in each video frame. When a sign is recognized in the environment, we can estimate the sign's distance from the camera, which provides useful information to help iNavigate estimate the user's location on the map.
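The abstract does not specify how the sign-to-camera distance is computed; one common approach, sketched below under assumed inputs, is the pinhole camera model, where distance follows from a sign's known physical height, its apparent height in the image, and the camera's focal length in pixels. The function name and parameters here are illustrative, not taken from iNavigate.

```python
# Hedged sketch: pinhole-model distance estimation for a recognized sign.
# Assumptions (not from the talk): the sign's real-world height is known
# (e.g., from the building's signage standard), and the camera's focal
# length in pixels is available from calibration.

def estimate_distance(focal_length_px: float,
                      real_height_m: float,
                      pixel_height: float) -> float:
    """Return the distance in metres to an object of known height,
    given its apparent height in pixels in the image."""
    if pixel_height <= 0:
        raise ValueError("pixel height must be positive")
    # Similar triangles: distance / real_height = focal_length / pixel_height
    return focal_length_px * real_height_m / pixel_height

# Example: a 0.30 m Exit sign spanning 60 px, seen by a camera with a
# 1000 px focal length, is about 5 m away.
print(estimate_distance(1000.0, 0.30, 60.0))  # 5.0
```

An estimate like this, combined with the sign's known position on the digital map, constrains where the user can be, which is how a localization system can fuse sign detections with inertial sensing.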


Improving Zoom accessibility for people with hearing impairments

People with hearing impairments often use lipreading and speechreading to improve speech comprehension. This approach is helpful but only works if the speaker’s face and mouth are clearly visible. For the benefit of people with hearing impairments on Zoom calls, please enable your device’s camera whenever you are speaking on Zoom, and face the camera while you speak. (Feel free to disable your camera when you aren’t speaking.)