• Active

    A Computer Vision-Based Indoor Wayfinding Tool

    We are creating a smartphone-based indoor navigation app that estimates the user's location in an indoor environment and guides them to a desired destination.

  • Active

    Acoustic Cues for Wayfinding

    This project aims to provide a detailed analysis of the environmental acoustic cues that help some blind people navigate successfully.

  • Active

    Adaptive Visual Strategies for Individuals with Macular Degeneration

    In this project we aim to gain a better understanding of the visual strategies people use to gather information about the world.

  • Active

    Advanced Spatiomotor Rehabilitation for Navigation in Blindness & Visual Impairment

    Successful navigation requires the development of an accurate and flexible mental, or cognitive, map of the navigational space and of the route trajectory required to travel from the current to the target location. The Cognitive-Kinesthetic (C-K) Rehabilitation Training that we have developed in the preceding period utilizes a unique form of blind memory-guided drawing to develop cognitive mapping to a high level of proficiency.

  • Algorithmic Automated Description (AAD)

    Algorithmic Automated Description (AAD) uses existing machine-vision techniques to automate specific aspects of description, such as camera motion, scene changes, face identification, and the reading of printed text.

  • Active

    Alleviating interocular suppression by high-attention demand training in amblyopia


    The goal of this project is to test the hypothesis that training patients to pay more attention to input from the amblyopic eye can overcome interocular suppression and thereby treat amblyopia.


  • Completed

    Assessment of Speechreading with Dual Sensory Loss: Visual and Hearing Impairments

    The purpose of this research study is to test participants who have various combinations of hearing and vision loss on their lipreading, visual, and auditory skills, in order to understand the relationships between lipreading ability and visual impairment.

  • Audio/Tactile BART Station Maps

    This collaborative project between Smith-Kettlewell and the San Francisco LightHouse applies smartpen-based audio/tactile graphics tools to improve orientation and wayfinding by travelers with visual disabilities in and around unfamiliar transit stations.

  • Audio/Tactile Graphics Using LiveScribe Smartpen

    Active from 2007 to 2010, this project was the first exploration of using a digital smartpen as a platform for creating and presenting audio/tactile graphics – a system of using touch-based audio to annotate tactile figures to improve access to graphical information for the blind.

  • Completed
    BLaDE (Barcode Localization and Decoding Engine)

    BLaDE (Barcode Localization and Decoding Engine) is an Android smartphone app designed to help blind or visually impaired users find and read product barcodes.
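BLaDE's actual localization and decoding pipeline is not described here, but one small, standard piece of reading a product barcode is verifying its check digit. The sketch below validates an EAN-13 code using the public checksum rule (alternating weights of 1 and 3); the function name `ean13_valid` is illustrative and not part of BLaDE.

```python
def ean13_valid(code: str) -> bool:
    """Return True if `code` is a 13-digit string with a valid EAN-13 checksum.

    EAN-13 weights digits alternately by 1 and 3 from the left;
    a valid code's weighted sum is a multiple of 10.
    """
    if len(code) != 13 or not code.isdigit():
        return False
    total = sum(int(d) * (1 if i % 2 == 0 else 3) for i, d in enumerate(code))
    return total % 10 == 0
```

A decoder can use a check like this to reject misreads caused by blur or partial occlusion before reporting a result to the user.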