Projects

  • Completed

    Audio/Tactile Graphics Using LiveScribe Smartpen

Active from 2007 to 2010, this project was the first exploration of using a digital smartpen as a platform for creating and presenting audio/tactile graphics – a system in which touch-based audio annotations on tactile figures improve access to graphical information for blind users.

  • Completed

    Crowd-Sourced Description for Web-Based Video (CSD)

    The Descriptive Video Exchange Project focuses on crowd-sourced techniques for describing DVD media.

  • Completed

    Algorithmic Automated Description (AAD)

Algorithmic automated description (AAD) uses existing machine-vision techniques to automate specific aspects of description, such as camera motion, scene changes, face identification, and the reading of printed text. Such events can be identified by computer routines that automatically add annotations to the video.
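One of the events mentioned above, a scene change, can be flagged automatically by comparing consecutive video frames. The sketch below is an illustrative example only, not the project's actual algorithm: it compares normalized grayscale histograms of adjacent frames and reports a cut when the histogram jumps sharply (the function name and threshold are assumptions for illustration).

```python
import numpy as np

def detect_scene_changes(frames, threshold=0.5):
    """Return indices of frames whose intensity histogram differs
    sharply from the previous frame's (a crude scene-cut detector)."""
    changes = []
    prev_hist = None
    for i, frame in enumerate(frames):
        hist, _ = np.histogram(frame, bins=32, range=(0, 256))
        hist = hist / hist.sum()
        if prev_hist is not None:
            # L1 distance between consecutive histograms; a large jump
            # suggests a cut worth annotating in the video timeline.
            if np.abs(hist - prev_hist).sum() > threshold:
                changes.append(i)
        prev_hist = hist
    return changes

# Synthetic "video": 5 dark frames, then 5 bright frames (a cut at index 5)
dark = [np.full((48, 64), 30, dtype=np.uint8)] * 5
bright = [np.full((48, 64), 220, dtype=np.uint8)] * 5
print(detect_scene_changes(dark + bright))  # → [5]
```

A real pipeline would operate on decoded video frames and combine several such detectors (motion, faces, text) before emitting annotations.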

  • Completed

    Novel Method to Teach Scotoma Awareness

This project aims to improve visual function in individuals with age-related macular degeneration (AMD). AMD is associated with central field loss that cannot be corrected optically.

  • Completed

    Impact of Eye Movements on Reach Performance

    Aim 2 of Reaching with Central Field Loss

  • Completed

    Target Selection in the Real World

    Attention and Segmentation

  • Completed
    [Image: Zoomed-in view of appliance display partially obscured by glare]

    Display Reader

    The goal of the Display Reader project is to develop a computer vision system that runs on smartphones and tablets to enable blind and visually impaired persons to read appliance displays.
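A typical first step in reading an appliance display with computer vision is separating the lit characters from the background. The snippet below is a minimal, hypothetical sketch of that step — not the Display Reader system itself — using Otsu's classic method to pick a binarization threshold from the image histogram (the function name is an assumption for illustration).

```python
import numpy as np

def otsu_threshold(gray):
    """Pick the intensity threshold that maximizes between-class
    variance (Otsu's method), separating display glyphs from background."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256))
    total = hist.sum()
    sum_all = np.dot(np.arange(256), hist)
    w0 = 0          # pixel count in the "background" class so far
    sum0 = 0.0      # intensity sum in the background class so far
    best_t, best_var = 0, 0.0
    for t in range(256):
        w0 += hist[t]
        sum0 += t * hist[t]
        if w0 == 0 or w0 == total:
            continue
        w1 = total - w0
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var = w0 * w1 * (m0 - m1) ** 2  # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Synthetic display patch: dark background (40) and bright segments (200)
patch = np.concatenate([np.full(50, 40), np.full(50, 200)]).astype(np.uint8)
t = otsu_threshold(patch)  # falls between the two intensity groups
```

Glare, as shown in the project image, is exactly what breaks a single global threshold — which is why a deployed system would need adaptive, region-local processing on top of this.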

  • Completed
    [Image: BLaDE (Barcode Localization and Decoding Engine) smartphone app in action]

    BLaDE

    BLaDE (Barcode Localization and Decoding Engine) is an Android smartphone app designed to enable a blind or visually impaired user to find and read product barcodes.
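The "localization" half of a barcode reader exploits the fact that a 1-D barcode is a dense run of vertical edges. The sketch below is an illustrative stand-in for that idea, not BLaDE's actual algorithm: it scores image columns by horizontal gradient energy and returns the column where smoothed edge density peaks (function name and window size are assumptions).

```python
import numpy as np

def locate_barcode(gray, win=8):
    """Return the column index where edge density peaks — a crude
    localizer for a vertical-stripe barcode in a grayscale image."""
    # Horizontal gradient magnitude, averaged down each column:
    # barcode stripes produce a run of consistently high values.
    grad = np.abs(np.diff(gray.astype(float), axis=1)).mean(axis=0)
    # Sliding-window mean so the whole striped region, not a single
    # edge, dominates the score.
    score = np.convolve(grad, np.ones(win) / win, mode="same")
    return int(np.argmax(score))

# Synthetic frame: white background with 2-pixel stripes in columns 30-59
img = np.full((60, 100), 255, dtype=np.uint8)
for c in range(30, 60):
    if (c // 2) % 2 == 0:
        img[:, c] = 0
col = locate_barcode(img)  # lands inside the striped region
```

On a phone, the same cue can drive audio feedback that steers the camera toward the barcode before a decoder attempts to read it.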

  • Completed
    [Image: Virtual aerial view of intersection area near a pedestrian's feet, reconstructed by Crosswatch algorithms]

    Crosswatch

    Crosswatch is a smartphone-based system that provides real-time guidance to blind and visually impaired travelers at traffic intersections.

  • Completed

    Go and Nogo Decision Making

    The decision to make or withhold a saccade has been studied extensively using a go-nogo paradigm, but little is known about the decision process underlying pursuit of moving objects. Prevailing models describe pursuit as a feedback system that responds reactively to a moving stimulus. However, situations often arise in which it is disadvantageous to pursue, and humans can decide not to pursue an object just because it moves. This project explores mechanisms underlying the decision to pursue or maintain fixation. Our paradigm, ocular baseball, involves a target that moves from the periphery toward a central zone called the "plate".
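The geometry of the ocular-baseball task can be made concrete with a toy model. The sketch below is purely illustrative (the function and parameters are assumptions, not the study's model): it classifies a trial as "go" (pursue) when the target's straight-line path would enter the central plate zone, and "nogo" (maintain fixation) otherwise.

```python
import math

def will_cross_plate(pos, vel, plate_radius=1.0):
    """Return True ('go': pursue) if a target starting at `pos` with
    constant velocity `vel` would pass through a circular central
    'plate' of radius `plate_radius`, else False ('nogo': fixate)."""
    px, py = pos
    vx, vy = vel
    speed2 = vx * vx + vy * vy
    if speed2 == 0:
        return math.hypot(px, py) <= plate_radius
    # Time of closest approach to the origin along the ray, clamped
    # so a target moving away is judged at its starting point.
    t = max(0.0, -(px * vx + py * vy) / speed2)
    cx, cy = px + t * vx, py + t * vy
    return math.hypot(cx, cy) <= plate_radius

print(will_cross_plate((5, 0), (-1, 0)))  # → True  (heads for the plate)
print(will_cross_plate((5, 3), (-1, 0)))  # → False (misses by 3 units)
```

In the actual paradigm, of course, the interesting question is not the geometry but how and when the oculomotor system commits to pursuing or withholding pursuit.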