Principal Investigator:
Contact Information:
Email: coughlan@ski.org
Mobile Phone: (415) 345-2146
2318 Fillmore St.
San Francisco, CA 94115
The goal of our laboratory is to develop and test assistive technology for blind and visually impaired persons that is enabled by computer vision and other sensor technologies.
Publications
Projects
- Other
- Active
- Completed
Audiom Map of Smith-Kettlewell
An Audiom map of the main Smith-Kettlewell building at 2318 Fillmore St., San Francisco has been created by Brandon Biggs. This is an audio-visual map that allows users to explore a detailed map of the building with or without vision. The map runs in any browser and is available to anyone who visits the building.
MapIO: a Gestural and Conversational Interface for Tactile Maps
For individuals who are blind or have low vision, tactile maps provide essential spatial information but are limited in the amount of data they can convey. Digitally augmented tactile maps enhance these capabilities with audio feedback, thereby combining the tactile feedback provided by the map with an audio description of the touched elements. In this context, we explore an embodied interaction paradigm to augment tactile maps with conversational interaction based on Large Language Models, thus enabling users to obtain answers to arbitrary questions regarding the map. We analyze the type of…
Using VR to Help Train Visually Impaired Users to Aim a Camera
People with visual impairments increasingly rely on camera-enabled smartphone apps for tasks like photography, navigation, and text recognition. Despite the growing use of these applications, precise camera aiming remains a significant challenge. This project explores the impact of virtual reality (VR) exploration in the context of learning to use a camera-based app. So far we have studied this approach in the context of training a visually impaired person to use a walk-light detector app at traffic intersections.
CamIO Hands
This project builds on the CamIO project to provide point-and-tap interactions allowing a user to acquire detailed information about tactile graphics and 3D models. The interface uses an iPhone’s depth and color cameras to track the user’s hands while they interact with a model. When the user points to a feature of interest on the model with their index finger, the system reads aloud basic information about that feature. For additional information, the user lifts their index finger and taps the feature again. This process can be repeated multiple times to access additional levels of…
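The repeated-tap behavior described above can be sketched as a simple state machine that advances through successive description levels each time the same feature is tapped. This is a minimal, hypothetical illustration: the feature names and descriptions are invented, and the real CamIO Hands system derives fingertip position and tap events from the iPhone's depth and color cameras.

```python
class TapDescriber:
    """Cycles through successive description levels each time the same
    feature is tapped; pointing at a new feature resets to the basic level."""

    def __init__(self, descriptions):
        # descriptions: maps each feature name to a list of detail levels,
        # ordered from basic to most detailed.
        self.descriptions = descriptions
        self.current_feature = None
        self.level = 0

    def tap(self, feature):
        if feature != self.current_feature:
            # New feature: start at the basic description.
            self.current_feature = feature
            self.level = 0
        else:
            # Repeated tap on the same feature: advance to the next level,
            # wrapping back to the basic description at the end.
            self.level = (self.level + 1) % len(self.descriptions[feature])
        return self.descriptions[feature][self.level]


# Invented example data for illustration only.
levels = {"fountain": ["Fountain.",
                       "Bronze fountain at the plaza center.",
                       "Installed near accessible seating."]}
d = TapDescriber(levels)
print(d.tap("fountain"))  # first tap: basic info
print(d.tap("fountain"))  # second tap: more detail
```

In the actual system the returned string would be spoken aloud rather than printed.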
Magic Map
The Magic Map is an interactive 3D map installed at the Magical Bridge Playground in Palo Alto, California. It consists of a 1/100 scale 3D bronze representation of the playground, which includes over seventy play structures organized into multiple play zones and paths. When the user's index fingertip touches a specific feature on the map, the name and description of the feature are read aloud in audio. This interactivity allows visitors with visual impairments to navigate the map without requiring them to read braille.
Outreach at Smith-Kettlewell
Smith-Kettlewell is deeply committed to supporting our community. Please see the News tab for information about our upcoming and ongoing events, as well as our past outreach activities.
Audiom
Audiom is a tool that allows blind and visually impaired individuals to view maps completely in audio. It is a web component and can be embedded into any webpage, similar to Google Maps. It allows non-visual use of route, landmark, and survey knowledge, which is the critical information needed for navigation.
Blindness and Low Vision Support Group
Join Dr. Don Fletcher, one of the world’s leading authorities on Low Vision Rehabilitation, to share experiences and learn about the things that help you maintain a full and happy life while living with low vision.
A Computer Vision-Based Indoor Wayfinding Tool
The ability to navigate safely and confidently is a fundamental requirement for independent travel and access to many settings such as work, school, shopping, transit and healthcare. Navigation is particularly challenging for people with visual impairments, who have limited ability to see signs, landmarks or maps posted in the environment.
Completed
Crowd-Sourced Description for Web-Based Video (CSD)
The Descriptive Video Exchange (DVX) Project (funded by the National Eye Institute of the National Institutes of Health, grant # R01 EY020925-01) focuses on crowd-sourced techniques for describing DVD media. CSD will expand DVX to include Internet-based media such as YouTube, iTunes U, and other streamed video found on a wide variety of websites. Many streamed Internet-based video sources provide well-defined, public APIs for accessing all the information DVX requires. Using these APIs will allow the VDRDC to expand DVX to include streamed content, so that seamless, simple, crowd-sourced descriptions can be added to Internet-based video by volunteers or professionals anywhere.
Talking Signs
Created by William Loughborough in 1979, Talking Signs was a system of infrared transmitters and receivers allowing blind and visually impaired travelers to quickly and easily "read signs" at a distance.
ZoomBoard: an Affordable, Portable System to Improve Access to Presentations and Lecture Notes for Low Vision Viewers
The goal of the project is to develop a “ZoomBoard” system that students with low vision can use to better access visual material on a whiteboard or blackboard. The prototype version of the system that we plan to develop in this grant will consist of a dedicated camera system placed by the teacher to capture a view of the board, which wirelessly transmits a video stream that will be displayed on a student’s iPad. The student will use the ZoomBoard app to view this video stream, zoom in on any region of interest using a pinch gesture on the iPad, and apply image enhancements such as contrast…
Regressions in Braille Reading
This project explores regressions (movements to re-read text) in braille reading. The image on the right plots the braille reading finger movements in blue and regressions in black.
Sign Finder
This project seeks to develop a computer vision-based system that allows a visually impaired traveler to find and read informational signs, such as signs labeling office doors, streets, restrooms, and exits. Open source code for the project is available.
Completed
Tutorials and Reference
These are tutorials and reference materials I have written on various topics in probability and geometry over the years.
Tactile Graphics Helper (TGH)
Tactile graphics use raised lines, textures, and elevations to provide individuals with visual impairments access to graphical materials through touch. Tactile graphics are particularly important for students in science, technology, engineering, and mathematics (STEM) fields, where educational content is often conveyed using diagrams and charts. However, providing a student who has a visual impairment with a tactile graphic does not automatically provide the student access to the graphic's educational content. Instead, the student may struggle to decipher subtle differences between textures or…
Workshop Series on Computer Vision and Sensor-Enabled Assistive Technology for Visual Impairment
Recent workshop: Workshop on Environmental Sensing Technologies for Visual Impairment (ESTVI '13 in San Francisco) ESTVI '13 focused on emerging technologies capable of sensing environmental features for applications in access technologies for persons with visual impairment, including low vision and blindness. The development of environmental sensing technologies (ESTs) and the study of their potential to support the activities of daily living for visually impaired persons is progressing at a rapid pace, and engages many disparate research fields, including computer vision, wearable sensors…
Display Reader
The goal of the Display Reader project is to develop a computer vision system that runs on smartphones and tablets to enable blind and visually impaired persons to read appliance displays. Such displays are found on an increasing array of appliances such as microwave ovens, thermostats, and home medical devices. The Display Reader software is available for download.
BLaDE
BLaDE (Barcode Localization and Decoding Engine) is an Android smartphone app designed to enable a blind or visually impaired user to find and read product barcodes. The primary innovation of BLaDE, relative to most commercially available smartphone apps for reading barcodes, is that it provides real-time audio feedback to help visually impaired users find a barcode, which is a prerequisite to being able to read it. BLaDE software download: http://legacy.ski.org/Rehab/Coughlan_lab/BLaDE/ A YouTube video demo of BLaDE in action is also available.
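The kind of real-time guidance feedback described above can be sketched as a function that maps a detected barcode's position in the camera frame to a directional cue. This is a hypothetical illustration only: the thresholds, cue wording, and function name are invented, and BLaDE's actual feedback is audio-based and more sophisticated.

```python
def guidance_cue(barcode_center, frame_size, tolerance=0.1):
    """Return a directional hint ('left', 'right', 'up', 'down') or
    'hold steady' when the barcode is within tolerance of frame center.

    Image coordinates: x grows rightward, y grows downward."""
    cx, cy = barcode_center
    w, h = frame_size
    dx = cx / w - 0.5   # horizontal offset from center, range -0.5..0.5
    dy = cy / h - 0.5   # vertical offset from center, range -0.5..0.5
    if abs(dx) <= tolerance and abs(dy) <= tolerance:
        return "hold steady"
    # Cue the larger of the two offsets; panning the camera toward the
    # barcode brings it closer to the frame center.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(guidance_cue((960, 540), (1920, 1080)))   # barcode centered
print(guidance_cue((1600, 540), (1920, 1080)))  # barcode right of center
```

In a real app this cue would drive tones or speech on each camera frame rather than printed text.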
Centers
- Rehabilitation Engineering Research Center: The Center's research goal is to develop and apply new scientific knowledge and practical, cost-effective devices to better understand and address the real-world problems of blind, visually impaired, and deaf-blind...
People
Current People
- Charity Pitcher-Cooper, Research Associate (Pronouns: she/her): I joined The Smith-Kettlewell Rehabilitation Engineering Research Center (RERC) team in 2017 as an assistant to scientist Dr. […]
Past People
Collaborators
Internal Collaborators
External Collaborators
- Sergio Mascetti, Affiliate Scientist, Department of Computer Science, University of Milan: Sergio Mascetti is Associate Professor at the Department of Computer Science of the Università degli Studi di Milano, where he also received his BSc, […]
News
- Oakland High School’s Students Visit Smith-Kettlewell: On Friday, February 6th, students from Oakland High School's Innovative Design and Engineering Academy visited Smith-Kettlewell. Students learned about different kinds of accessibility research directly from some of our scientists, giving...
- Empowering Data Vision: A Self-Paced Introduction to the Convergence of AI and Data Science: July 25, 2025, San Francisco, CA
- SKERI Scientists Join the IGNITE STEM Fair at Burton High School: On April 18, 2024, SKERI scientists enjoyed the beautiful Phillip and Sala Burton Academic High School campus while offering its students hands-on, experiential, and interactive examples of SKERI scientists' work.
- SKERI Researchers Bring their Rehabilitation and Assistive Technology to CSUN Tech Conference: Smith-Kettlewell’s Rehabilitation Engineering Research Center (RERC) on Blindness and Low Vision, funded by the National Institute on Disability, Independent Living and Rehabilitation Research (NIDILRR), supports eight projects related to visual...
- CamIO Receives Supplement to Enhance Software Tools for Open Science: Dr. James Coughlan has been awarded funds to increase access to his CamIO tool for making objects accessible to blind and visually impaired persons. The funds were part of NIH's Notice of Special Interest for...
- SKERI Receives Rehabilitation Engineering Research Center (RERC) Grant on Blindness and Low Vision: Smith-Kettlewell is proud to announce the newly awarded Rehabilitation Engineering Research Center (RERC) grant on Blindness and Low Vision. This is a five-year grant from the National Institute on Disability,...
- SKERI Researcher Talks Indoor Navigation & Mapping on Blind Bargains: The work of Dr. James Coughlan and Brandon Biggs was again recognized at the annual CSUN conference, where Brandon was interviewed for a podcast on Blind Bargains, a source for...
Events
Event Category
Event Type
Blindness and Low Vision Support Group, Year Recap (Hybrid)
Wednesday, May 20th, 2026 – 3:30 PM to 5:00 PM
Bay Area Outreach and Recreation program (BORP) (Hybrid)
Wednesday, April 15th, 2026 – 3:30 PM to 5:00 PM
Get Involved
If you are interested in vision science or want to learn more about low vision and blindness, there are many opportunities to get involved at The Smith-Kettlewell Eye Research Institute.