Rehabilitation Engineering Research Center

Collage of RERC staff members and RERC projects

The Center's research goal is to develop and apply new scientific knowledge and practical, cost-effective devices to better understand and address the real-world problems of blind, visually impaired, and deaf-blind consumers. The RERC has many ongoing R&D projects and collaborative relationships, both internal and external to Smith-Kettlewell. Primary funding for the RERC comes from the National Institute on Disability and Rehabilitation Research, with additional support from the National Eye Institute and The Smith-Kettlewell Eye Research Institute.

Please check out the latest news about the RERC on Twitter.

Conference Papers
Non-Visual Access to an Interactive 3D Map. (2022). In Joint International Conference on Digital Inclusion, Assistive Technology & Accessibility (ICCHP-AAATE '22).
Modeling echo-target acquisition in blind humans. (2019). In Conference on Cognitive Computational Neuroscience. Berlin, Germany.
Design and Evaluation of an Audio Game-Inspired Auditory Map Interface. (2019). In The 25th International Conference on Auditory Display (ICAD 2019). Northumbria University, Newcastle-upon-Tyne, UK.
JustPoint: Identifying Colors with a Natural User Interface. (2017). In 19th Int’l ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2017). ACM: Baltimore, MD.
Other Publications
Assistive Technologies for the Blind. (2010). In Goldstein, B. (Ed.), Sage Encyclopedia of Perception (pp. 64-67). Sage Publications: Thousand Oaks, CA.
Sensory Rehabilitation. (2010). In Goldstein, B. (Ed.), Sage Encyclopedia of Perception (pp. 881-883). Sage Publications: Thousand Oaks, CA.
Vision. (2008). In Lange, M. (Ed.), Fundamentals in Assistive Technology: An Introduction in Assistive Technology Implementation in the Lives of People with Disabilities (4th ed., pp. 293-311). RESNA Press: Arlington, VA.
Aids for people who are blind or visually impaired. (2007). In Cooper, R.A., Ohnabe, H., & Hobson, D. (Eds.), An Introduction to Rehabilitation Engineering (pp. 287-313). Taylor & Francis: New York.
Sensory Aids. (2006). In Akay, M. (Ed.), Wiley Encyclopedia of Biomedical Engineering. John Wiley & Sons: New Jersey.
Signage and wayfinding. (2000). In Silverstone, Lang, Rosenthal, & Faye (Eds.), The Lighthouse Handbook on Visual Impairment and Vision Rehabilitation (Vols. 1-2). Oxford University Press: New York.

Labs

  • a cartoon rendering of an eyeball and a brain holding hands

    Chandna Lab (SEELAB)

    We conduct rigorous scientific research aimed at improving detection and treatment outcomes for individuals with strabismus, amblyopia, and cerebral visual impairment.

    View Lab
  • L to R: Huiying Shen, Ali Cheraghi, Brandon Biggs, James Coughlan, Charity Pitcher-Cooper, Giovanni Fusco

    Coughlan Lab

    View Lab
  • Photo of Anca Velisar, Kate Agathos, Natela Shanidze & Al Lotze with words Eye-Head Lab underneath

    Shanidze Lab

    Our laboratory studies the mechanisms of eye and head movement and coordination, and how those mechanisms are altered when visual or vestibular inputs are compromised.

    View Lab
  • Sensor distribution of MEG decoding signature for visual alphabetic letters

    Teng Lab

    We aim to better understand how people perceive, interact with, and move through the world, especially when vision is unavailable. To this end, the lab studies perception and sensory processing in multiple sensory modalities, with particular interests in echolocation and braille reading in blind persons. We are also interested in mobility and navigation, including assistive technology using nonvisual cues. These are wide-ranging topics, which we approach using a combination of psychophysical, neurophysiological, engineering, and computational tools.

    View Lab
  • Verghese Lab

    Verghese Lab

    Our laboratory studies the mechanisms of healthy vision and action, as well as the basis of attention and visual adaptation in clinical populations.

    View Lab
News
  • App icon for Tactile Graphics Helper

    SKERI scientists release Tactile Graphics Helper app for free download

    TGH (short for “Tactile Graphics Helper”) is a free iOS app from The Smith-Kettlewell Eye Research Institute that makes tactile graphics more accessible to people with visual impairments.

    View News
  • The letter Y in lavender and D in purple, with a lavender rectangle around the two letters

    Relaunch of YouDescribe

    On May 18, 2017, in honor of the sixth Global Accessibility Awareness Day (GAAD), Smith-Kettlewell relaunched the award-winning YouDescribe with new features, expanded capabilities, and exciting possibilities for the future of audio description. YouDescribe is a free web-based platform for adding audio description to YouTube videos to improve accessibility for the blind. Conceived in 2011 by Smith-Kettlewell scientist Dr. Joshua Miele, YouDescribe is a unique platform that allows sighted describers to add audio description to any YouTube video and share those descriptions with blind viewers.

    View News
  • two photos of people looking at laptop screens

    DescribeAthon 17 'Marathon'

    On January 26, 2017, in San Francisco, the Smith-Kettlewell Eye Research Institute’s Rehabilitation Engineering Research Center (RERC) hosted DescribeAthon 17, an event that used the Institute’s YouDescribe technology, developed by scientist Dr. Josh Miele, to raise awareness about video accessibility on the web for blind viewers. YouDescribe is an enhanced video program for YouTube in which recorded voices describe what cannot be seen.

    View News
  • A hand icon hovers over a 2-D globe. Two grey arrows circle the globe counterclockwise.

    Smith-Kettlewell Announces New Wayfinding App for Blind and Visually-...

    Remote Infrared Signage (also known as “Talking Signs”) was invented at The Smith-Kettlewell Eye Research Institute in San Francisco. This powerful system used infrared beams to provide blind travelers with information about the location of transmitters marking bathrooms, bus stops, businesses, buildings, and beyond. Users could point hand-held receivers to accurately locate and identify the “signs” in that direction.

    View News
  • Photo of CamIO ("Camera Input-Output") system

    Scientists Receive NEI Grant Aiding Blind Interaction with Physical Objects

    James Coughlan, PhD, Senior Scientist at the Smith-Kettlewell Eye Research Institute, San Francisco, California, was recently awarded a four-year grant from NIH-NEI (R01EY025332) entitled “Enabling Audio-Haptic Interaction with Physical Objects for the Visually Impaired.”

    View News
  • Smartphone app, BLaDE, detecting a bar code

    Dr. Coughlan's Bar Code Reader, BLaDE, Featured in Scientific American

    Work by Drs. James Coughlan and Ender Tekin on bar code readers as an accessibility tool is discussed in Scientific American. The work focuses specifically on tools for people who are blind or visually impaired.

    View News
