Rehabilitation Engineering Research Center

Collage of RERC staff members and RERC projects

The Center's research goal is to develop and apply new scientific knowledge and practical, cost-effective devices to better understand and address the real-world problems of blind, visually impaired, and deaf-blind consumers. The RERC has many ongoing R&D projects and collaborative relationships, both internal and external to Smith-Kettlewell. Primary funding for the RERC comes from the National Institute on Disability and Rehabilitation Research, with other important sources of support including the National Eye Institute and The Smith-Kettlewell Eye Research Institute.

Please check out the latest news about the RERC on Twitter.


Journal Articles
Ahmetovic, D., Manduchi, R., Coughlan, J., & Mascetti, S. (2017). Mind your crossings: Mining GIS imagery for crosswalk localization. ACM Transactions on Accessible Computing (TACCESS), 9(4). http://doi.org/10.1145/3046790
Conference Papers
Biggs, B., Toth, C., Stockman, T., Coughlan, J., & Walker, B. N. (2022). Evaluation of a Non-Visual Auditory Choropleth and Travel Map Viewer. In International Conference on Auditory Display (ICAD) 2022. Virtual.
Coughlan, J., Biggs, B., & Shen, H. (2022). Non-Visual Access to an Interactive 3D Map. In Joint International Conference on Digital Inclusion, Assistive Technology & Accessibility (ICCHP-AAATE '22).
Fusco, G., & Coughlan, J. (2020). Indoor Localization for Visually Impaired Travelers Using Computer Vision on a Smartphone. In 17th International Web for All Conference: Automation for Accessibility.
Teng, S., & Fusco, G. (2019). Modeling echo-target acquisition in blind humans. In Conference on Cognitive Computational Neuroscience. Berlin, Germany.
Biggs, B., Coughlan, J., & Coppin, P. (2019). Design and Evaluation of an Audio Game-Inspired Auditory Map Interface. In The 25th International Conference on Auditory Display (ICAD 2019). Northumbria University, Newcastle-upon-Tyne, UK.
Mascetti, S., Gerino, A., Bernareggi, C., D’Acquisto, S., Ducci, M., & Coughlan, J. (2017). JustPoint: Identifying Colors with a Natural User Interface. In 19th Int’l ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2017). ACM: Baltimore, MD.
Coughlan, J., & Miele, J. (2017). Evaluating Author and User Experience for an Audio-Haptic System for Annotation of Physical Models. In 19th Int’l ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2017). ACM: Baltimore, MD.
Presentations/Posters
García-Lázaro, H. G., Alwis, Y., & Teng, S. (2021). Neural representations of temporal and spectral regularities of reverberant environments. Presented November 2021, 50th Society for Neuroscience Annual Meeting: Chicago/Virtual.
García-Lázaro, H. G., Wong-Kee-You, A. M. B., Alwis, Y., & Teng, S. (2021). Making remote studies accessible during COVID: A SKERI case study on web-based psychophysical research with blind participants. Presented July 2021, Rehabilitation Engineering and Assistive Technology Society of North America (RESNA) Annual Meeting: Virtual.
Other Publications
Shen, H., & Coughlan, J. (2012). Towards a real-time system for finding and reading signs for visually impaired users. In Computers Helping People with Special Needs (pp. 41–47). Springer Berlin Heidelberg.
Brabyn, J. (2010). Assistive Technologies for the Blind. In Goldstein, B. (Ed.), Sage Encyclopedia of Perception (pp. 64–67). Sage Publications: Thousand Oaks, CA.
Brabyn, J. (2010). Sensory Rehabilitation. In Goldstein, B. (Ed.), Sage Encyclopedia of Perception (pp. 881–883). Sage Publications: Thousand Oaks, CA.
Brabyn, J. (2008). Vision. In Lange, M. (Ed.), Fundamentals in Assistive Technology: An Introduction in Assistive Technology Implementation in the Lives of People with Disabilities (4th ed., pp. 293–311). RESNA Press: Arlington, VA.
Brabyn, J., Seelman, K. D., & Panchang, S. (2007). Aids for people who are blind or visually impaired. In Cooper, R. A., Ohnabe, H., & Hobson, D. (Eds.), An Introduction to Rehabilitation Engineering (pp. 287–313). Taylor & Francis: New York.
Seelman, K. D., Brabyn, J., Ortmann, A., & Palmer, C. V. (2006). Sensory Aids. In Akay, M. (Ed.), Wiley Encyclopedia of Biomedical Engineering. John Wiley & Sons: New Jersey.
Arditi, A., & Brabyn, J. (2000). Signage and wayfinding. In Silverstone, Lang, Rosenthal & Faye (Eds.), The Lighthouse Handbook on Visual Impairment and Vision Rehabilitation (Vol. 1–2). Oxford University Press: New York.
Heller, M., & Brabyn, J. Visual Impairment: Ergonomic Considerations in Blind and Low Vision Rehabilitation. In Kumar, S. (Ed.), Perspectives in Rehabilitation Ergonomics (Vol. 1). Taylor & Francis: London.

Upcoming Events

There are no Upcoming Events at this time.

Past Events

  • a cartoon rendering of an eyeball and a brain holding hands

    Chandna Lab (SEELAB)

    We conduct rigorous scientific research with the goal of improving detection and treatment outcomes for individuals with strabismus, amblyopia, and cerebral visual impairment.

    View Lab
  • L to R: Huiying Shen, Ali Cheraghi, Brandon Biggs, James Coughlan, Charity Pitcher-Cooper, Giovanni Fusco

    Coughlan Lab

    View Lab
  • Photo of Anca Velisar, Kate Agathos, Natela Shanidze & Al Lotze with words Eye-Head Lab underneath

    Shanidze Lab

    Our laboratory is interested in the mechanisms of eye and head movement and coordination and how those mechanisms are altered when visual or vestibular inputs are compromised.

    View Lab
  • Sensor distribution of MEG decoding signature for visual alphabetic letters

    Teng Lab

    We aim to better understand how people perceive, interact with, and move through the world, especially when vision is unavailable. To this end, the lab studies perception and sensory processing in multiple sensory modalities, with particular interests in echolocation and braille reading in blind persons. We are also interested in mobility and navigation, including assistive technology using nonvisual cues. These are wide-ranging topics, which we approach using a combination of psychophysical, neurophysiological, engineering, and computational tools.

    View Lab
  • Verghese Lab

    Verghese Lab

    Our laboratory studies the mechanisms of healthy vision and action, as well as the basis of attention and visual adaptation in clinical populations.

    View Lab
Current
  • App icon for Tactile Graphics Helper

    SKERI scientists release Tactile Graphics Helper app for free download

    TGH (short for “Tactile Graphics Helper”) is a free iOS app from The Smith-Kettlewell Eye Research Institute that makes tactile graphics more accessible to people with visual impairments.

    View News
  • The letter Y in lavender and D in purple, with a lavender rectangle around the two letters

    Relaunch of YouDescribe

    On May 18, 2017, in honor of the sixth Global Accessibility Awareness Day (GAAD), Smith-Kettlewell relaunched the award-winning YouDescribe with new features, expanded capabilities, and exciting possibilities for the future of audio description. YouDescribe is a free web-based platform for adding audio description to YouTube videos to improve accessibility for the blind. Conceived in 2011 by Smith-Kettlewell scientist Dr. Joshua Miele, YouDescribe is a unique platform that allows sighted describers to add audio description to any YouTube video and share those descriptions with blind viewers.

    View News
  • two photos of people looking at laptop screens

    DescribeAthon 17 'Marathon'

    On January 26, 2017, in San Francisco, the Smith-Kettlewell Eye Research Institute’s Rehabilitation Engineering Research Center (RERC) hosted DescribeAthon 17 -- an event that used the Institute’s YouDescribe technology, developed by scientist Dr. Josh Miele, to raise awareness about video accessibility on the web for blind viewers. YouDescribe is an enhanced video program for YouTube in which recorded voices describe what cannot be seen.

    View News
  • A hand icon hovers over a 2-D globe. Two grey arrows circle the globe counterclockwise.

    Smith-Kettlewell Announces New Wayfinding App for Blind and Visually-...

    Remote Infrared Signage (also known as “Talking Signs”) was invented at The Smith-Kettlewell Eye Research Institute in San Francisco. This powerful system used infrared beams to provide blind travelers with information about the location of transmitters marking bathrooms, bus stops, businesses, buildings, and beyond. Users could point hand-held receivers to accurately locate and identify the “signs” in that direction.

    View News
  • Photo of CamIO ("Camera Input-Output") system

    Scientists Receive NEI Grant Aiding Blind Interaction with Physical Objects

    James Coughlan, PhD, Senior Scientist at the Smith-Kettlewell Eye Research Institute, San Francisco, California, was recently awarded a four-year grant from NIH-NEI (R01EY025332) entitled “Enabling Audio-Haptic Interaction with Physical Objects for the Visually Impaired.”

    View News
  • Smartphone app, BLaDE, detecting a bar code

    Dr. Coughlan's Bar Code Reader, BLaDE, Featured in Scientific American

    Work by Drs. James Coughlan and Ender Tekin on bar code readers as an accessibility tool is discussed in Scientific American. The work is specifically focused on helping people who are blind or visually impaired.

    View News
