Rehabilitation Engineering Research Center

Collage of RERC staff members and RERC projects

The Center's research goal is to develop and apply new scientific knowledge and practical, cost-effective devices to better understand and address the real-world problems of blind, visually impaired, and deaf-blind consumers. The RERC has many ongoing R&D projects and collaborative relationships, both internal and external to Smith-Kettlewell. Primary funding for the RERC comes from the National Institute on Disability and Rehabilitation Research, with other important sources of support including the National Eye Institute and the Smith-Kettlewell Eye Research Institute.

Please check out the latest news about the RERC on Twitter.


Journal Articles
Ahmetovic, D., Manduchi, R., Coughlan, J., & Mascetti, S. (2017). Mind your crossings: Mining GIS imagery for crosswalk localization. ACM Transactions on Accessible Computing (TACCESS), 9(4). http://doi.org/10.1145/3046790
Conference Papers
Biggs, B., Toth, C., Stockman, T., Coughlan, J., & Walker, B. N. (2022). Evaluation of a Non-Visual Auditory Choropleth and Travel Map Viewer. In International Conference on Auditory Display (ICAD) 2022. Virtual.
Coughlan, J., Biggs, B., & Shen, H. (2022). Non-Visual Access to an Interactive 3D Map. In Joint International Conference on Digital Inclusion, Assistive Technology & Accessibility (ICCHP-AAATE '22).
Fusco, G., & Coughlan, J. (2020). Indoor Localization for Visually Impaired Travelers Using Computer Vision on a Smartphone. In 17th International Web for All Conference: Automation for Accessibility.
Teng, S., & Fusco, G. (2019). Modeling echo-target acquisition in blind humans. In Conference on Cognitive Computational Neuroscience. Berlin, Germany.
Biggs, B., Coughlan, J., & Coppin, P. (2019). Design and Evaluation of an Audio Game-Inspired Auditory Map Interface. In The 25th International Conference on Auditory Display (ICAD 2019). Northumbria University, Newcastle-upon-Tyne, UK.
Mascetti, S., Gerino, A., Bernareggi, C., D'Acquisto, S., Ducci, M., & Coughlan, J. (2017). JustPoint: Identifying Colors with a Natural User Interface. In 19th Int'l ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2017). ACM: Baltimore, MD.
Coughlan, J., & Miele, J. (2017). Evaluating Author and User Experience for an Audio-Haptic System for Annotation of Physical Models. In 19th Int'l ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2017). ACM: Baltimore, MD.
Presentations/Posters
García-Lázaro, H. G., Alwis, Y., & Teng, S. (2021, November). Neural representations of temporal and spectral regularities of reverberant environments. 50th Society for Neuroscience Annual Meeting: Chicago/Virtual.
García-Lázaro, H. G., Wong-Kee-You, A. M. B., Alwis, Y., & Teng, S. (2021, July). Making remote studies accessible during COVID: A SKERI case study on web-based psychophysical research with blind participants. Rehabilitation Engineering and Assistive Technology Society of North America (RESNA) Annual Meeting: Virtual.
Other Publications
Shen, H., & Coughlan, J. (2012). Towards a real-time system for finding and reading signs for visually impaired users. In Computers Helping People with Special Needs (pp. 41-47). Springer Berlin Heidelberg.
Brabyn, J. (2010). Assistive Technologies for the Blind. In Goldstein, B. (Ed.), Sage Encyclopedia of Perception (pp. 64-67). Sage Publications: Thousand Oaks, CA.
Brabyn, J. (2010). Sensory Rehabilitation. In Goldstein, B. (Ed.), Sage Encyclopedia of Perception (pp. 881-883). Sage Publications: Thousand Oaks, CA.
Brabyn, J. (2008). Vision. In Lange, M. (Ed.), Fundamentals in Assistive Technology: An Introduction in Assistive Technology Implementation in the Lives of People with Disabilities (4th ed., pp. 293-311). RESNA Press: Arlington, VA.
Brabyn, J., Seelman, K. D., & Panchang, S. (2007). Aids for people who are blind or visually impaired. In Cooper, R. A., Ohnabe, H., & Hobson, D. (Eds.), An Introduction to Rehabilitation Engineering (pp. 287-313). Taylor & Francis: New York.
Seelman, K. D., Brabyn, J., Ortmann, A., & Palmer, C. V. (2006). Sensory Aids. In Akay, M. (Ed.), Wiley Encyclopedia of Biomedical Engineering. John Wiley & Sons: New Jersey.
Arditi, A., & Brabyn, J. (2000). Signage and wayfinding. In Silverstone, Lang, Rosenthal & Faye (Eds.), The Lighthouse Handbook on Visual Impairment and Vision Rehabilitation (Vol. 1-2). Oxford University Press: New York.
Heller, M., & Brabyn, J. Visual Impairment: Ergonomic Considerations in Blind and Low Vision Rehabilitation. In Kumar, S. (Ed.), Perspectives in Rehabilitation Ergonomics (Vol. 1). Taylor & Francis: London.

Upcoming Events

There are no upcoming events at this time.

Past Events

  • a cartoon rendering of an eyeball and a brain holding hands

    Chandna Lab (SEELAB)

We conduct rigorous scientific research with the goal of improving detection and treatment outcomes for individuals with strabismus, amblyopia, and cerebral visual impairment.

    View Lab
  • L to R: Huiying Shen, Ali Cheraghi, Brandon Biggs, James Coughlan, Charity Pitcher-Cooper, Giovanni Fusco

    Coughlan Lab

    The Coughlan Lab

    View Lab
  • Photo of Anca Velisar, Kate Agathos, Natela Shanidze & Al Lotze with words Eye-Head Lab underneath

    Shanidze Lab

    Our laboratory studies the mechanisms of eye and head movements and their coordination, and how those mechanisms are altered when visual or vestibular inputs are compromised.

    View Lab
  • Sensor distribution of MEG decoding signature for visual alphabetic letters

    Teng Lab

    We aim to better understand how people perceive, interact with, and move through the world, especially when vision is unavailable. To this end, the lab studies perception and sensory processing in multiple sensory modalities, with particular interests in echolocation and braille reading in blind persons. We are also interested in mobility and navigation, including assistive technology using nonvisual cues. These are wide-ranging topics, which we approach using a combination of psychophysical, neurophysiological, engineering, and computational tools.

    View Lab
  • Verghese Lab

    Verghese Lab

    Our laboratory studies the mechanisms of healthy vision and action, as well as the basis of attention and visual adaptation in clinical populations.

    View Lab