[Portrait photo of Santani Teng]
Teng Lab

Santani Teng

Associate Scientist
Degrees: Ph.D., Psychology, University of California, Berkeley
M.A., Psychology, University of California, Davis
B.A., Psychology, University of California, Davis

Hello! I'm an Associate Scientist at Smith-Kettlewell, where I investigate auditory spatial perception, haptics, echolocation, and assisted mobility in sighted and blind persons. Previously, I completed my Ph.D. at UC Berkeley and postdoctoral work at MIT, where I remain affiliated.


Publications
Journal Articles

Ratan Murty, N. A., Teng, S., Beeler, D., Mynick, A., Oliva, A., & Kanwisher, N. (2020). Visual experience is not necessary for the development of face selectivity in the lateral fusiform gyrus. Proceedings of the National Academy of Sciences. http://doi.org/10.1073/pnas.2004607117

Lowe, M. X., Mohsenzadeh, Y., Lahner, B., Charest, I., Oliva, A., & Teng, S. (2020). Spatiotemporal dynamics of sound representations reveal a hierarchical progression of category selectivity. bioRxiv. http://doi.org/10.1101/2020.06.12.149120

Teng, S., Sommer, V., Pantazis, D., & Oliva, A. (2017). Hearing scenes: A neuromagnetic signature of auditory source and reverberant space separation. eNeuro, 4.

Cichy, R., & Teng, S. (2017). Resolving the neural dynamics of visual and auditory scene processing in the human brain: A methodological approach. Philosophical Transactions of the Royal Society B: Biological Sciences, 372, 20160108.

Sohl-Dickstein, J., Teng, S., Gaub, B. M., Rodgers, C. C., Li, C., DeWeese, M. R., & Harper, N. S. (2015). A device for human ultrasonic echolocation. IEEE Transactions on Biomedical Engineering, 62, 1526–1534.

Teng, S., Puri, A., & Whitney, D. (2012). Ultrafine spatial acuity of blind expert human echolocators. Experimental Brain Research, 216, 483–488.

Teng, S., & Whitney, D. (2011). The acuity of echolocation: Spatial resolution in sighted persons compared to the performance of an expert who is blind. Journal of Visual Impairment & Blindness, 105, 20–32.

Zanolie, K., Teng, S., Donohue, S. E., van Duijvenvoorde, A. C. K., Band, G. P. H., Rombouts, S. A. R. B., & Crone, E. A. (2008). Switching between colors and shapes on the basis of positive and negative feedback: An fMRI and EEG study on feedback-based learning. Cortex, 44, 537–547.

Conference Papers

Teng, S., & Fusco, G. (2019). Modeling echo-target acquisition in blind humans. In Conference on Cognitive Computational Neuroscience.

Lin, N.-C., Liu, S.-H., Huang, Y.-W., Su, Y.-S., Lu, C.-L., Hsu, W.-T., et al. (2019). Toward an open platform of blind navigation via interactions with autonomous robots. In ACM SIGCHI Conference.

Teng, S., Sommer, V., Cichy, R., Pantazis, D., & Oliva, A. (2018). Auditory letter-name processing elicits crossmodal representations in blind listeners. In Conference on Cognitive Computational Neuroscience. http://doi.org/10.32470/CCN.2018.1236-0

Lowe, M., Teng, S., Mohsenzadeh, Y., Charest, I., Pantazis, D., & Oliva, A. (2018). Temporal dynamics underlying sound discrimination in the human brain. In Conference on Cognitive Computational Neuroscience. http://doi.org/10.32470/CCN.2018.1090-0

Wang, H.-C., Katzschmann, R. K., Teng, S., Araki, B., Giarré, L., & Rus, D. (2017). Enabling independent navigation for visually impaired people through a wearable vision-based feedback system. In IEEE International Conference on Robotics and Automation (pp. 6533–6540).

Teng, S., Cichy, R., Pantazis, D., & Oliva, A. (2017). The evolution of tactile letter representations in blind braille readers. In Conference on Cognitive Computational Neuroscience. Retrieved from https://ccneuro.org/2017/abstracts/abstract_3000316.pdf

Presentations/Posters

Wong-Kee-You, A., Alwis, Y., & Teng, S. (2021). Hearing real spaces: The development of sensitivity to reverberation statistics in children. Cognitive Neuroscience Society Annual Meeting, March 2021: Virtual.

Teng, S., Lei, D., & Mackeben, M. (2020). The haptic kinematics of two-handed braille reading in blind adults. Eurohaptics, September 2020: Semi-virtual; Leiden, The Netherlands. Retrieved from https://www.ski.org/project/haptic-kinematics-two-handed-braille-reading-blind-adults

Teng, S. (2020). Fusing non-simultaneous multimodal data to understand neural dynamics across timescales. Organization for Human Brain Mapping Annual Meeting: Virtual (originally Montreal, Canada).

Lowe, M. X., Mohsenzadeh, Y., Lahner, B., Charest, I., Oliva, A., & Teng, S. (2019). Spatiotemporal dynamics of sound representations in the human brain. Society for Neuroscience.

Ezeana, M., Dacus, C., Whitney, D., Teng, S., & Puri, A. (2019). The spatial resolution of object discrimination across echolocation and touch. Object Perception, Visual Attention, and Visual Memory (OPAM).

Ratan Murty, N. A., Teng, S., Beeler, D., Mynick, A., Oliva, A., & Kanwisher, N. (2019). Visual experience is not necessary for the development of face selectivity in the lateral fusiform gyrus.

Lowe, M., Mohsenzadeh, Y., Lahner, B., Teng, S., Charest, I., & Oliva, A. (2019). Spatiotemporal neural representations in high-level visual cortex evoked from sounds. Vision Sciences Society.

Ratan Murty, N. A., Teng, S., Beeler, D., Mynick, A., Oliva, A., & Kanwisher, N. (2019). Strong face selectivity in the fusiform can develop in the absence of visual experience. Vision Sciences Society.

Teng, S. (2018). Echolocation: Research to inform future practice. MacFarland Seminar, AER meeting. Reno, NV. Retrieved from https://aerbvi.org/professional-development/conferences/aeric2018/info/

Teng, S., Cichy, R., Pantazis, D., & Oliva, A. (2018). Transforming tactile braille signals over space and time. Vision Sciences Society.

Teng, S., Cichy, R., Pantazis, D., & Oliva, A. (2017). The evolution of tactile letter representations in blind braille readers. Society for Neuroscience.

Teng, S., Cichy, R., Pantazis, D., Sommer, V., & Oliva, A. (2016). Neurodynamics of visual and auditory scene size representations. Vision Sciences Society.

Teng, S., Puri, A., & Whitney, D. (2011). Crossmodal transfer of object information in human echolocation. International Multisensory Research Conference. Fukuoka, Japan.

Other Publications

Oliva, A., & Teng, S. (2016). Cognitive Society. In Handbook of Science and Technology Convergence (pp. 1–9). Springer International Publishing.

Rehabilitation Engineering Research Center

The Center's research goal is to develop and apply new scientific knowledge and practical, cost-effective devices to better understand and address the real-world problems of blind, visually impaired, and deaf-blind consumers.


Teng Lab

Welcome to the Cognition, Action, and Neural Dynamics Laboratory at SKERI

We study auditory, visual, and haptic perception, echolocation, and assisted mobility in sighted and blind persons. Using a combination of psychophysical, neurophysiological, engineering, and computational tools, we aim to...


Brabyn Lab

My lab is the home for the Rehabilitation Engineering Research Center, funded by NIDILRR with additional support from SKERI and other agencies. Together we address problems of blindness and visual impairment, particularly problems faced by our target population that may be amenable to...


Coughlan Lab

The goal of our laboratory is to develop and test assistive technology for blind and visually impaired persons that is enabled by computer vision and other sensor technologies.


Optimizing Echoacoustics: An online perceptual study (Under construction)

An online study of echoacoustic perception

The Kinematics of Braille Reading

[Under construction]

When blind persons read braille, a system of raised dots for tactile reading and writing, how is the information processed? How do a few indentations on the fingerpads translate to linguistic information, and how does the text, in turn, influence the motions of the...

Active

Hearing the World: A Remote Study of Auditory Perception

We aim to investigate the nature of auditory perception and how the brain learns rules for interpreting sounds.

Active

Haptic Kinematics of Two-Handed Braille Reading in Blind Adults

This page (currently under construction) accompanies a work-in-progress poster at the 2020 Eurohaptics meeting.


Zoom poster Q&A times:...

Active

Human Echolocation

What is echolocation? Sometimes, the surrounding world is too dark and silent for typical vision and hearing. This is true in deep caves, for example, or in murky water where little light penetrates. Animals living in these environments often have the ability to echolocate: They make sounds and...

Active

Reverberant Auditory Scene Analysis

The world is rich in sounds and their echoes from reflecting surfaces, making acoustic reverberation a ubiquitous part of everyday life. We usually think of reverberation as a nuisance to overcome (it makes understanding speech or locating sound sources harder), but it also carries useful...

Contact Information
2318 Fillmore St
San Francisco, CA 94115
Email: santani@ski.org
Links
Older site with more detail

© 2019 The Smith-Kettlewell Eye Research Institute | Terms of Use | Privacy Policy

2318 Fillmore Street, San Francisco, CA 94115-1813

415-345-2000 | TTY 415-345-2290 | Fax 415-345-8455
