
Teng Lab
Welcome to the Cognition, Action, and Neural Dynamics Laboratory
We aim to better understand how people perceive, interact with, and move through the world, especially when vision is unavailable. To this end, the lab studies perception and sensory processing in multiple sensory modalities, with particular interests in echolocation and braille reading in blind persons. We are also interested in mobility and navigation, including assistive technology using nonvisual cues. These are wide-ranging topics, which we approach using a combination of psychophysical, neurophysiological, engineering, and computational tools.
Spatial Perception and Navigation
It's critically important to have a sense of the space around us — nearby objects, obstacles, people, walls, pathways, hazards, and so on. How do we convert sensory information into that perceptual space, especially when augmenting or replacing vision? How is this information applied when actually navigating to a destination? One cue to the surrounding environment is acoustic reverberation. Sounds reflect off thousands of nearby and distant surfaces, blending together to create a signature of the size, shape, and contents of a scene. Previous research shows that our auditory systems separate reverberant backgrounds from direct sounds, guided by internal statistical models of real-world acoustics. We continue this work by asking, for example, when reverberant scene analysis develops and how it is affected by blindness.
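As a rough illustration of how a space imprints an acoustic signature on sound, reverberation is often modeled as the convolution of a dry source with a room impulse response. The sketch below is a minimal toy example, not the lab's code; the sample rate, decay constants, and noise-based impulse responses are illustrative assumptions. The same brief sound acquires a short tail in a simulated small room and a long one in a simulated hall.

```python
# Toy model: reverberation as convolution with a synthetic room impulse response.
import numpy as np

fs = 44100                                                 # sample rate in Hz (assumed)
t = np.arange(0, 0.01, 1 / fs)
dry = np.sin(2 * np.pi * 2000 * t) * np.hanning(t.size)    # brief 2 kHz burst

def synthetic_ir(rt60, fs=44100, seed=0):
    """Exponentially decaying noise with a nominal reverberation time rt60 (seconds)."""
    n = int(rt60 * fs)
    decay = np.exp(-6.9 * np.arange(n) / n)                # ~60 dB amplitude drop over rt60
    return np.random.default_rng(seed).standard_normal(n) * decay

small_room = np.convolve(dry, synthetic_ir(0.3))           # short reverberant tail
large_hall = np.convolve(dry, synthetic_ir(2.0))           # long reverberant tail
# The identical dry sound now carries different "signatures" of the two simulated spaces.
```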
Echolocation, Braille, and Active Sensing
Sometimes, the surrounding world is too dark and silent for typical vision and hearing. This is true in deep caves, for example, or in murky water where little light penetrates. Animals living in these environments often have the ability to echolocate: they make sounds and listen for their reflections. Like turning on a flashlight in a dark room, echolocation is a way to actively illuminate objects and spaces using sound. Using tongue clicks, cane taps, or footfalls, some blind humans have demonstrated the ability to use echolocation (also called sonar, or biosonar in living organisms) to sense, interact with, and move through the world. What information does echolocation afford its practitioners? What factors contribute to learning it? How broadly is it accessible to blind and sighted people alike?
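To make the most basic echolocation cue concrete, consider time of flight: an emitted click returns from an object at distance d after roughly 2d/c seconds, where c is the speed of sound. The sketch below is a toy simulation, not the lab's paradigm; the sample rate, click, and echo attenuation are illustrative assumptions. It recovers the echo delay by cross-correlation and converts it back to distance.

```python
# Toy model: estimating object distance from the delay of a simulated echo.
import numpy as np

fs = 96000                         # sample rate in Hz (assumed)
c = 343.0                          # speed of sound in air, m/s

rng = np.random.default_rng(1)
click = rng.standard_normal(int(0.001 * fs))           # 1 ms emitted noise burst

true_distance = 1.7                                    # meters (assumed)
delay = int(2 * true_distance / c * fs)                # round-trip delay in samples
recording = np.zeros(delay + click.size)
recording[:click.size] += click                        # direct sound
recording[delay:delay + click.size] += 0.2 * click     # attenuated echo

# Cross-correlate the recording with the emitted click and ignore lags that
# overlap the direct sound; the remaining peak marks the echo delay.
xcorr = np.correlate(recording, click, mode="full")
lags = np.arange(-click.size + 1, recording.size)
valid = lags > click.size
est = lags[valid][np.argmax(xcorr[valid])]
print(f"estimated distance ~ {est * c / (2 * fs):.2f} m")  # about 1.7 m
```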
Neural Dynamics and Crossmodal Plasticity
The visual cortex does not fall silent in blindness. About half the human neocortex is thought to be devoted to visual processing, yet in blind people, deprived of the visual input that normally drives these regions, the same areas become responsive to auditory, tactile, and other nonvisual tasks, a phenomenon called crossmodal plasticity. What kinds of computations take place under these circumstances? How do crossmodally recruited brain regions represent information? To what extent are the connections mediating this plasticity also present in sighted people? These and many other basic questions remain debated despite decades of crossmodal plasticity research.
Want to join the lab or participate in a study?
Check here for details on any current funded or volunteer opportunities.
For inquiries, email the lab PI at santani@ski.org
Rehabilitation Engineering Research Center
The Center's research goal is to develop and apply new scientific knowledge and practical, cost-effective devices to better understand and address the real-world problems of blind, visually impaired, and deaf-blind consumers.
Teng Lab Experiment Participation
Available experiments in the Cognition, Action, and Neural Dynamics Lab:
- Haptic Kinematics of Two-Handed Braille Reading in Blind Adults (Active). This page (currently under construction) accompanies a work-in-progress poster at the 2020 Eurohaptics meeting.
- Hearing the World: A Remote Study of Auditory Perception (Active). We aim to investigate the nature of auditory perception and how the brain learns rules for interpreting sounds.
- Human Echolocation (Active). What is echolocation? Sometimes, the surrounding world is too dark and silent for typical vision and hearing.
- Reverberant Auditory Scene Analysis (Active). The world is rich in sounds and their echoes from reflecting surfaces, making acoustic reverberation a ubiquitous part of everyday life.
- Optimizing Echoacoustics: An online perceptual study (Inactive; under construction). An online study of echoacoustic perception.
- Giovanni Fusco - Engineering Manager and Lead Machine Learning Eng., Pixofarm
- Haydée G. García-Lázaro - Postdoctoral Fellow
- Naomi Heinen - Research Intern
- Santani Teng - Associate Scientist
- Yash Kshirsagar - Laboratory Assistant
- Aaron Buckley - Research Intern
- Audrey Wong-Kee-You - Postdoctoral Fellow
- Brendyn Chao - Research Intern
- Catalina Mendoza - Laboratory Assistant
- Yelena Isnora - Research Intern
- Zachary Adnane - Research Intern
SKERI Receives Rehabilitation Engineering Research Center (RERC) grant on...
Smith-Kettlewell is proud to announce the newly awarded Rehabilitation Engineering Research Center (RERC) grant on Blindness and Low Vision. This five-year grant from the National Institute on Disability, Independent Living, and Rehabilitation Research establishes Smith-Kettlewell as a center promoting the independence and well-being of people with visual impairments through research and development that improves the understanding of, and provides solutions for, the challenges facing the blind and low-vision community.
Object echolocation project receives Best Poster award at OPAM 29
A Smith-Kettlewell study measuring object information available via echolocation was recognized with a Best Poster award at the 2021 Object Perception, Attention, and Memory conference (OPAM 29).
Dr. Teng awarded grant to model human echolocation
The National Eye Institute has awarded an R21 grant to Santani Teng, an Associate Scientist at Smith-Kettlewell, to model perceptual mechanisms of echolocation.
Haydée García-Lázaro receives FoVea Award 2021 from Vision Sciences Society
The FoVea Travel and Networking Award has the mission to advance the visibility, impact, and success of women in vision science. It is open to female members of the Vision Sciences Society (VSS) in pre-doctoral, post-doctoral, pre-tenure faculty, or research scientist positions. Haydée García-Lázaro, a postdoctoral fellow at SKERI working with Dr. Santani Teng, received the FoVea Travel and Networking Award 2021 from Females of Vision et al. (FoVea) and the Vision Sciences Society.
Haydée García-Lázaro receives the Elsevier/Vision Research Virtual Travel...
Haydée García-Lázaro, a postdoctoral fellow at SKERI working in the Teng Lab, received the Elsevier/Vision Research Virtual Travel Award 2021 from the Vision Sciences Society.
SKERI Scientists Spearhead a Face Shield Making Campaign
Founded by Dr. Santani Teng, the Bay Area Face Shield Supply (BAFSS) has sent hundreds of face shields to healthcare workers in the community. Dr. Teng has been working with Drs. Audrey Wong-Kee-You and Cecile Vullings, along with other volunteers across the Bay Area, to provide life-saving personal protective equipment to healthcare workers as they care for COVID-19 patients.
SKERI researcher awarded grant to study echolocation
Santani Teng, an Associate Scientist at Smith-Kettlewell, has been awarded a three-year grant from The E. Matilda Ziegler Foundation for the Blind and Visually Impaired to study the neural processes of echolocation. Like bats and dolphins, some blind people use active echolocation: they make sounds and use the reflections to perceive and interact with their surroundings. These sounds, often in...