Teng Lab

Participate in our studies! Visit Teng Lab Experiment Participation for details.

Welcome to the Cognition, Action, and Neural Dynamics Laboratory

We aim to better understand how people perceive, interact with, and move through the world, especially when vision is unavailable. To this end, the lab studies perception and sensory processing in multiple sensory modalities, with particular interests in echolocation and braille reading in blind persons. We are also interested in mobility and navigation, including assistive technology using nonvisual cues. These are wide-ranging topics, which we approach using a combination of psychophysical, neurophysiological, engineering, and computational tools.

Spatial Perception and Navigation

It's critically important to have a sense of the space around us — nearby objects, obstacles, people, walls, pathways, hazards, and so on. How do we convert sensory information into that perceptual space, especially when augmenting or replacing vision? How is this information applied when actually navigating to a destination? One cue to the surrounding environment is acoustic reverberation. Sounds reflect off thousands of nearby and distant surfaces, blending together to create a signature of the size, shape, and contents of a scene. Previous research shows that our auditory systems separate reverberant backgrounds from direct sounds, guided by internal statistical models of real-world acoustics. We continue this work by asking, for example, when reverberant scene analysis develops and how it is affected by blindness.
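A common statistical model of the reverberation described above treats a room's impulse response as noise with an exponentially decaying envelope; convolving a direct sound with that response yields the blended reverberant signal. The sketch below illustrates this textbook model only (the sample rate, RT60 value, and variable names are illustrative, not taken from the lab's work):

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 16000                         # sample rate in Hz (illustrative)

# Direct sound: a single short click (unit impulse).
direct = np.zeros(fs // 2)
direct[0] = 1.0

# Textbook statistical model of room reverberation: Gaussian noise
# shaped by an exponentially decaying envelope. The decay rate sets
# the reverberation time RT60 (time for the sound to decay by 60 dB).
rt60 = 0.4                         # seconds, a hypothetical room
t = np.arange(int(rt60 * fs)) / fs
envelope = np.exp(-6.9 * t / rt60)  # ~60 dB amplitude decay over rt60 (ln 1000 ≈ 6.9)
ir = rng.standard_normal(len(t)) * envelope

# Reverberant signal = direct sound convolved with the impulse response,
# i.e., every reflection is a delayed, attenuated copy of the click.
reverberant = np.convolve(direct, ir)
```

Because the envelope (not the fine structure) carries the room's signature, statistics like the decay rate are exactly the kind of regularity a listener's internal model could exploit.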

Echolocation, Braille, and Active Sensing

Sometimes, the surrounding world is too dark and silent for typical vision and hearing. This is true in deep caves, for example, or in murky water where little light penetrates. Animals living in these environments often have the ability to echolocate: They make sounds and listen for their reflections. Like turning on a flashlight in a dark room, echolocation is a way to illuminate objects and spaces actively using sound. Using tongue clicks, cane taps, or footfalls, some blind humans have demonstrated an ability to use echolocation (also called sonar, or biosonar in living organisms) to sense, interact with, and move around the world. What information does echolocation afford its practitioners? What are the factors contributing to learning it? How broadly is it accessible to blind, and sighted, people?
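The basic geometry behind echolocation can be sketched in a few lines: a target's distance follows from the delay between an emitted click and its returning echo, since the sound travels out and back at the speed of sound. The toy example below uses cross-correlation to find that delay; it is a minimal sonar sketch under idealized assumptions (no noise, a single echo, made-up click and sample rate), not the lab's actual analysis:

```python
import numpy as np

c = 343.0                        # approximate speed of sound in air (m/s)

def echo_distance(click, recording, fs):
    """Estimate target distance from the emission-to-echo delay,
    found as the peak of the cross-correlation (textbook sonar)."""
    corr = np.correlate(recording, click, mode="full")
    lag = np.argmax(corr) - (len(click) - 1)  # delay in samples
    return c * (lag / fs) / 2                 # halve: sound travels out and back

# Toy scene: a 1 ms click whose echo returns after a 20 ms round trip.
fs = 48000
click = np.hanning(48)                        # smooth 1 ms click
recording = np.zeros(fs)
delay = int(0.020 * fs)
recording[delay:delay + len(click)] = 0.5 * click  # attenuated echo

print(echo_distance(click, recording, fs))    # ≈ 3.43 m
```

A 20 ms round trip at 343 m/s corresponds to about 3.43 m of one-way distance, which is why even millisecond-scale timing carries useful spatial information.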

Neural Dynamics and Crossmodal Plasticity

The visual cortex does not fall silent in blindness. About half the human neocortex is thought to be devoted to visual processing. However, in blind people, who are deprived of the visual input that drives those brain regions, those areas become responsive to auditory, tactile, and other nonvisual tasks, a phenomenon called crossmodal plasticity. What kinds of computations take place under these circumstances? How do crossmodally recruited brain regions represent information? To what extent are the connections mediating this plasticity present in sighted people as well? These and many other basic questions remain debated despite decades of crossmodal plasticity research.

Want to join the lab or participate in a study?

Check here for details on any current funded or volunteer opportunities. 

For inquiries, email the lab PI at santani@ski.org

Presentations/Posters
García-Lázaro, H. G., Alwis, Y., & Teng, S. (2022). Speech encoding modulated by task-relevance and reverberant acoustic statistics. Presented 04/2022 at the Cognitive Neuroscience Society meeting: San Francisco, CA.
Teng, S. (2022). Building a meaningful world when vision is unavailable. Presented 10/2022 at the Nu Rho Psi (ΝΡΨ) Neuroscience Honor Society meeting: Bradley University, Peoria, IL.
Teng, S. (2021). Common and distinct mechanisms in blind and sighted reading. Presented 09/2021 at Bay Area Vision Research Day: Berkeley, CA (virtual).
Teng, S. (2021). Echolocation: A practical overview of sonar for adaptive neurotechnology. Presented 07/2021 at the National Center for Adaptive Neurotechnologies Focus Course: virtual.
García-Lázaro, H. G., Alwis, Y., & Teng, S. (2021). Neural representations of temporal and spectral regularities of reverberant environments. Presented 11/2021 at the 50th Society for Neuroscience Annual Meeting: Chicago/virtual.
  • Collage of RERC staff members and RERC projects

    Rehabilitation Engineering Research Center

    The Center's research goal is to develop and apply new scientific knowledge and practical, cost-effective devices to better understand and address the real-world problems of blind, visually impaired, and deaf-blind consumers.

    View Center
  • SKERI RERC logo

    SKERI Receives Rehabilitation Engineering Research Center (RERC) grant on...

    Smith-Kettlewell is proud to announce the newly awarded Rehabilitation Engineering Research Center (RERC) grant on Blindness and Low Vision. This five-year grant from the National Institute on Disability, Independent Living, and Rehabilitation Research establishes Smith-Kettlewell as a center promoting the independence and well-being of people with visual impairments, through research and development that improves understanding of, and provides solutions for, challenges facing the blind and low-vision community.
  • Schematic diagram of a study showing inspection of an object by echolocation or vision, then discrimination by touch.

    Object echolocation project receives Best Poster award at OPAM 29

    A Smith-Kettlewell study measuring object information available via echolocation was recognized with a Best Poster award at the 2021 Conference on Object Perception, Attention, and Memory (OPAM 29).
  • Four-panel schematic illustrating an echolocation target, initially blurry, becoming progressively more sharply defined.

    Dr. Teng awarded grant to model human echolocation

    The National Eye Institute has awarded an R21 grant to Santani Teng, an Associate Scientist at Smith-Kettlewell, to model perceptual mechanisms of echolocation.
  • FoVea Logo

    Haydée García-Lázaro receives FoVea Award 2021 from Vision Sciences Society

    The FoVea Travel and Networking Award has the mission to advance the visibility, impact, and success of women in vision science. It is open to female members of the Vision Sciences Society (VSS) in pre-doctoral, post-doctoral, pre-tenure faculty, or research scientist positions. Haydée García-Lázaro, a postdoctoral fellow at SKERI working with Dr. Santani Teng, received the FoVea Travel and Networking Award 2021 from Females of Vision et al. (FoVea) and the Vision Sciences Society.
  • Haydée smiling

    Haydée García-Lázaro receives the Elsevier/Vision Research Virtual Travel...

    Haydée García-Lázaro, a postdoctoral fellow at SKERI working in the Teng Lab, received the Elsevier/Vision Research Virtual Travel Award 2021 from the Vision Sciences Society.
  • SKERI Scientists Spearhead a Face Shield Making Campaign

    Founded by Dr. Santani Teng, the Bay Area Face Shield Supply (BAFSS) has sent hundreds of face shields to healthcare workers in the community. Dr. Teng has been working with Drs. Audrey Wong-Kee-You and Cecile Vullings, along with other volunteers across the Bay Area, to provide life-saving personal protective equipment to healthcare workers as they work with COVID-19 patients.
  • A schematic diagram showing a neural activation map superimposed on a head, which is emitting sound waves that a nearby surface is reflecting.

    SKERI researcher awarded grant to study echolocation

    Santani Teng, an Associate Scientist at Smith-Kettlewell, has been awarded a three-year grant from The E. Matilda Ziegler Foundation for the Blind and Visually Impaired to study the neural processes of echolocation. Like bats and dolphins, some blind people use active echolocation: they make sounds and use the reflections to perceive and interact with their surroundings. These sounds, often in...