Wednesday, April 7, 2021
Zoom Brown Bag: Multisensory interactions in the primary visual and auditory cortex of humans: Evidence from source-imaged visual and auditory evoked potentials
Event Date: Wednesday, April 7, 2021
Abstract: Our environment is multisensory; at any given time, information can be received through multiple senses. While it was previously believed that multisensory processing in the cortex was restricted to higher-order regions, there is now evidence to suggest that multisensory interactions may occur as early as in primary sensory regions (Kayser et al., 2009; Murray et al., 2016). For instance, both the primary visual and auditory cortex exhibit crossmodal sensitivity (Calvert et al., 1997; Brang et al., 2015), and direct connections between both regions have been reported (Beer et al., 2011, 2013). However, whether multisensory inputs actually converge in sensory regions of the cortex remains unclear. In this brown bag, I will present results from a study where we used source-imaged steady-state visual and auditory evoked potentials to address this open question. The goal of the study was to determine whether the primary visual and auditory cortex respond to crossmodal sensory stimulation and are locations of early multisensory input convergence in the cortex. We used a frequency-tagged approach in which a visual (FV) and auditory (FA) stimulus were presented at distinct modulation frequencies, either alone or concurrently. Significant responses at the harmonic frequencies of the visual (nFV) and auditory stimulus (nFA) were localized in both the primary visual and auditory cortex, even when the stimulus was presented alone. Moreover, significant responses at intermodulation (IM) frequencies (FV±FA), reflecting the convergence of visual and auditory inputs, were also observed and localized to these regions when the visual and auditory stimuli were presented concurrently. Overall, our results demonstrate that the visual and auditory cortex are multisensory: both regions respond to crossmodal stimulation and are cortical locations of multisensory signal convergence.

Speaker page: https://www.ski.org/users/audrey-wong-kee-you
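The logic behind the intermodulation (IM) signature can be sketched numerically: when two frequency-tagged signals pass through a nonlinear (e.g., multiplicative) interaction, energy appears at the sum and difference frequencies FV±FA rather than at FV or FA alone. The sketch below uses hypothetical tag frequencies (8 Hz and 6 Hz), not the study's actual stimulus parameters:

```python
import numpy as np

# Hypothetical tag frequencies (illustrative only, not the study's values)
FV, FA = 8.0, 6.0           # "visual" and "auditory" modulation rates (Hz)
fs, dur = 256, 4            # sampling rate (Hz) and duration (s)
t = np.arange(0, dur, 1 / fs)

# Model a multiplicative (nonlinear) interaction of the two tagged signals
signal = np.cos(2 * np.pi * FV * t) * np.cos(2 * np.pi * FA * t)

# Amplitude spectrum; with dur = 4 s the frequency resolution is 0.25 Hz
spec = np.abs(np.fft.rfft(signal)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Energy appears only at the IM frequencies FV - FA and FV + FA
# (2 and 14 Hz here), not at FV or FA themselves: the spectral
# signature of the two inputs converging on one nonlinearity.
peaks = freqs[spec > 0.1]
print(peaks)  # peaks at the IM frequencies, 2 and 14 Hz
```

In the study itself, the presence of responses at FV±FA in source-localized activity plays the same role: an IM peak can only arise where both inputs meet.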

Improving Zoom accessibility for people with hearing impairments

People with hearing impairments often use lipreading and speechreading to improve speech comprehension. This approach is helpful but only works if the speaker's face and mouth are clearly visible. For the benefit of people with hearing impairments on Zoom calls, please enable your device's camera whenever you are speaking on Zoom, and face the camera while you speak. (Feel free to disable your camera when you aren't speaking.)