CCN Forum: Visual speech modulates auditory speech perception in the auditory cortex through multiple distinct mechanisms
Karthik Ganesan, Graduate Student, Cognitive and Cognitive Neuroscience
Friday, February 26, 2021
2:00-3:00 PM
Virtual
Abstract:
Visual speech cues are often needed to disambiguate distorted speech sounds in the natural environment. However, understanding how the brain encodes and transmits visual information for use by the auditory system remains a challenge. One persistent question is whether visual signals have a unitary effect on auditory processing or elicit multiple distinct effects throughout auditory cortex. In this talk, I will present results from our group that attempt to better understand how vision modulates speech processing. I will discuss neural responses to audiovisual speech recorded from intracranially implanted electrodes, as well as pilot results from a complementary fMRI study. These data show that visual speech modulates auditory processing in the auditory cortex in multiple ways, eliciting temporally and spatially distinct patterns of activity that differ across the theta, beta, and high-gamma frequency bands. These results are consistent with models that posit multiple distinct mechanisms supporting audiovisual speech perception.
| Building: | Off Campus Location |
| --- | --- |
| Location: | Virtual |
| Event Type: | Presentation |
| Tags: | Talk |
| Source: | Happening @ Michigan from Department of Psychology, Cognition & Cognitive Neuroscience, Weinberg Institute for Cognitive Science |