Lip-read words can be decoded from the auditory regions of the brain in the same way as heard speech.
- Words read silently on the lips can be classified from activity in the auditory cortex.
- The auditory system combines the neural representations evoked by heard and lip-read words to produce a more accurate estimate of what was said.
- According to the authors, this mechanism helps people keep understanding speech as their hearing declines with age, underscoring the value of face-to-face communication for listening comprehension.
Observing a person’s facial movements improves the accuracy of speech perception. These early visual cues prime auditory neurons before the corresponding sounds are heard, according to David Brang, a professor of psychology at the University of Michigan (United States). “It is not known how these visual signals are represented in the auditory system or how they interact with auditory speech representations,” the researcher said.
Communication: Combining visual and auditory cues to obtain more precise information
In a study published in the journal Current Biology, he and his team set out to test the hypothesis that the auditory system encodes visual speech information. To do so, the scientists used functional magnetic resonance imaging (fMRI) data from healthy adults and intracranial recordings from electrodes implanted in patients with epilepsy during auditory and visual speech perception tasks.
The results revealed that lip-read words could be classified at earlier time points than heard words. “This suggests that lipreading may involve a predictive mechanism that facilitates speech processing before auditory information is available,” explained David Brang. The work supports a model in which the auditory system rapidly integrates lipreading information to extract a more accurate estimate of what was said, improving comprehension and communication, particularly in challenging listening environments such as noisy restaurants.
“The ability of visual speech to activate and encode information in the auditory cortex appears to be a crucial mechanism”
According to the authors, this rapid use of lipreading information is probably even more pronounced in people with hearing loss. “As hearing abilities decline, people increasingly rely on visual cues to aid in comprehension. The ability of visual speech to activate and encode information in the auditory cortex appears to be a crucial compensatory mechanism,” concluded David Brang.