In a noisy environment, lip-reading can help us to better understand the person we are speaking to

November 13, 2012, Max Planck Society
An area in the left posterior superior temporal sulcus (STS) showed greater activation when mouth movements did not match the expected word. Credit: MPI for Human Cognitive and Brain Sciences

(Medical Xpress)—In a noisy environment, lip-reading can aid understanding of a conversation. Researchers at the Max Planck Institute for Human Cognitive and Brain Sciences, who have been investigating this phenomenon, can now show that the greater the activity in a particular region of the temporal lobe, the better participants were at matching words to mouth movements. Visual and auditory information are combined in this region, the so-called superior temporal sulcus (STS).

In everyday life we rarely consciously try to lip-read. However, in a noisy environment it is often very helpful to be able to see the mouth of the person you are speaking to. Researcher Helen Blank at the MPI in Leipzig explains why this is so: "When our brain is able to combine information from different sensory sources, for example during lip-reading, speech comprehension is improved." In a recent study, the researchers of the Research Group "Neural Mechanisms of Human Communication" investigated this phenomenon in more detail to uncover how visual and auditory information work together during lip-reading.

In the experiment, brain activity was measured using functional magnetic resonance imaging (fMRI) while participants heard short sentences. The participants then watched a short silent video of a person speaking. Using a button press, participants indicated whether the sentence they had heard matched the mouth movements in the video. If the sentence did not match the video, a part of the brain network that combines visual and auditory information showed greater activity, and the connections between the auditory speech region and the STS increased.

"It is possible that advanced auditory information generates an expectation about the that will be seen", says Blank. "Any contradiction between the prediction of what will be seen and what is actually observed generates an error signal in the STS."

How strong the activation was depended on the lip-reading skill of participants: the stronger the activation, the more correct responses they gave. "People who were the best lip-readers showed an especially strong error signal in the STS", Blank explains. This effect seems to be specific to the content of speech: it did not occur when the subjects had to decide whether the identity of the voice and face matched.

The results of this study are very important to basic research in this area. A better understanding of how the brain combines auditory and visual information during speech processing could also be applied in clinical settings. "People with hearing impairment are often strongly dependent on lip-reading", says Blank. The researchers suggest that further studies could examine what happens in the brain after lip-reading training or during the combined use of sign language and lip-reading.

More information: Blank, H., & von Kriegstein, K. (2013). Mechanisms of enhancing visual–speech recognition by prior auditory information. NeuroImage, 65, 109–118.
