What you see affects what you hear (Videos)

March 4, 2009

Understanding what a friend is saying in the hubbub of a noisy party can present a challenge - unless you can see the friend's face.

New research from Baylor College of Medicine in Houston and the City College of New York shows that the visual information you absorb when you can see a speaker's face can improve your understanding of the spoken words by as much as sixfold.

Your brain uses the visual information derived from the speaker's face and lip movements to help you interpret what you hear, and this benefit is greatest when the background noise is moderate, said Dr. Wei Ji Ma, assistant professor of neuroscience at BCM and lead author of the study, which appears online today in the open-access journal PLoS ONE.

Video: Example of congruent AV stimuli (boot), 12 dB noise.

"Most people with normal hearing lip-read very well, even though they don't think so," said Ma. "At certain noise levels, lip-reading can increase word recognition performance from 10 to 60 percent correct."

However, when the environment is very noisy or when the voice you are trying to understand is very faint, lip-reading is difficult.

Video: Example of congruent AV* stimuli (cheap), 12 dB noise.

"We find that a minimum sound level is needed for lip-reading to be most effective," said Ma.

This research is the first to study word recognition in a natural setting, in which people freely report what they believe was said. Previous experiments used only limited lists of words for people to choose from.

The lip-reading data help scientists understand how the brain integrates two different kinds of stimuli to come to a conclusion.

Ma and his colleagues constructed a mathematical model that allowed them to predict how successful a person will be at integrating the visual and auditory information.

People actually combine the two stimuli close to optimally, Ma said. What they perceive depends on the reliability of the stimuli.

"Suppose you are a detective," he said. "You have two witnesses to a crime. One is very precise and believable. The other one is not as believable. You take information from both and weigh the believability of each in your determination of what happened."

In a way, lip-reading involves the same kind of integration of information in the brain, he said.
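To make the reliability weighting in Ma's detective analogy concrete, the standard Bayesian cue-combination rule weights each cue by the inverse of its variance, so the more reliable cue pulls the combined percept toward itself. The sketch below is a minimal, hypothetical illustration in Python; the model in the actual study operates on high-dimensional auditory and visual feature vectors rather than a single scalar per cue, and all variable names here are illustrative assumptions, not the authors' code.

```python
import numpy as np

def combine_cues(x_audio, var_audio, x_visual, var_visual):
    """Reliability-weighted (inverse-variance) combination of two noisy cues.

    Each cue is a noisy estimate of the same underlying quantity; the weight
    given to each is proportional to its reliability (1 / variance).
    """
    w_audio = 1.0 / var_audio
    w_visual = 1.0 / var_visual
    combined_estimate = (w_audio * x_audio + w_visual * x_visual) / (w_audio + w_visual)
    combined_variance = 1.0 / (w_audio + w_visual)  # never worse than the better cue alone
    return combined_estimate, combined_variance

# Example: a noisy auditory cue (high variance) and a sharper visual cue (low variance).
est, var = combine_cues(x_audio=0.2, var_audio=4.0, x_visual=1.0, var_visual=1.0)
print(f"combined estimate = {est:.2f}, combined variance = {var:.2f}")
# Output: combined estimate = 0.84, combined variance = 0.80
# The combined estimate sits closer to the more reliable (visual) cue,
# just as the detective leans on the more believable witness.
```

Under this rule the combined estimate is always at least as reliable as either cue alone, which is why adding lip-reading information can only help, and helps most when the auditory cue is degraded but not useless.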

In the experiments, participants watched videos in which a person said a single word. When the speaker was presented normally, the visual information provided a large benefit once integrated with the auditory information, especially when there was moderate background noise. Surprisingly, even when the speaker was replaced by a "cartoon" that did not truly mouth the word, the visual information still helped, though not as much.

In another condition, the speaker mouths one word while the audio presents another, and the brain often integrates the two stimuli into an entirely different perceived word.

"The mathematical model can predict how often the person will understand the word correctly in all these contexts," Ma said.

More information: Wei Ji Ma, Xiang Zhou, Lars A. Ross, John J. Foxe, Lucas C. Parra, "Lip-reading aids word recognition most in moderate noise: a Bayesian explanation using high-dimensional feature space," PLoS ONE, March 2009. dx.plos.org/10.1371/journal.pone.0004638

Source: Baylor College of Medicine
