New research suggests speech comprehension improves with a slight auditory delay in some individuals

April 21, 2017

A new paper has shed light on sensory timing in the brain by showing that there is a lag between hearing a person's voice and seeing their lips move, and that these delays vary depending on the task at hand.

The research from City, University of London – co-authored by academics from the University of Sussex, Middlesex University and Birkbeck, University of London – also showed potential benefits of tailoring the time delay between audio and video signals to each individual: speech comprehension improved in 50 per cent of participants, by an average of 20 words in every 100. The paper is published in the journal Scientific Reports.

Dr Elliot Freeman, author of the paper and Senior Lecturer in Psychology at City, University of London, said: "Our study sheds new light on a controversy spanning over two centuries, over whether senses can really be out of sync in some individuals. Speech comprehension was at its best in our study when there was a slight auditory lag, which suggests that perception can actually be less-than-optimal when we converse or watch TV with synchronised sound and vision. We think that by tailoring auditory delays to each individual we could improve understanding and correct their sub-optimal perception."

To investigate the effect, the team showed 36 participants audiovisual movies depicting the lower half of a face speaking words or syllables, with the audio degraded by background noise. Stimuli were presented with a range of audiovisual asynchronies, spanning nine equally spaced levels from 500ms auditory lead to 500ms auditory lag, including simultaneous. In two separate tasks, participants had to identify which phoneme ('ba' or 'da'), or which word the speaker said.
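The asynchrony grid described above can be sketched as follows (an illustration only, not the authors' code; the function name and parameters are assumptions): nine equally spaced offsets running from 500ms auditory lead to 500ms auditory lag, with simultaneous presentation at the midpoint.

```python
def asynchrony_levels(max_offset_ms=500, n_levels=9):
    """Equally spaced audiovisual offsets in milliseconds.

    Negative values = auditory lead, positive = auditory lag,
    with 0 (simultaneous) at the midpoint when n_levels is odd.
    """
    step = 2 * max_offset_ms / (n_levels - 1)
    return [round(-max_offset_ms + i * step) for i in range(n_levels)]

print(asynchrony_levels())
# [-500, -375, -250, -125, 0, 125, 250, 375, 500]
```

With nine levels over a ±500ms range, the spacing works out to 125ms between adjacent conditions.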

The team found an average auditory lag of 91ms for phoneme identification (based on the famous McGurk effect) and 113ms for word identification, but this varied widely between individuals.

Interestingly, each individual's measurements were similar across repetitions of the same task, but not across different tasks. This suggests that we may have not just one general individual delay between seeing and hearing, but different delays for different tasks.

The authors speculate that these task-specific delays arise because neural signals from eyes and ears are integrated in different brain areas for different tasks, and that they must each travel via different routes to arrive there. Individual brains may differ in the length of these connections.

Dr Freeman said: "What we found is that sight and sound are in fact out of sync, and that each is unique and stable for each individual. This can have a real impact on the interpretation of audiovisual speech.

"Poor lip-sync, as often experienced on cable TV and video phone calls, makes it harder to understand what people are saying. Our research could help us better understand why our natural lip-syncing may not always be perfect, and how this may affect everyday communication. We might be able to improve communication by artificially correcting sensory delays between sight and sound. Our work might be translated into novel diagnostics and therapeutic applications benefiting individuals with dyslexia, autism spectrum or hearing impairment."


More information: See a visual demonstration of the test: www.staff.city.ac.uk/~sbbf269/ … re_out_of_synch.html

Alberta Ipser et al. Sight and sound persistently out of synch: stable individual differences in audiovisual synchronisation revealed by implicit measures of lip-voice integration, Scientific Reports (2017). DOI: 10.1038/srep46413
