New research suggests speech comprehension improves with a slight auditory delay in some individuals

April 21, 2017

A new paper has shed light on sensory timing in the brain by showing that there is a lag between us hearing a person's voice and seeing their lips move, and these delays vary depending on what task we're doing.

The research from City, University of London – co-authored by academics from the University of Sussex, Middlesex University and Birkbeck, University of London – also showed potential benefits of tailoring the time delay between audio and video signals to each individual: speech comprehension improved for 50 per cent of participants, by an average of 20 words in every 100. The paper is published in the journal Scientific Reports.

Dr Elliot Freeman, author of the paper and Senior Lecturer in Psychology at City, University of London, said: "Our study sheds new light on a controversy spanning over two centuries, over whether senses can really be out of sync in some individuals. Speech comprehension was at its best in our study when there was a slight auditory lag, which suggests that perception can actually be less-than-optimal when we converse or watch TV with synchronised sound and vision. We think that by tailoring auditory delays to each individual we could improve understanding and correct their sub-optimal perception."

To investigate the effect, the team showed 36 participants audiovisual movies depicting the lower half of a face speaking words or syllables, with the audio degraded by background noise. Stimuli were presented with a range of audiovisual asynchronies, spanning nine equally spaced levels from 500ms auditory lead to 500ms auditory lag, including simultaneous. In two separate tasks, participants had to identify which phoneme ('ba' or 'da'), or which word the speaker said.
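As a hedged illustration (not the authors' code), the nine equally spaced asynchrony levels described above – from 500ms auditory lead to 500ms auditory lag, including simultaneous presentation – can be reconstructed with a few lines:

```python
# Sketch of the asynchrony conditions reported in the study:
# nine equally spaced levels from -500 ms (auditory lead) to
# +500 ms (auditory lag), which necessarily includes 0 ms (simultaneous).
levels = [round(-500 + i * 1000 / 8) for i in range(9)]
print(levels)  # [-500, -375, -250, -125, 0, 125, 250, 375, 500]
```

Nine equally spaced points across a 1000ms span imply a 125ms step between conditions, which is why simultaneity (0ms) falls exactly on the middle level.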

The team found that there was an average auditory lag of 91ms for phoneme identification (based on the famous McGurk effect) and 113ms for word identification, but this varied widely between individuals.

Interestingly, for individual participants the measurements were consistent across repetitions of the same task, but not across different tasks. This suggests that we may have not just one general individual lag between seeing and hearing, but different lags for different tasks.

The authors speculate that these task-specific delays arise because neural signals from eyes and ears are integrated in different brain areas for different tasks, and that they must each travel via different routes to arrive there. Individual brains may differ in the length of these connections.

Dr Freeman said: "What we found is that sight and sound are in fact out of sync, and that the delay between them is unique and stable for each individual. This can have a real impact on the interpretation of audiovisual speech.

"Poor lip-sync, as often experienced on cable TV and video phone calls, makes it harder to understand what people are saying. Our research could help us better understand why our natural lip-syncing may not always be perfect, and how this may affect everyday communication. We might be able to improve communication by artificially correcting sensory delays between sight and sound. Our work might be translated into novel diagnostics and therapeutic applications benefiting individuals with dyslexia, autism spectrum or hearing impairment."


More information: See a visual demonstration of the test: www.staff.city.ac.uk/~sbbf269/ … re_out_of_synch.html

Alberta Ipser et al. Sight and sound persistently out of synch: stable individual differences in audiovisual synchronisation revealed by implicit measures of lip-voice integration, Scientific Reports (2017). DOI: 10.1038/srep46413

