New research suggests speech comprehension improves with a slight auditory delay in some individuals

April 21, 2017

A new paper has shed light on sensory timing in the brain by showing that there is a lag between hearing a person's voice and seeing their lips move, and that this delay varies depending on the task we are performing.

The research from City, University of London – co-authored by academics from the University of Sussex, Middlesex University and Birkbeck, University of London – also showed the potential benefits of tailoring the time delay between audio and video signals to each individual: in half of the participants, speech comprehension improved by an average of 20 words in every 100. The paper is published in the journal Scientific Reports.

Dr Elliot Freeman, author of the paper and Senior Lecturer in Psychology at City, University of London, said: "Our study sheds new light on a controversy spanning over two centuries, over whether senses can really be out of sync in some individuals. Speech comprehension was at its best in our study when there was a slight auditory lag, which suggests that perception can actually be less-than-optimal when we converse or watch TV with synchronised sound and vision. We think that by tailoring auditory delays to each individual we could improve understanding and correct their sub-optimal perception."

To investigate the effect, the team showed 36 participants audiovisual movies depicting the lower half of a face speaking words or syllables, with the audio degraded by background noise. Stimuli were presented at a range of audiovisual asynchronies, spanning nine equally spaced levels from a 500ms auditory lead to a 500ms auditory lag, including simultaneous presentation. In two separate tasks, participants had to identify either which phoneme ('ba' or 'da') or which word the speaker said.
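For illustration, the nine asynchrony levels described above are easy to reproduce. The sketch below assumes only what the text states – equal spacing from a 500ms auditory lead to a 500ms auditory lag – and is not code from the study:

```python
import numpy as np

# Nine equally spaced audiovisual asynchronies, in milliseconds.
# Negative values denote auditory lead; positive values, auditory lag.
asynchronies_ms = np.linspace(-500, 500, 9)
print(asynchronies_ms)
# [-500. -375. -250. -125.    0.  125.  250.  375.  500.]
```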

The team found an average auditory lag of 91ms for phoneme identification (based on the famous McGurk effect) and 113ms for word identification, though this varied widely between individuals.
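One way such an individual lag can be estimated is to fit a tuning curve to a participant's accuracy at each asynchrony level and read off its peak. The sketch below illustrates the idea with invented data and a Gaussian fit; it shows the general approach, not the paper's actual analysis:

```python
import numpy as np
from scipy.optimize import curve_fit

# Proportion correct at each asynchrony for one hypothetical participant
# (values invented for illustration; positive = auditory lag in ms).
async_ms = np.linspace(-500, 500, 9)
accuracy = np.array([0.42, 0.48, 0.55, 0.66, 0.71, 0.73, 0.65, 0.54, 0.45])

def gaussian(x, peak, mu, sigma, base):
    # Bell-shaped tuning curve; mu is the asynchrony of best performance.
    return base + peak * np.exp(-(x - mu) ** 2 / (2 * sigma ** 2))

params, _ = curve_fit(gaussian, async_ms, accuracy,
                      p0=[0.3, 0.0, 200.0, 0.4])
print(f"Estimated individual auditory lag: {params[1]:.0f} ms")
```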

Interestingly, an individual's measurements were consistent across repetitions of the same task, but not across different tasks. This suggests that we may have not just one general individual delay between seeing and hearing, but different delays for different tasks.

The authors speculate that these task-specific delays arise because neural signals from eyes and ears are integrated in different brain areas for different tasks, and that they must each travel via different routes to arrive there. Individual brains may differ in the length of these connections.

Dr Freeman said: "What we found is that sight and sound are in fact out of sync, and that each is unique and stable for each individual. This can have a real impact on the interpretation of audiovisual speech.

"Poor lip-sync, as often experienced on cable TV and video phone calls, makes it harder to understand what people are saying. Our research could help us better understand why our natural lip-syncing may not always be perfect, and how this may affect everyday communication. We might be able to improve communication by artificially correcting sensory delays between sight and sound. Our work might be translated into novel diagnostics and therapeutic applications benefiting individuals with dyslexia, autism spectrum or hearing impairment."

Explore further: When your eyes override your ears: New insights into the McGurk effect

More information: See a visual demonstration of the test: www.staff.city.ac.uk/~sbbf269/Elliot_Freemans_Home/My_Research/Entries/2013/3/19_Sight_and_sound_are_out_of_synch.html

Alberta Ipser et al. Sight and sound persistently out of synch: stable individual differences in audiovisual synchronisation revealed by implicit measures of lip-voice integration, Scientific Reports (2017). DOI: 10.1038/srep46413
