Research led by Dr Elliot Freeman of City University London's Department of Psychology, examining the first documented case of someone who hears people speak before he sees their lips move, has been published in New Scientist magazine.
The research team studied PH, a retired pilot, who first experienced 'auditory leading' while watching television. He initially suspected poor dubbing, but later noticed the same phenomenon in face-to-face conversations: he could hear what people were saying before he saw their lips move to form the words.
Testing with simple computer-based tasks confirmed a visual delay of almost a quarter of a second. The team then tested a number of 'normal' participants, and were surprised to find similar, though less pronounced, asynchronies between hearing and vision. They concluded that the sensory asynchrony found in PH is not an exception, but may be a general rule.
Accurately synchronising sight and sound is important for performing many complex skills, such as lip-reading, learning to read, or speaking a new language. In some cases a person's livelihood, or even their life, might depend on it. Performing artists, tennis players, pilots and surgeons all need to coordinate their behaviour rapidly and accurately on the basis of complex multisensory cues.
The research could inspire new strategies and devices for optimising multisensory perception, according to Dr Freeman: "The exciting implications of our research are that multisensory perception is suboptimal in many healthy individuals - and can, in principle, be improved. By simply delaying one sense relative to another, we might be able to deliver immediate benefits, for example improving speech comprehension in hearing-impaired individuals or early language learners."