Study shows how our brains sync hearing with vision
Every high-school physics student learns that sound and light travel at very different speeds. If the brain did not account for this difference, it would be much harder for us to tell where sounds come from and how they relate to what we see.
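To get a feel for the scale of the problem, here is a minimal sketch of the arrival-time gap between light and sound from a single event; the distances are illustrative values, not figures from the study:

```python
# Illustrative arithmetic: how far apart in time do light and sound
# from the same event arrive at an observer?
SPEED_OF_SOUND = 343.0        # m/s in air at ~20 °C
SPEED_OF_LIGHT = 299_792_458  # m/s

for distance_m in (1, 10, 30):
    lag_ms = (distance_m / SPEED_OF_SOUND - distance_m / SPEED_OF_LIGHT) * 1000
    print(f"At {distance_m:>2} m, sound trails light by {lag_ms:.1f} ms")
```

Even a speaker a few metres away produces gaps of tens of milliseconds, well within the range the brain must reconcile.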
Instead, the brain plays tricks that let us make better sense of our world, so that a sight and a sound produced at the same time are perceived as synchronous, even though they reach the brain, and are processed by neural circuits, at different speeds.
One of these tricks is temporal recalibration: the brain alters our sense of time to synchronize our joint perception of sound and vision. A new study finds that recalibration depends on brain signals that constantly adapt to our environment in order to sample, order and associate competing sensory inputs.
Scientists at The Neuro (Montreal Neurological Institute-Hospital) of McGill University asked volunteers to view short flashes of light paired with sounds at a variety of delays and to report whether the two appeared to happen at the same time. The participants performed this task inside a magnetoencephalography (MEG) scanner, which recorded and imaged their brain waves with millisecond precision. The audio-visual stimulus pairs changed on every trial, with sounds and flashes presented closer together or farther apart in time, in random order.
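A hypothetical sketch of how such a trial list might be generated is shown below; the asynchrony values and function names are assumptions for illustration, not the study's actual parameters:

```python
import random

# Hypothetical trial generator for a simultaneity-judgment task.
# SOA = stimulus-onset asynchrony; negative means the sound leads the flash.
# These SOA values are illustrative, not the study's actual parameters.
SOAS_MS = [-200, -100, -50, 0, 50, 100, 200]

def make_trials(n_repeats=20, seed=0):
    rng = random.Random(seed)
    trials = SOAS_MS * n_repeats
    rng.shuffle(trials)          # randomized order of presentation
    return trials

trials = make_trials()
print(trials[:10])  # first few flash/sound delays, in milliseconds
```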
The researchers found that the volunteers' perception of simultaneity for a given audio-visual pair was strongly affected by the perceived simultaneity of the pair that preceded it. For example, after a sound followed milliseconds later by a flash is perceived as asynchronous, one is much more likely to report the next audio-visual pair as synchronous, even when it is not. This form of active temporal recalibration is one of the tools the brain uses to avoid a distorted or disconnected perception of reality, and to establish causal relations between the images and sounds we perceive, despite their different physical velocities and neural processing speeds.
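One common way to capture this trial-to-trial effect is to let the point of subjective simultaneity (PSS) drift toward the asynchrony just experienced. The toy model below is a sketch of that idea, with assumed parameter values that are not fitted to the study's data:

```python
import math

def p_synchronous(soa_ms, pss_ms, width_ms=80.0):
    """Gaussian-shaped probability of judging a pair synchronous."""
    return math.exp(-((soa_ms - pss_ms) ** 2) / (2 * width_ms ** 2))

def run(trial_soas, adapt_rate=0.3):
    pss = 0.0  # point of subjective simultaneity, starts unbiased
    for soa in trial_soas:
        p = p_synchronous(soa, pss)
        print(f"SOA {soa:+4d} ms  PSS {pss:+6.1f} ms  P(sync) = {p:.2f}")
        pss += adapt_rate * (soa - pss)  # recalibrate toward the last asynchrony

run([+100, +100, 0, -100, 0])
```

Running this shows P(sync) for a repeated asynchrony climbing from trial to trial: the same physical delay is judged synchronous more often once the brain has recalibrated to it.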
The MEG signals revealed that this feat is enabled by a distinctive interaction between fast and slow brain waves in auditory and visual brain regions. Slower brain rhythms pace the fluctuations of excitability in brain circuits over time: the higher the excitability, the more easily an external input is registered and processed by the receiving neural networks.
Based on this, the researchers propose a new model of recalibration, in which faster oscillations riding on top of these slower fluctuations create discrete, ordered time slots that register the order of sensory inputs. When an auditory signal lands in the first available time slot in the auditory cortex and a visual input lands in the corresponding visual slot, the pair is perceived as simultaneous. For this to happen, the brain needs to position the visual time slots slightly later than the auditory ones, to account for the slower physiological transduction of visual signals. The researchers found that this relative delay between neural auditory and visual time slots is a dynamic quantity that constantly adapts to each participant's recent audiovisual experience.
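The sketch below illustrates the time-slot idea in its simplest form: a fast oscillation defines a grid of registration slots, and the visual grid is shifted slightly later than the auditory one. All timings and names here are illustrative assumptions, not measurements from the study:

```python
# Fast oscillations nested in a slow excitability cycle define discrete
# registration slots; visual slots lag auditory ones to compensate for
# slower visual transduction. Values below are illustrative assumptions.
FAST_PERIOD_MS = 25.0     # one slot per fast-oscillation cycle (~40 Hz)
VISUAL_OFFSET_MS = 15.0   # visual slot grid lags the auditory grid (assumed)

def slot_index(arrival_ms, offset_ms=0.0):
    """Which slot an input falls into, given that grid's offset."""
    return int((arrival_ms - offset_ms) // FAST_PERIOD_MS)

def perceived_simultaneous(audio_ms, visual_ms):
    return slot_index(audio_ms) == slot_index(visual_ms, VISUAL_OFFSET_MS)

# A flash and a sound emitted together: the visual signal takes longer to
# transduce, but the later-positioned visual grid still puts it in the
# same slot as the sound, so the pair reads as simultaneous.
print(perceived_simultaneous(audio_ms=10.0, visual_ms=22.0))  # True
```

In this picture, recalibration amounts to nudging VISUAL_OFFSET_MS up or down after each trial, which is what the study reports the brain doing dynamically.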
Their data confirmed the new dynamic integration model, showing that these subtle, tens-of-milliseconds delays in fast brain oscillations can be measured in every individual and explain that person's judgments of perceived simultaneity.
In autism and in speech disorders, sensory processing, especially of hearing, is altered. In schizophrenia too, patients can experience distortions of sensory inputs. The neurophysiological mechanisms of temporal recalibration described in this study may be altered in these disorders, and their discovery may point to new research targets for addressing these deficits.
"Overall, this study emphasizes that our brains constantly absorb and adapt to the bombardment of sensory information from diverse sources," says Sylvain Baillet, a researcher at The Neuro and the study's senior author. "To make sense of our complex environments, including social interactions, brain circuits actively make adjustments of subtle physiological mechanisms to better anticipate and predict the nature and timing of external stimulations. That helps us build a resilient and adaptive mental map of their representation."