Study examines how brain corrects perceptual errors

March 23, 2011

New research provides the first evidence that sensory recalibration — the brain's automatic correction of errors in our sensory or perceptual systems — can occur instantly.

"Until recently, neuroscientists thought of sensory recalibration as a mechanism that is primarily used for coping with long-term changes, such as growth during development, brain injury or stroke," said Ladan Shams, a UCLA assistant professor of psychology and an expert on perception and cognitive neuroscience. "It appeared that extensive time, and thus many repetitions of error, were needed for mechanisms of recalibration to kick in. However, our findings indicate we don't need weeks, days, or even minutes or seconds to adapt. To some degree, we adapt instantaneously.

"If recalibration can occur in milliseconds, as we now think, then we can adapt even to transient changes in the environment and in our bodies."

In Shams' study, published March 23, 146 individuals, primarily UCLA undergraduates, performed a fundamental perceptual task. They looked at the center of a large screen that had eight speakers hidden behind it. Sometimes they heard only a brief burst of sound somewhat like radio static; sometimes they saw only a quick flash of light; and sometimes they both heard a sound and saw a light. They were asked to determine where the sound was and where the light was.

The participants, the researchers found, were much more accurate in determining where the light was than where the sound was.

The 'ventriloquist illusion'

"The perceived location of sound gets shifted toward the location of the visual stimulus," Shams said. "That is known as the 'ventriloquist illusion.' If I repeatedly, for thousands of times, present a flash of light on the left side and a sound on the right side, afterwards, even when the sound is presented alone, the perceived location of sound will be shifted to the left, toward where the flash was. The visual stimulus affects the perception of the sound, not only while it is present, but also as an after-effect. This phenomenon has been known, but neuroscientists thought it required a large number of repeated exposures.

"We found this shift can happen not after thousands of trials, but after just a single trial. A small fraction of a second is enough to cause this perceptual shift. These findings provide the first evidence that sensory recalibration can occur rapidly, after only milliseconds. This indicates that recalibration of auditory space does not require the accumulation of substantial evidence of error to become engaged, and instead it is operational continuously."

In the study, the subjects were presented with a variety of different combinations. For example, in one trial the flash could be 10 degrees to the right of the sound; in the next, it could be 15 degrees to the left of the sound; then there could be sound and no flash; then flash and no sound; then the sound and flash could be in the same location.

"For every trial that contained sound alone (with no flash), we studied how the subjects located the sound in relation to what they experienced in the previous trial, where there was a flash. We found a very strong correlation; if the flash was to the right of the sound in the previous trial, then on the trial with the sound alone, the sound was perceived a little to the right; if the flash was to the left of the sound on the previous trial, then on the trial with sound alone, the sound was perceived a little to the left. The larger the discrepancy, the larger the shift."
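The trial-to-trial dependence described here behaves like a simple update rule: the perceived location of a lone sound shifts by some fraction of the previous trial's audiovisual discrepancy. A minimal sketch of that idea; the learning rate and the degree values are illustrative assumptions, not figures from the study:

```python
# Hypothetical sketch of single-trial auditory recalibration.
# Assumption (not from the study's methods): the perceived location of a
# sound presented alone shifts by a fixed fraction of the previous
# trial's visual-minus-auditory discrepancy.

def recalibrated_location(auditory_deg, previous_visual_deg,
                          previous_auditory_deg, learning_rate=0.1):
    """Perceived location (degrees) of a lone sound, shifted toward
    where the flash appeared relative to the sound on the prior trial."""
    discrepancy = previous_visual_deg - previous_auditory_deg
    return auditory_deg + learning_rate * discrepancy

# Flash 10 degrees to the right of the sound on the previous trial:
print(recalibrated_location(0.0, 10.0, 0.0))   # 1.0 (shifted right)
# Flash 15 degrees to the left of the sound on the previous trial:
print(recalibrated_location(0.0, -15.0, 0.0))  # -1.5 (shifted left)
```

The larger the discrepancy on the previous trial, the larger the shift, which matches the correlation the researchers report.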

While the subjects seemed to be making perceptual errors rather than correcting them, Shams stressed that this was an unnatural environment in which researchers artificially created a discrepancy between auditory and visual stimuli to show how quickly recalibration could occur.

In the real world, she said, recalibration would actually reduce errors for a person experiencing an auditory-visual discrepancy due to a flaw in one of their senses.

Implications for rehabilitation, robotics

This research could have implications for rehabilitation from brain injuries and could help in the development of prostheses, when, for example, people receive hearing devices and can use vision to guide their learning of how to localize sound. It also has implications for the design of recalibration in artificial systems, which could be useful for aircraft as well as robots.

Our senses are similar to those of a robot. NASA's Mars rovers, for instance, sample the planet's surface using cameras, sensors, microphones and other equipment, which, like our senses, can get damaged. If a camera becomes misaligned while traversing the rocky terrain, its function will be diminished.

"Sensory recalibration is a critical function for both biological and artificial systems," Shams said. "As with artificial sensors, biological sensory systems can become faulty and need correction every now and then."

Ailments such as a blocked ear canal or a problem with our sense of smell or vision can lead to distorted perceptions — or shifts — in our spatial map. If there is a systematic error in our auditory system, it needs to be corrected. When biological sensory systems become faulty, the brain typically provides the correction automatically.

"Fortunately, human sensory systems already possess the uncanny ability to recalibrate their own localization maps through the interactions between visual and auditory systems," Shams said. "Our new findings show that the multisensory recalibration is continuously functioning after only milliseconds of sensory discrepancies, allowing for rapid adaptation to changes in sensory signals. This rapid adaptation allows not only adaptation to long-term changes such as those induced by injury and disease, but also adaptation to transient changes, such as changes in the echo properties of our surrounding space as we walk from one room to another room or from indoors to outdoors, or when one ear is temporarily blocked by hair or headwear."

The research by Shams and David Wozny — who earned his Ph.D. from UCLA in August in Shams' laboratory, and is currently a postdoctoral fellow at Oregon Health and Science University — is shedding light for the first time on the dynamics of sensory recalibration. They have learned, for example, that repeated exposures will increase the shift, which accumulates quickly before slowing down.

"Vision is teaching hearing," Shams said. "If vision tells me one time that sound is not here (indicating her left), but here (her right), then I shift my auditory map a little; if it happens twice in a row, I shift even more. If it happens three times in a row, I shift even more."
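The dynamic Shams describes — shifts that grow with repeated exposure but accumulate quickly before slowing down — is what a proportional update rule produces when applied repeatedly: each exposure closes a fixed fraction of the remaining gap, so each increment is smaller than the last. A sketch under an assumed learning rate (the rate and locations are illustrative, not fitted values):

```python
# Hypothetical sketch: repeated exposure to the same audiovisual
# discrepancy shifts the auditory map further each trial, with each
# increment 70% the size of the previous one (fast early, then slowing).

def repeated_shifts(visual_deg, auditory_deg, trials, learning_rate=0.3):
    """Perceived sound location after each of `trials` exposures to a
    flash at visual_deg paired with a sound at auditory_deg."""
    perceived = auditory_deg
    history = []
    for _ in range(trials):
        # Close a fixed fraction of the remaining visual-auditory gap.
        perceived += learning_rate * (visual_deg - perceived)
        history.append(round(perceived, 2))
    return history

# Increments shrink geometrically: 3.0, 2.1, 1.47, 1.03, 0.72 degrees.
print(repeated_shifts(10.0, 0.0, 5))
```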

An optimal learning strategy?

Using the same set of data, Shams and Wozny published a computational model in the Aug. 5, 2010, issue of the journal PLoS Computational Biology that allows them to analyze why subjects perceive the sounds and sights in a particular way and what computations occur in their brains when they hear the sounds and see the flashes. (Ulrik Beierholm, a former UCLA graduate student of Shams who is currently a postdoctoral fellow at University College London's Gatsby Computational Neuroscience Unit, was a co-author.)

"By analyzing the data using three models, we can determine which model best explains the data and can characterize the strategy the subjects' brains use to make perceptual decisions," Shams said.

Determining the locations of sights and sounds is a basic brain function, and scientists assume that such functions are performed optimally because they have been refined over millions of years of evolution, Shams said. Because this is a basic task, neuroscientists would expect almost all brains to perform it in the same way.

"Surprisingly, we found the perceptual task is not performed uniformly across subjects. Different people use different strategies to perform this task," Shams said. "Secondly, the vast majority of people, at least 75 percent, use a strategy that is considered seriously sub-optimal."

What is this sub-optimal strategy? By way of analogy, Shams said, if there is a 70 percent chance of rain, you would be wise to take an umbrella with you.

"What we found is that instead of people taking the umbrella every time there is, say, a 70 percent chance of rain, so to speak, they match the probability: They take the umbrella only 70 percent of the time," she said.
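The cost of probability matching in a static environment can be made concrete. If rain really occurs on 70 percent of days, always taking the umbrella is right 70 percent of the time; matching, that is, taking it on a random 70 percent of days, is right only 58 percent of the time. A small sketch of that arithmetic (the values here come from the umbrella analogy, not the study's data):

```python
# Expected accuracy of two decision strategies when one option is
# correct with probability p (p = 0.7 in the umbrella analogy).

def maximizing_accuracy(p):
    """Always pick the more likely option: correct with probability p."""
    return p

def matching_accuracy(p):
    """Pick the likely option with probability p and the unlikely option
    with probability 1 - p: correct on p*p + (1-p)*(1-p) of trials."""
    return round(p * p + (1 - p) * (1 - p), 2)  # rounded for display

print(maximizing_accuracy(0.7))  # 0.7
print(matching_accuracy(0.7))    # 0.58
```

Under conventional measures of optimality, that 12-point gap is why probability matching looks seriously sub-optimal.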

When subjects were presented with a noise and a flash and were asked where they perceived the noise and flash to be coming from, their brains had to figure out whether the sound and flash were coming from the same location or from different locations.
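This "same source or different sources?" judgment can be framed as Bayesian causal inference, a standard framework in multisensory perception modeling. A minimal sketch follows; the noise widths, the prior, and the workspace size are illustrative assumptions, not values fitted in the published model:

```python
import math

# Minimal Bayesian causal-inference sketch: given a noisy auditory
# estimate and a noisy visual estimate (in degrees), how probable is it
# that both came from a single source? All parameter values below are
# illustrative assumptions, not figures from the study.

def gaussian(x, mu, sigma):
    """Gaussian probability density at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def prob_common_cause(auditory_deg, visual_deg,
                      sigma_a=8.0, sigma_v=2.0, prior_common=0.5):
    # If both cues share one source, their difference is pure sensory
    # noise, with variance sigma_a**2 + sigma_v**2.
    sigma_diff = math.sqrt(sigma_a ** 2 + sigma_v ** 2)
    like_common = gaussian(auditory_deg - visual_deg, 0.0, sigma_diff)
    # If the sources are independent, approximate the discrepancy as
    # roughly uniform over an assumed 60-degree workspace.
    like_separate = 1.0 / 60.0
    post_common = prior_common * like_common
    post_separate = (1 - prior_common) * like_separate
    return post_common / (post_common + post_separate)

# A small discrepancy favors a single source; a large one does not.
print(prob_common_cause(2.0, 0.0) > prob_common_cause(20.0, 0.0))  # True
```

On this view, a subject who "goes with" the common-cause estimate on 70 percent of trials is matching the posterior probability rather than always choosing its maximum.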

"If they infer there is a 70 percent chance that the sound and flash are coming from the same object, for the majority of observers, 70 percent of the time they go with that estimate and 30 percent of the time they go with the unlikely estimate," Shams said. "Under conventional measures of optimality, which implicitly assume static environments, this strategy is highly suboptimal.

"However, the conventional way of thinking about these problems may not be correct after all. In a dynamic world, things may change constantly. The optimal strategy is to learn, and to learn you need to take some risks. Even if that's not the best choice at that time, in the long run, it may well be the best choice, because by exploring different possibilities, you may learn more. So paradoxically, a strategy that appears sub-optimal may actually be near-optimal. Perhaps the way we think about brain function should be revised."


not rated yet Mar 23, 2011
"So paradoxically, a strategy that appears sub-optimal may actually be near-optimal. Perhaps the way we think about brain function should be revised."

I think it's very likely that this "70%/30%" strategy is in fact quite optimal. Here, let me give an example:

You hear a sound and see a slight movement, which indicate danger. Your brain determines that it is:
70% likely to be in front
30% likely to be to the right

The researchers claim that the optimal strategy is to jump straight back 70% of the time. However, that would result in a 30% chance of being eaten. Not good.

A person would jump away at an angle, trying to open distance from both.

So the error in this study is that it forces the test subjects to choose ONLY one or the other, instead of a mix of both. Since their brain is giving them an angle instead of a definite answer, they end up choosing a mix of both answers.
not rated yet Mar 23, 2011
...the error in this study is that it forces the test subjects to choose ONLY one or the other, instead of a mix of both. Since their brain is giving them an angle instead of a definite answer, they end up choosing a mix of both answers.

I think you could be on to something there. What you wrote makes perfect sense.
not rated yet Mar 23, 2011
The umbrella analogy is a poor choice. Quick sensory adaptation and error correction really only make sense biologically in the context of a fight or a chase, where failure means death. I'd hate to live on a planet where life had to evolve to instinctively know when to take an umbrella... that's some nasty rain!
1 / 5 (1) Mar 23, 2011
Hearing the "Ka-cheeng" of a cash register to his right, George's eyes snapped right also; when Angelo heard automatic gunfire to his left, his eyes snapped right to discern the fastest escape route. Some of these conjunctive sensory coordinations may well be contextually related. The interesting question is at which level of psychoneurological processing "automatic" auditory-visual correlation is taken over by fundamental cognitive-emotional processing; i.e. from thalamic-amygdalic to frontal cortex involvement.
not rated yet Mar 24, 2011

So the error in this study is that it forces the test subjects to choose ONLY one or the other, instead of a mix of both. Since their brain is giving them an angle instead of a definite answer, they end up choosing a mix of both answers.

It is not an error. The goal was to get the brain to automatically associate visual perception with sound. If you use the sub-optimal strategy with its 30% failure rate, then you could mistakenly associate one sound and one visual as two objects (which is an error).

One example: you get goosebumps whenever you hear someone whisper, but the sound is coming from far away. You panic and look around erratically until you find that person (by looking at him). It is a trick you can play with a special speaker... so now imagine someone who is constantly in that kind of confusion.
5 / 5 (1) Mar 24, 2011
"Perhaps the way we think about brain function should be revised."

Yes. Authors please note.

The findings, published in the May 5 issue of The Journal of Neuroscience, show for the first time that a sex hormone can directly affect auditory function, and point toward the possibility that estrogen controls other types of sensory processing as well. Understanding how estrogen changes the brain's response to sound, say the authors, might open the door to new ways of treating hearing deficiencies.

"We've discovered estrogen doing something totally unexpected," says Raphael Pinaud, assistant professor of brain and cognitive sciences at the University of Rochester and lead author of the study. "We show that estrogen plays a central role in how the brain extracts and interprets auditory information. It does this on a scale of milliseconds in neurons, as opposed to days, months or even years in which estrogen is more commonly known to affect an organism."
