Why the world looks stable while we move

March 12, 2018, University of Tübingen
The experimental setup relies on computer-controlled air cushions to stabilise the test subject's head within the fMRI scanner. This allows scanning the moving head. LEDs are used as reference points to measure head movements and adapt the VR feed accordingly. Credit: Tübingen University

Head movements change the environmental image received by the eyes. People still perceive the world as stable, because the brain corrects for any changes in visual information caused by head movements. For the first time, two neuroscientists of the University of Tübingen's Werner Reichardt Centre for Integrative Neuroscience (CIN) have observed these correction processes in the brain with functional magnetic resonance imaging (fMRI). Their study, now published in NeuroImage, has far-reaching implications for the understanding of the effects of virtual realities on our brain.

Even while moving, we perceive the environment as stable, because the brain constantly balances input from different senses. Visual stimuli are compared with input from the sense of equilibrium, with the relative positions of head and body, and with the movements being performed. The result: when people walk or run around, their perception of the surrounding world does not roll or bounce. But when head movements and visual perception do not fit together, this balancing act in the brain falls apart.

Anybody who has ever delved into fantasy worlds with virtual reality glasses may have experienced this disconnect. VR glasses continually monitor head movements, and a computer adapts the device's visual output accordingly. Nevertheless, prolonged use of VR glasses often leads to motion sickness: even modern VR systems lack the precision necessary for visual information and head movements to chime perfectly.

Until recently, neuroscientists had not been able to identify the mechanisms that enable the brain to harmonise visual and motion perception. Modern noninvasive studies on human subjects, such as those using functional magnetic resonance imaging (fMRI), run into one particular problem: images can only be obtained of the resting head.

Tübingen researchers Andreas Schindler and Andreas Bartels have developed a sophisticated apparatus to circumvent this problem. They are now able to employ fMRI to observe what happens in the brain when we move our head while perceiving fitting – or non-fitting – visual and motion stimuli. In order to do so, subjects wearing VR glasses entered a specially modified fMRI scanner in which computer-controlled air cushions fix the subjects' heads in place immediately following each movement. During some head movements, the VR glasses displayed images congruent with the movements; in other cases, the glasses displayed images incongruent with them. Once the air cushions had stabilised the subjects' heads after the movements, the fMRI signal was recorded.
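The logic of the two experimental conditions can be sketched in a few lines of code. This is a conceptual illustration only, not the authors' software: the function name, the use of a single rotation angle, and the choice of same-direction motion for the incongruent case are all assumptions made here for clarity.

```python
def visual_update(head_rotation_deg: float, condition: str) -> float:
    """Return the rotation (in degrees) applied to the VR scene.

    In the congruent condition the scene counter-rotates by exactly
    the measured head rotation, so the world appears stable. In the
    incongruent condition the scene moves inconsistently with the
    head (here, in the same direction, chosen for illustration),
    producing the sensory conflict described in the study.
    """
    if condition == "congruent":
        return -head_rotation_deg  # world stays stable
    if condition == "incongruent":
        return +head_rotation_deg  # world moves with the head: conflict
    raise ValueError(f"unknown condition: {condition!r}")
```

For a 10-degree head turn, the congruent condition yields a -10-degree scene rotation (the world holds still), while the incongruent condition yields +10 degrees, a mismatch the brain cannot reconcile.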

Andreas Schindler says, "With fMRI, we cannot directly measure neuronal activity. fMRI just shows blood flow and oxygen saturation in the brain, with a delay of several seconds. That is often seen as a deficiency, but for our study, it was actually useful for once: we were able to record the very moment when the subject's brain was busy balancing its own head movement and the images displayed on the VR glasses. And we were able to do so seconds after the fact, when the subject's head was already resting quietly on its air cushion. Normally, head movements and brain imaging don't go together, but we hacked the system, so to speak."

The researchers could thus observe brain activity that had so far only been investigated in primates and, indirectly, in certain patients with brain lesions. One area in the posterior insular cortex showed heightened activity whenever the VR display and the head movements congruently simulated a stable environment. When the two signals conflicted, this heightened activity vanished. The same observation held true in a number of other brain regions responsible for the processing of motion.

The new method opens the door for a more focused study of the neuronal interactions between motion and visual perception. Moreover, the Tübingen researchers have shown for the first time what happens in the brain when we enter virtual worlds and balance on the knife's edge between immersion and motion sickness.


More information: Andreas Schindler et al. Integration of visual and non-visual self-motion cues during voluntary head movements in the human brain, NeuroImage (2018). DOI: 10.1016/j.neuroimage.2018.02.006

