Study shows humans process visual information near our hands differently

May 1, 2014 by Bob Yirka, Medical Xpress report
The front and back of a human right hand. Credit: Wikipedia.

(Medical Xpress)—A trio of researchers, one from The Australian National University and two from the University of Toronto in Canada, has published a paper in the journal Psychonomic Bulletin & Review presenting evidence that humans process visual information near their hands differently than they process visual information elsewhere. In their paper, Stephanie Goodhew, Nicole Fogel and Jay Pratt describe a lab study they conducted with volunteer participants that reinforces the notion that visual space near our hands is special.

Anecdotal evidence has suggested that we humans see things near our hands differently than we see everything else—our difficulty threading a needle is but one example. Proving this phenomenon has been difficult, but the researchers in this new effort might just have done it.

Scientists have various theories that might explain such differences in visual perception. One suggests that our brains are hard-wired to think differently when we are paying close attention. Another suggests that we have different kinds of brain cells that respond differently in different scenarios: one kind, called P (parvocellular) cells, is better at processing fine spatial detail, the thinking goes, while another kind, known as M (magnocellular) cells, is less sensitive to fine detail but reacts much faster to changes in what is seen. It is this second theory that the researchers in this latest effort have tried to support.

To test the M-cell account, the researchers asked volunteers to sit at a computer monitor and watch as images were displayed. On the screen were two objects, each on a single-color background. As the volunteers watched, the shapes moved behind a rectangle; when they reemerged, some had changed, some in background color, others in shape, and others in both. When the researchers asked whether the shapes had changed, they found that the volunteers responded more slowly if the background color had changed, but, crucially, only when the volunteers' hands were not near the computer screen. When the volunteers were asked to move their hands within view of the screen, the slowdown went away.
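The logic of the analysis can be illustrated with a small simulation. This is a hypothetical sketch, not the authors' actual data or code: it assumes made-up baseline reaction times and a made-up color-change penalty that, per the M-cell account described above, applies only when the hands are far from the display.

```python
import random
import statistics

random.seed(42)

def simulate_rts(hands_near: bool, color_changed: bool, n: int = 200):
    """Simulate reaction times (ms) for one condition.

    Hypothetical model: a baseline RT plus a slowdown when the
    object's background color changes, but only in the hands-far
    condition, where color-sensitive P-cell input is assumed to
    dominate. All numbers here are illustrative, not measured.
    """
    base = 450.0
    penalty = 40.0 if (color_changed and not hands_near) else 0.0
    return [random.gauss(base + penalty, 30.0) for _ in range(n)]

# Mean RT in each of the four conditions
far_same  = statistics.mean(simulate_rts(hands_near=False, color_changed=False))
far_diff  = statistics.mean(simulate_rts(hands_near=False, color_changed=True))
near_same = statistics.mean(simulate_rts(hands_near=True,  color_changed=False))
near_diff = statistics.mean(simulate_rts(hands_near=True,  color_changed=True))

# The key comparison: the cost of a color change in each hand position
print(f"hands far:  color-change cost = {far_diff - far_same:+.1f} ms")
print(f"hands near: color-change cost = {near_diff - near_same:+.1f} ms")
```

Under these assumptions, the simulated color-change cost is substantial in the hands-far condition and near zero in the hands-near condition, which is the qualitative pattern the study reports.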

The researchers say this simple experiment shows that the theorized M cells are what the brain relies on for processing visual information when the hands are involved, not simply when we are looking at something close up or concentrating hard. The effect appears only when the brain registers that the hands are part of the scene being processed.


More information: The nature of altered vision near the hands: Evidence for the magnocellular enhancement account from object correspondence through occlusion, Psychonomic Bulletin & Review, March 2014. link.springer.com/article/10.3 … %2Fs13423-014-0622-5

Abstract
A growing body of evidence indicates that the perception of visual stimuli is altered when they occur near the observer's hands, relative to other locations in space (see Brockmole, Davoli, Abrams, & Witt, 2013, for a review). Several accounts have been offered to explain the pattern of performance across different tasks. These have typically focused on attentional explanations (attentional prioritization and detailed attentional evaluation of stimuli in near-hand space), but more recently, it has been suggested that near-hand space enjoys enhanced magnocellular (M) input. Here we differentiate between the attentional and M-cell accounts, via a task that probes the roles of position consistency and color consistency in determining dynamic object correspondence through occlusion. We found that placing the hands near the visual display made observers use only position consistency, and not color, in determining object correspondence through occlusion, which is consistent with the fact that M cells are relatively insensitive to color. In contrast, placing observers' hands far from the stimuli allowed both color and position to contribute. This provides evidence in favor of the M-cell enhancement account of altered vision near the hands.
