See no shape, touch no shape, hear a shape?

October 18, 2010

(PhysOrg.com) -- Scientists at The Montreal Neurological Institute and Hospital (The Neuro) at McGill University have discovered that our brains can determine the shape of an object simply by processing specially coded sounds, without any visual or tactile input. Not only does this new research tell us about the plasticity of the brain and how it perceives the world around us, it also opens important new possibilities for aiding people who are blind or visually impaired.

Shape is an inherent property of objects that exists in both vision and touch but not in sound. Researchers at The Neuro posed the question: can shape be represented by sound artificially? “The fact that a property of sound such as frequency can be used to convey shape information suggests that, as long as the spatial relation is coded in a systematic way, it can be preserved and made accessible, even if the medium via which space is coded is not spatial in its physical nature,” says Jung-Kyong Kim, a PhD student in Dr. Robert Zatorre’s lab at The Neuro and lead investigator of the study.
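For intuition, here is a minimal sketch in Python of the kind of systematic spatial-to-sound mapping the quote describes. It is not the encoding used in the study; the left-to-right column scan, the frequency range, and the function name shape_to_sound are all illustrative assumptions, loosely modeled on common sensory-substitution schemes in which vertical position maps to pitch and horizontal position maps to time.

import numpy as np

def shape_to_sound(image, duration=2.0, sample_rate=44100,
                   f_min=200.0, f_max=4000.0):
    # Illustrative sketch (not the study's method): scan a binary shape
    # image column by column, left to right; each row gets a fixed
    # frequency (higher rows -> higher pitch), and every filled pixel
    # in a column contributes a tone during that column's time slice.
    n_rows, n_cols = image.shape
    freqs = np.geomspace(f_min, f_max, n_rows)[::-1]  # top row = highest pitch
    samples_per_col = int(duration * sample_rate / n_cols)
    t = np.arange(samples_per_col) / sample_rate
    signal = []
    for col in range(n_cols):
        active = np.nonzero(image[:, col])[0]  # rows that are "on" in this column
        if active.size:
            chunk = np.sum([np.sin(2 * np.pi * freqs[r] * t) for r in active], axis=0)
            chunk /= active.size  # keep amplitude bounded
        else:
            chunk = np.zeros_like(t)
        signal.append(chunk)
    return np.concatenate(signal)

# Example: a crude right triangle as a 16x16 binary image
img = np.tril(np.ones((16, 16)))
audio = shape_to_sound(img)  # array of samples, ready to write to a WAV file

Because the mapping is systematic, two different shapes produce reliably different sound patterns, which is the property that makes such a code learnable in the first place.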

In other words, much like our ocean-dwelling dolphin cousins, which use echolocation to explore their surroundings, our brains can be trained to recognize shapes represented by sound, and the hope is that people with impaired vision could be trained to use this as a tool. In the study, blindfolded sighted participants were trained to recognize tactile spatial information using sounds mapped from abstract shapes. Following training, they were able to match auditory input to tactually discerned shapes, and this learning generalized to new sound-touch pairings.

“We live in a world where we perceive objects using information available from multiple sensory inputs,” says Dr. Zatorre, neuroscientist at The Neuro and co-director of the International Laboratory for Brain, Music and Sound Research. “On one hand, this organization leads to unique sense-specific percepts, such as colour in vision or pitch in hearing. On the other hand, our perceptual system can integrate information present across different senses and generate a unified representation of an object. We can perceive a multisensory object as a single entity because we can detect equivalent attributes or patterns across different senses.” Neuroimaging studies have identified brain areas that integrate information coming from different senses, combining input across the senses into a complete and comprehensive picture.

The results from The Neuro study strengthen the hypothesis that our perception of a coherent object or event ultimately occurs at an abstract level beyond the sensory input modes in which it is presented. This research provides important new insight into how our brains process the world as well as new possibilities for those with impaired senses.

The study was published in the journal Experimental Brain Research.
