What you hear could depend on what your hands are doing

October 14, 2012, Georgetown University Medical Center

New research links motor skills and auditory perception, and in doing so offers a new understanding of what the left and right brain hemispheres "hear." Georgetown University Medical Center researchers say these findings may eventually point to strategies to help stroke patients recover their language abilities, and to improve speech recognition in children with dyslexia.

The study, presented at Neuroscience 2012, the annual meeting of the Society for Neuroscience, is the first to match behavior, in this case which hand a subject uses to respond, with left brain/right brain auditory processing. Before this research, only neuroimaging tests had hinted at differences in such processing.

"Language is processed mainly in the left hemisphere, and some have suggested that this is because the left hemisphere specializes in analyzing very rapidly changing sounds," says the study's senior investigator, Peter E. Turkeltaub, M.D., Ph.D., a in the Center for and Recovery. This newly created center is a joint program of Georgetown University and MedStar National Rehabilitation Network.

Turkeltaub and his team hid rapidly and slowly changing sounds in background noise and asked 24 volunteers to simply indicate whether they heard the sounds by pressing a button.

"We asked the subjects to respond to sounds hidden in background noise," Turkeltaub explained. "Each subject was told to use their right hand to respond during the first 20 sounds, then their left hand for the next 20 second, then right, then left, and so on." He says when a subject was using their right hand, they heard the rapidly changing sounds more often than when they used their left hand, and vice versa for the slowly changing sounds.

"Since the left hemisphere controls the right hand and vice versa, these results demonstrate that the two hemispheres specialize in different kinds of sounds—the left hemisphere likes rapidly changing sounds, such as , and the likes slowly changing sounds, such as syllables or intonation," Turkeltaub explains. "These results also demonstrate the interaction between motor systems and perception. It's really pretty amazing. Imagine you're waving an American flag while listening to one of the presidential candidates. The speech will actually sound slightly different to you depending on whether the flag is in your left hand or your right hand."

Ultimately, Turkeltaub hopes that understanding the basic organization of auditory systems and how they interact with motor systems will help explain why language resides in the left hemisphere of the brain, and will lead to new treatments for language disorders, like aphasia (language difficulties after stroke or brain injury) or dyslexia.

"If we can understand the basic brain organization for audition, this might ultimately lead to new treatments for people who have speech recognition problems due to stroke or other brain injury. Understanding better the specific roles of the two hemispheres in auditory processing will be a big step in that direction. If we find that people with aphasia, who typically have injuries to the , have difficulty recognizing speech because of problems with low-level auditory perception of rapidly changing sounds, maybe training the specific auditory processing deficits will improve their ability to recognize speech," Turkeltaub concludes.
