Words, gestures are translated by same brain regions, says new research

November 9, 2009

Your ability to make sense of Groucho's words and Harpo's pantomimes in an old Marx Brothers movie takes place in the same regions of your brain, says new research funded by the National Institute on Deafness and Other Communication Disorders (NIDCD), one of the National Institutes of Health.

In a study published in this week's Early Edition of the Proceedings of the National Academy of Sciences (PNAS), researchers have shown that the brain regions long recognized as centers in which spoken or written words are decoded are also important in interpreting wordless gestures. The findings suggest that these brain regions may play a much broader role in the interpretation of symbols than researchers have thought and, for this reason, could be the evolutionary starting point from which language originated.

"In babies, the ability to communicate through gestures precedes spoken language, and you can predict a child's language skills based on the repertoire of his or her gestures during those early months," said James F. Battey, Jr., M.D., Ph.D., director of the NIDCD. "These findings not only provide compelling evidence regarding where language may have come from, they help explain the interplay that exists between language and gesture as children develop their language skills."

Scientists have known that sign language is largely processed in the same regions of the brain as spoken language. These regions include the inferior frontal gyrus, or Broca's area, in the front left side of the brain, and the posterior temporal region, commonly referred to as Wernicke's area, toward the back left side of the brain. It isn't surprising that signed and spoken language activate the same brain regions, because sign language operates in the same way as spoken language does—with its own vocabulary and rules of grammar.

In this study, NIDCD researchers, in collaboration with scientists from Hofstra University School of Medicine, Hempstead, N.Y., and San Diego State University, wanted to find out if non-language-related gestures—the hand and body movements we use that convey meaning on their own, without having to be translated into specific words or phrases—are processed in the same regions of the brain as language is. Two types of gestures were considered for the study: pantomimes, which mimic objects or actions, such as unscrewing a jar or juggling balls, and emblems, which are commonly used in social interactions and which signify abstract, usually more emotionally charged concepts than pantomimes. Examples include a hand sweeping across the forehead to indicate "it's hot in here!" or a finger to the lips to signify "be quiet."

While inside a functional MRI machine, 20 healthy, English-speaking volunteers—nine males and 11 females—watched video clips of a person either acting out one of the two gesture types or voicing the phrases that the gestures represent. As controls, volunteers also watched clips of the person using meaningless gestures or speaking pseudowords that had been chopped up and randomly reorganized so the brain would not interpret them as language. Volunteers watched 60 video clips for each of the six stimulus types, with the clips presented in 45-second blocks at a rate of 15 clips per block. A mirror attached to the head coil enabled each volunteer to watch the video projected on the scanner room wall. The scientists then measured brain activity for each stimulus type and looked for similarities and differences, as well as any communication occurring between individual parts of the brain.
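For readers keeping track of the numbers, the design works out as sketched below. This is only a back-of-the-envelope check derived from the counts quoted above; the article does not publish the actual presentation script, and the condition names in the comments are inferred from the description of the stimuli.

```python
# Back-of-the-envelope check of the stimulus schedule described above.
# Figures are derived only from the counts in the article.

STIMULUS_TYPES = 6      # pantomimes, emblems, the spoken phrase for each,
                        # plus meaningless gestures and pseudowords (inferred)
CLIPS_PER_TYPE = 60
BLOCK_SECONDS = 45
CLIPS_PER_BLOCK = 15

total_clips = STIMULUS_TYPES * CLIPS_PER_TYPE        # 360 clips overall
blocks_per_type = CLIPS_PER_TYPE // CLIPS_PER_BLOCK  # 4 blocks per condition
seconds_per_clip = BLOCK_SECONDS / CLIPS_PER_BLOCK   # 3 s per clip

print(f"{total_clips} clips total, {blocks_per_type} blocks per condition, "
      f"{seconds_per_clip:.0f} s per clip")
```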

The researchers found that for the gesture and spoken language stimuli, the brain was highly activated in the inferior frontal and posterior temporal areas, the long-recognized language regions of the brain.

"If gesture and language were not processed by the same system, you'd have spoken language activating the inferior frontal and posterior temporal areas, and gestures activating other parts of the brain," said Allen Braun, M.D., senior author on the paper, "But in fact we found virtual overlap."

Current thinking in the study of language is that, like a smart search engine that pops up the most suitable Web site at the top of its search results, the posterior temporal region serves as a storehouse of words from which the inferior frontal gyrus selects the most appropriate match. The researchers suggest that, rather than being limited to deciphering words alone, these regions may be able to apply meaning to any incoming symbols, be they words, gestures, images, sounds, or objects. According to Dr. Braun, these regions may also offer a clue to how language evolved.
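As a loose illustration of that "storehouse plus selection" idea, the toy sketch below queries a single store of meanings with a symbol from any modality and selects the matching entry. This is not the authors' model; the entries, modality names, and matching rule are all invented for the example.

```python
# Toy illustration of a modality-independent "storehouse" of meanings.
# All entries and modality labels are invented for this example.
LEXICON = {
    "be quiet": {"word": "shhh", "gesture": "finger_to_lips"},
    "it's hot in here": {"word": "hot", "gesture": "hand_across_forehead"},
}

def interpret(symbol: str, modality: str) -> str:
    """Return the stored meaning matching the incoming symbol,
    whether it arrived as a word or as a gesture."""
    for meaning, forms in LEXICON.items():
        if forms.get(modality) == symbol:
            return meaning
    return "unrecognized"

print(interpret("finger_to_lips", "gesture"))  # -> be quiet
print(interpret("hot", "word"))                # -> it's hot in here
```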

"Our results fit a longstanding theory which says that the common ancestor of humans and apes communicated through meaningful gestures and, over time, the brain regions that processed gestures became adapted for using words," he said. "If the theory is correct, our language areas may actually be the remnant of this ancient communication system, one that continues to process as well as language in the human brain."

Dr. Braun adds that developing a better understanding of the systems that support gestures and words may help in the treatment of some patients with aphasia, a disorder that hinders a person's ability to produce or understand language.

Source: NIH/National Institute on Deafness and Other Communication Disorders
