What sign language teaches us about the brain

July 25, 2014 by Sana Suri
The power is yours. Credit: wycliffesa, CC BY-ND

The world's leading humanoid robot, ASIMO, has recently learnt sign language. The news of this breakthrough came just as I completed Level 1 of British Sign Language (I dare say it took me longer to master signing than it did the robot!). As a neuroscientist, the experience of learning to sign made me think about how the brain perceives this means of communicating.

For instance, during my training, I found that mnemonics greatly simplified my learning process. To sign the colour blue you use the fingers of your right hand to rub the back of your left hand, my simple mnemonic for this sign being that the veins on the back of our hand appear blue. I was therefore forming an association between the word blue (English), the sign for blue (BSL), and the visual aid that links the two. However, the two languages differ markedly in that one relies on sounds and the other on visual signs.

Do our brains process these languages differently? It seems that for the most part, they don't. And it turns out that studies of sign language users have helped bust a few myths.

As neuroscience took off, it became fashionable to identify specific regions of the brain thought to be responsible for particular skills. However, we now know that this oversimplification paints only half the picture. Nowhere is this clearer than in how the human brain perceives language, whether spoken or signed.

The evidence for this comes from two kinds of studies: lesion analyses, which examine the functional consequences of damage to brain regions involved in language, and neuroimaging, which explores how these regions are engaged in processing language.

Lesions teach new lessons

Early theories of language processing pointed to two regions in the left hemisphere of the brain that were thought to be chiefly responsible for producing and understanding spoken language – Broca's area and Wernicke's area.

Damage to Broca's area, which is located near the part of the motor cortex that controls the mouth and lips, usually gives rise to difficulties in the production of speech. But this doesn't destroy one's ability to communicate or to understand conversation. So a hearing person with a lesion in Broca's area – which can form, say, after a stroke – may not be able to form fluid sentences, but he or she could use single words and short phrases, and possibly nod or shake their head to gesture their responses.

The comprehension of speech, on the other hand, is largely believed to be processed within Wernicke's area, which is located near the auditory cortex – the part of the brain that receives signals from the ears. Hearing people with damage to Wernicke's area are usually fluent in producing speech, but may make up words (for example, "cataloop" for "caterpillar") and speak in long sentences that have no meaning.

If Broca's area were involved solely in the production of speech, and Wernicke's area solely in understanding speech sounds, then we might expect visual languages like sign language to remain unaffected when these areas are damaged. Surprisingly, they do not.

One of the seminal studies in this field was by award-winning husband and wife team Edward Klima and Ursula Bellugi at the Salk Institute. They found that deaf signers who had lesions in left hemisphere "speech centres" like Broca's and Wernicke's areas produced significantly more sign errors on naming, repetition and sentence-comprehension tasks than signers with damaged right hemispheres.

The right hemisphere of the brain is more involved in visual and spatial functions than the left, and this is not to say that the right hemisphere plays no role in producing and comprehending sign language. Nevertheless, these findings show that despite the difference in modality, signed and spoken languages are similarly affected by damage to the left hemisphere of the brain.

Images speak out too

Functional neuroimaging, which can show images of active regions in the brain, has corroborated the lesion studies. Despite the fundamental differences in input and output modes for signed and spoken languages, there are common patterns of brain activation when deaf and hearing people process language.


For instance, Broca's area is also activated when producing signs and Wernicke's area is activated during the perception of sign language.

Most importantly, these lesion and neuroimaging studies helped clarify two facts. First, language is not limited to hearing and speech: sign languages are complex linguistic systems processed much like spoken languages. Second, they cemented our growing reservations about oversimplified theories of language perception. Because Broca's and Wernicke's areas are involved in processing sign language, we can no longer think of them exclusively as centres for producing speech and hearing sound, but rather as higher-order language areas of the brain.

Contrary to the common misconception, there is no universal sign language. According to a recent estimate, there are 138 variations of sign language in the world today, with structured syntax, grammar, and even regional accents. It is unfortunate then that a significant proportion of the global deaf community is still battling for legal recognition of these languages.

Credit: Macsweeney et al / Brain

Sign language is sometimes misguidedly looked upon as a "disability" language and simply a visual means of communicating spoken language, when in fact its linguistic construction is almost entirely independent of spoken language. For instance, American and British Sign Language are mutually incomprehensible, even though the hearing people of Britain and America predominantly share the same spoken language.

Knowledge of how sign languages are processed in the brain has not only furthered our understanding of the brain itself, but has also played a part in quashing the once widely believed notion that these signs were simply a loose collection of gestures strung together to communicate spoken language.
