Scientists unlock secret of how the brain encodes speech

September 26, 2018, Northwestern University
Credit: CC0 Public Domain

People like the late Stephen Hawking can think about what they want to say, but are unable to speak because their muscles are paralyzed. In order to communicate, they can use devices that sense a person's eye or cheek movements to spell out words one letter at a time. However, this process is slow and unnatural.

Scientists want to help these completely paralyzed, or "locked-in," individuals communicate more intuitively by developing a brain machine interface to decode the commands the brain is sending to the tongue, palate, lips and larynx (articulators).

The person would simply try to say words, and the brain machine interface (BMI) would translate them into speech.

New research from Northwestern Medicine and Weinberg College of Arts and Sciences has moved science closer to creating speech-brain machine interfaces by unlocking new information about how the brain encodes speech.

Scientists have discovered the brain controls speech production in a similar manner to how it controls the production of arm and hand movements. To do this, researchers recorded signals from two parts of the brain and decoded what these signals represented. Scientists found the brain represents both the goals of what we are trying to say (speech sounds like "pa" and "ba") and the individual movements that we use to achieve those goals (how we move our lips, palate, tongue and larynx). The different representations occur in two different parts of the brain.

"This can help us build better speech decoders for BMIs, which will move us closer to our goal of helping people that are locked-in speak again," said lead author Dr. Marc Slutzky, associate professor of neurology and of physiology at Northwestern University Feinberg School of Medicine and a Northwestern Medicine neurologist.

The study will be published Sept. 26 in the Journal of Neuroscience.

The discovery could also potentially help people with other speech disorders, such as apraxia of speech, which is seen in children as well as after stroke in adults. In speech apraxia, an individual has difficulty translating speech messages from the brain into spoken language.

How words are translated from your brain into speech

Speech is composed of individual sounds, called phonemes, that are produced by coordinated movements of the lips, tongue, palate and larynx. However, scientists didn't know exactly how these movements, called articulatory gestures, are planned by the brain. In particular, it was not fully understood how the cerebral cortex controls speech production, and no direct evidence of how gestures are represented in the brain had been shown.
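
To make the phoneme/gesture distinction concrete, here is a minimal illustrative sketch in Python. The gesture labels are simplified stand-ins invented for this example, not the study's actual gesture inventory.

```python
# Toy illustration (not from the study): a phoneme can be described as a
# bundle of coordinated articulatory gestures of the lips, tongue, palate
# and larynx. These gesture names are simplified stand-ins.
PHONEME_GESTURES = {
    "pa": ["bilabial closure", "lip release", "jaw opening"],             # voiceless
    "ba": ["bilabial closure", "lip release", "jaw opening", "voicing"],  # voiced
}

def gestures_for(phoneme_sequence):
    """Flatten a sequence of phonemes into the gestures that realize it."""
    return [g for p in phoneme_sequence for g in PHONEME_GESTURES[p]]

print(gestures_for(["ba", "pa"]))
```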

"We hypothesized speech motor areas of the brain would have a similar organization to arm motor areas of the brain," Slutzky said. "The precentral cortex would represent movements (gestures) of the lips, tongue, palate and larynx, and the higher level cortical areas would represent the phonemes to a greater extent."

That's exactly what they found.

"We studied two parts of the brain that help to produce speech," Slutzky said. "The precentral cortex represented gestures to a greater extent than phonemes. The inferior frontal cortex, which is a higher level speech area, represented both phonemes and gestures."

Chatting up patients in brain surgery to decode their brain signals

Northwestern scientists recorded brain signals from the cortical surface using electrodes placed on the brains of patients undergoing surgery to remove brain tumors. The patients had to be awake during their surgery, so researchers asked them to read words from a screen.

After the surgery, scientists marked the times when the patients produced phonemes and gestures. Then they used the recorded brain signals from each cortical area to decode which phonemes and gestures had been produced, and measured the decoding accuracy. The signals in the precentral cortex were more accurate at decoding gestures than phonemes, while those in the inferior frontal cortex were equally good at decoding both phonemes and gestures. This information helped support linguistic models of speech production. It will also help guide engineers in designing brain machine interfaces to decode speech from these brain areas.
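
The decoding analysis described above can be sketched roughly as follows. This is a hedged illustration using synthetic data and an off-the-shelf linear classifier; the study's actual recordings, features and decoder are not specified here and may differ.

```python
# Sketch of the decoding comparison: train a classifier per cortical area
# and compare how well it recovers gesture labels vs. phoneme labels.
# Data here are random stand-ins; in the study, features would be cortical
# surface recordings time-locked to each marked phoneme or gesture event.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def decode_accuracy(features, labels):
    """Cross-validated accuracy of a simple linear decoder
    (one plausible choice; the study's decoder may differ)."""
    clf = LogisticRegression(max_iter=1000)
    return cross_val_score(clf, features, labels, cv=5).mean()

n_trials, n_channels = 200, 32  # hypothetical trial and channel counts
for area in ["precentral", "inferior_frontal"]:
    X = rng.normal(size=(n_trials, n_channels))          # synthetic signals
    gesture_labels = rng.integers(0, 4, size=n_trials)   # e.g. 4 gesture classes
    phoneme_labels = rng.integers(0, 4, size=n_trials)   # e.g. 4 phoneme classes
    print(area,
          "gesture accuracy:", decode_accuracy(X, gesture_labels),
          "phoneme accuracy:", decode_accuracy(X, phoneme_labels))
```

With real recordings, the study's finding would show up as higher gesture accuracy in the precentral area and comparable gesture and phoneme accuracy in the inferior frontal area.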

The next step for the research is to develop an algorithm for brain machine interfaces that would not only decode gestures but also combine those decoded gestures to form words.
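
One hypothetical way such an algorithm could assemble decoded gestures into words is sketched below. The gesture-to-word lexicon and the greedy matching scheme are invented for illustration; a real system would likely use a probabilistic sequence model rather than exact matching.

```python
# Hypothetical sketch: turn a stream of decoded gestures into words by
# matching known gesture sequences. Lexicon entries are invented examples.
DECODED_LEXICON = {
    ("bilabial closure", "lip release", "voicing"): "ba",
    ("bilabial closure", "lip release"): "pa",
}

def gestures_to_words(gesture_stream):
    """Greedily match the longest known gesture sequence at each position."""
    words, i = [], 0
    while i < len(gesture_stream):
        for length in range(len(gesture_stream) - i, 0, -1):
            chunk = tuple(gesture_stream[i:i + length])
            if chunk in DECODED_LEXICON:
                words.append(DECODED_LEXICON[chunk])
                i += length
                break
        else:
            i += 1  # skip an unrecognized gesture and keep going
    return words

print(gestures_to_words(["bilabial closure", "lip release", "voicing",
                         "bilabial closure", "lip release"]))  # -> ['ba', 'pa']
```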

The paper is titled "Differential Representation of Articulatory Gestures and Phonemes in Precentral and Inferior Frontal Gyri."

