What's coming next? Scientists identify how the brain predicts speech

April 25, 2017, Public Library of Science
The artificial grammar used in this study and the phase-amplitude coupling in human auditory cortex. Credit: Dr Y. Kikuchi et al. doi:10.1371/journal.pbio.2000219

An international collaboration of neuroscientists has shed light on how the brain helps us to predict what is coming next in speech.

In the study, publishing on April 25 in the open access journal PLOS Biology, scientists from Newcastle University, UK, and a neurosurgery group at the University of Iowa, USA, report that they have discovered mechanisms in the brain's auditory cortex involved in processing and predicting upcoming words, mechanisms that appear essentially unchanged throughout evolution. Their research reveals how neural oscillations coordinate with neural populations to anticipate events, a process that is impaired in many neurological and psychiatric disorders such as dyslexia, schizophrenia and Attention Deficit Hyperactivity Disorder (ADHD).

Using an approach first developed for studying infant language learning, the team of neuroscientists led by Dr Yuki Kikuchi and Prof Chris Petkov of Newcastle University had humans and monkeys listen to sequences of spoken words from a made-up language. Both species were able to learn the predictive relationships between the spoken sounds in the sequences.
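For readers curious what such a made-up language looks like, the sketch below is a toy illustration only: a miniature grammar of invented nonsense words (not the ones used in the study) in which each word constrains which words may follow it, so a listener hearing the sequences can learn to predict what comes next.

```python
# Illustrative sketch only: a toy "artificial grammar" that strings together
# made-up words according to fixed transitional rules, loosely modelling the
# kind of structured nonsense-word sequences described in the article.
# The words and rules here are invented for illustration, not the study's.
import random

# Each made-up word lists which words are allowed to follow it.
grammar = {
    "START": ["tep", "rud"],
    "tep":   ["vot", "kib"],
    "rud":   ["kib"],
    "vot":   ["jux", "END"],
    "kib":   ["jux"],
    "jux":   ["END"],
}

def generate_sequence(rng=random):
    """Walk the grammar from START to END, collecting the words 'heard'."""
    word, sequence = "START", []
    while True:
        word = rng.choice(grammar[word])
        if word == "END":
            return sequence
        sequence.append(word)

if __name__ == "__main__":
    for _ in range(3):
        print(" ".join(generate_sequence()))
```

After exposure to enough sequences like these, a learner (human, monkey, or model) can exploit the transitional rules, for example expecting "jux" after hearing "kib", which is the sense in which the relationships are "predictive."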

Neural responses from the auditory cortex in the two species revealed how populations of neurons responded to the speech sounds and to the learned predictive relationships between the sounds. The neural responses were found to be remarkably similar in both species, suggesting that the way the human auditory cortex responds to speech harnesses evolutionarily conserved mechanisms, rather than mechanisms that have become uniquely specialized in humans for speech or language.
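The figure caption and the study's title refer to oscillatory (phase-amplitude) coupling in auditory cortex. As a rough, hypothetical illustration of what such a measure looks like, the Python sketch below computes a standard mean-vector-length coupling index on a synthetic signal whose fast activity is locked to the phase of a slow rhythm; it is a generic textbook measure and assumes NumPy/SciPy, not the authors' actual analysis pipeline.

```python
# Illustrative sketch only: a common phase-amplitude coupling measure
# (mean vector length), applied to a synthetic signal in which
# high-frequency amplitude rides on the phase of a slow oscillation.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase band-pass filter between lo and hi Hz."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def pac_mean_vector_length(x, fs, phase_band=(4, 8), amp_band=(70, 110)):
    """Coupling strength between low-frequency phase and high-frequency amplitude."""
    lo_p, hi_p = phase_band
    lo_a, hi_a = amp_band
    phase = np.angle(hilbert(bandpass(x, lo_p, hi_p, fs)))
    amp = np.abs(hilbert(bandpass(x, lo_a, hi_a, fs)))
    return np.abs(np.mean(amp * np.exp(1j * phase))) / np.mean(amp)

if __name__ == "__main__":
    fs = 1000
    t = np.arange(0, 10, 1 / fs)
    slow = np.sin(2 * np.pi * 6 * t)                # 6 Hz "theta" rhythm
    fast = (1 + slow) * np.sin(2 * np.pi * 90 * t)  # 90 Hz amplitude locked to theta phase
    signal = slow + 0.5 * fast + 0.1 * np.random.randn(t.size)
    print(f"coupling strength: {pac_mean_vector_length(signal, fs):.3f}")
```

A value near zero would indicate no systematic relationship between slow phase and fast amplitude, while larger values indicate stronger coupling of the kind the caption alludes to.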

"Being able to predict events is vital for so much of what we do every day," Professor Petkov notes. "Now that we know humans and monkeys share the ability to predict speech we can apply this knowledge to take forward research to improve our understanding of the human brain."

Dr Kikuchi elaborates, "In effect we have discovered the mechanisms for speech in your brain that work like predictive text on your mobile phone, anticipating what you are going to hear next. This could help us better understand what is happening when the brain fails to make fundamental predictions, such as in people with dementia or after a stroke."

Building on these results, the team are working on projects to harness insights on predictive signals in the brain to develop new models to study how these signals go wrong in patients with stroke or dementia. The long-term goal is to identify strategies that yield more accurate prognoses and treatments for these patients.

More information: Kikuchi Y, Attaheri A, Wilson B, Rhone AE, Nourski KV, Gander PE, et al. (2017) Sequence learning modulates neural responses and oscillatory coupling in human and monkey auditory cortex. PLoS Biol 15(4): e2000219. DOI: 10.1371/journal.pbio.2000219
