Practice builds brain connections for babies learning how to speak

July 11, 2006
A six-month-old baby listens to sounds while a magnetoencephalograph measures the baby's brain activity.

Experience, as the old saying goes, is the best teacher. And experience seems to play an important early role in how infants learn to understand and produce language.

Using new technology that measures the magnetic field generated by the activation of neurons in the brain, researchers tracked what appears to be a link between the listening and speaking areas of the brain in newborns, 6-month-olds and one-year-olds -- ages before infants can speak.

The study, which appears in this month's issue of the journal NeuroReport, shows that Broca's area, located in the front of the left hemisphere of the brain, is gradually activated during an infant's first year of life, according to Toshiaki Imada, lead author of the paper and a research professor at the University of Washington's Institute for Learning and Brain Sciences.

Broca's area has long been identified as the seat of speech production and, more recently, of social cognition, and it is critical to language and reading, according to Patricia Kuhl, co-author of the study and co-director of the UW's Institute for Learning and Brain Sciences.

"Magnetoencephalography is perfectly non-invasive and measures the magnetic field generated by neurons in the brain responding to sensory information that then 'leaks' through the skull," said Imada, one of the world's experts in the uses of magnetoencephalography to study the brain.

Kuhl said there is a long history of a link in the adult brain between the areas responsible for understanding and those responsible for speaking language. The link allows children to mimic the speech patterns they hear when they are very young. That's why people from Brooklyn speak "Brooklynese," she said.

"We think the connection between perception and production of speech gets formed by experience, and we are trying to determine when and how babies do it," said Kuhl, who also is a professor of speech and hearing sciences.

The study involved 43 infants in Finland -- 18 newborns, 17 six-month-olds and eight one-year-olds. Special hardware and software developed for the study allowed the infants' brain activity to be monitored even if they moved, and captured brain activation with millisecond precision.

The babies were exposed to three kinds of sounds through earphones: pure tones that do not resemble speech, like notes played on a piano; a three-tone harmonic chord that resembles speech; and two Finnish syllables, "pa" and "ta." Among the newborns, the researchers collected magnetic data only from the left hemisphere of the brain, because newborns cannot sit up and the magnetoencephalography cap was too big to fit their heads securely.

At all three ages the infants showed activation in the auditory areas in the temporal part of the brain that are responsible for listening to and understanding speech, showing they were able to detect sound changes for all three stimuli. But the pure perception of sound did not activate the areas of the brain responsible for speaking. However, researchers began seeing some activation in Broca's area when the 6-month-old infants heard the syllables or harmonic chords. By the time the infants were one year old, the speech stimuli activated Broca's area simultaneously with the auditory areas, indicating "cross-talk" between the area of the brain that hears language and the area that produces it, according to Kuhl.

"We think that early in development babies need to play with sounds, just as they play with their hands. And that helps them map relationships between sounds and the movements of their mouth and tongue," she said. "To master a skill, babies have to play and practice, just as they later will in learning how to throw a baseball or ride a bike. Babies form brain connections by listening to themselves and linking what they hear to what they did to cause the sounds. Eventually they will use this skill to mimic speakers in their environments."

This playing with language starts, Kuhl said, when babies begin cooing around 12 weeks of age and begin babbling around seven months of age.

"They are cooing and babbling before they know how to link their mouth and tongue movements. This brain connection between perception and production requires experience," she said.

Co-authors of the study were Yang Zhang of the University of Minnesota, Marie Cheour of the University of Miami and Helsinki University Central Hospital, and Samu Taulu and Antti Ahonen of Elekta Neuromag Oy in Finland. The research was supported by the National Institutes of Health, the National Science Foundation, the Talaris Research Institute and the Apex Foundation, the family foundation of Bruce and Jolene McCaw.

Source: University of Washington
