University of Washington researchers found that by listening to various social interactions and exaggerated use of "parentese," infants learn to program the motor movements needed to speak their native language and pay less attention to nonnative sounds. To make the discovery, infants sat in a brain scanner that measures brain activation through a noninvasive technique called magnetoencephalography, as seen here. Credit: Patricia Kuhl, Institute for Learning and Brain Sciences, University of Washington

Infants take in the sounds of various languages indiscriminately until about 8 months of age, when their brains start to focus only on the predominant language they hear around them, according to researchers. But, they say, the causes for this transition are less clear.

In an article published in the Proceedings of the National Academy of Sciences, Patricia Kuhl, co-director of the University of Washington's Center for Learning in Formal and Informal Environments, and a team of researchers say they've found evidence for one reason the transition occurs: infants focus on their native language at the point when their brains are laying the groundwork for speaking it.

Kuhl's research, funded by the National Science Foundation's Directorate for Social, Behavioral and Economic Sciences, showed that hearing syllables at 7 and 11 months of age activates the systems in the brain needed to speak, even before infants are capable of saying complex words or sentences.

When exposed to "parentese"—the talking style adults employ with babies involving long, slow enunciation of words—infants learn to program the motor movements needed to speak their native language and pay less attention to nonnative sounds.

To study the infants' brain activity, Kuhl used a noninvasive scanning technique called magnetoencephalography. A total of 57 babies listened to a series of English and Spanish language syllables such as "da" and "ta" as researchers recorded brain responses.

The researchers found that 7-month-old babies responded to all speech sounds, in both their native English and nonnative Spanish, regardless of whether they had heard the sounds before. But by 11 to 12 months, the infants' brains showed more motor-control activity when they heard nonnative speech than when they heard native speech. The researchers interpreted this as showing that it takes more effort for the baby's brain to predict nonnative speech.

Infants at 11 months are more practiced at simulating the mouth movements needed for native speech, the researchers report, but producing foreign speech is a whole new skill set: native English-speaking infants didn't know how to produce a French vowel or a Spanish consonant.

"It means the baby brain is engaged in trying to talk back right from the start and suggests that 7-month-olds' brains are already trying to figure out how to make the right movements that will produce words," says Kuhl.

More information: "Infants' brain responses to speech suggest Analysis by Synthesis." PNAS 111 (31): 11238-11245 (2014); published ahead of print July 14, 2014. DOI: 10.1073/pnas.1410963111
