The innate ability to learn language

March 26, 2012 By Angela Herring
Psychology professor Iris Berent is using behavioral and neuroimaging techniques to investigate whether our ability to learn language is innate. Credit: Mary Knox Merrill

All human languages contain two levels of structure, said Iris Berent, a psychology professor in Northeastern’s College of Science. One is syntax, or the ordering of words in a sentence. The other is phonology, or the sound structure of individual words.

Berent — whose research focuses on the phonological structure of words — examines the nature of linguistic competence, its origins and its interaction with reading. While previous studies have all centered on adults, she is now working with infants to address two core questions.

“First,” she said, “do infants have the capacity to encode phonological rules? And, second, are some phonological rules innate?”

To address the first issue, Berent collaborated with neuroscientists Janet Werker, of the University of British Columbia, and Judit Gervain, of the Paris-based Centre National de la Recherche Scientifique.

By utilizing an optical brain imaging technique called near-infrared spectroscopy, or NIRS, the researchers found that newborns have the capacity to learn linguistic rules. This finding — published this month in the Journal of Cognitive Neuroscience — suggests that the neural foundations of language acquisition are present at birth.

Armed with this knowledge, Berent has begun conducting behavioral studies on more than two dozen infants to explore whether linguistic rules are innate or entirely learned.

“We want to see whether infants prefer certain sound patterns to others even if neither occurs in their language,” Berent explained. “For instance, we know that adults prefer sequences such as bnog over bdog. Would six-month-old infants show this preference even if their language (English) does not include either sequence?”

For the study, each child is placed in front of a video screen that displays an image pulsing in time with sounds such as “bnog” and “bdog.” Berent hypothesized that infants would look longer at the screen when they hear sounds toward which they are innately biased.

Preliminary results have upheld the hypothesis, but Berent is still accepting new subjects for the study. Her entire research program forms part of a new book called “The Phonological Mind,” which will be published by Cambridge University Press this year.

More information: A symposium on the nature, origins and use of language will take place on March 30 at 12:30 p.m. in the Curry Student Center Ballroom.
