Theory: Music underlies language acquisition

September 18, 2012 by B. J. Almond

(Medical Xpress)—Contrary to the prevailing theories that music and language are cognitively separate or that music is a byproduct of language, theorists at Rice University's Shepherd School of Music and the University of Maryland, College Park (UMCP) advocate that music underlies the ability to acquire language.

"Spoken language is a special type of ," said Anthony Brandt, co-author of a theory paper published online this month in the journal Frontiers in Cognitive Auditory Neuroscience. "Language is typically viewed as fundamental to , and music is often treated as being dependent on or derived from language. But from a developmental perspective, we argue that music comes first and language arises from music."

Brandt, associate professor of composition and theory at the Shepherd School, co-authored the paper with Shepherd School graduate student Molly Gebrian and L. Robert Slevc, UMCP assistant professor of psychology and director of the Language and Music Cognition Lab.

"Infants listen first to sounds of language and only later to its meaning," Brandt said. He noted that newborns' extensive abilities in different aspects of depend on the discrimination of the sounds of language – "the most musical aspects of speech."

The paper cites various studies that show what the newborn brain is capable of, such as the ability to distinguish phonemes, the basic distinctive units of speech, and such attributes as pitch, rhythm and timbre.

The authors define music as "creative play with sound." They said the term "music" implies an attention to the acoustic features of sound irrespective of any referential function. As adults, people focus primarily on the meaning of speech. But babies begin by hearing language as "an intentional and often repetitive vocal performance," Brandt said. "They listen to it not only for its emotional content but also for its rhythmic and phonemic patterns and consistencies. The meaning of words comes later."

Brandt and his co-authors challenge the prevailing view that music cognition matures more slowly than language cognition and is more difficult. "We show that music and language develop along similar time lines," he said.

Infants initially don't distinguish well between their native language and all the languages of the world, Brandt said. Throughout the first year of life, they gradually home in on their native language. Similarly, infants initially don't distinguish well between their native musical traditions and those of other cultures; they start to home in on their own musical culture at the same time that they home in on their native language, he said.

The paper explores many connections between listening to speech and music. For example, recognizing the sound of different consonants requires rapid processing in the temporal lobe of the brain. Similarly, recognizing the timbre of different instruments requires temporal processing at the same speed—a feature of musical hearing that has often been overlooked, Brandt said.

"You can't distinguish between a piano and a trumpet if you can't process what you're hearing at the same speed that you listen for the difference between 'ba' and 'da,'" he said. "In this and many other ways, listening to music and speech overlap." The authors argue that from a musical perspective, speech is a concert of phonemes and syllables.

"While music and language may be cognitively and neurally distinct in adults, we suggest that language is simply a subset of music from a child's view," Brandt said. "We conclude that music merits a central place in our understanding of human development."

Brandt said more research on this topic might lead to a better understanding of why music therapy is helpful for people with reading and speech disorders. People with dyslexia often have problems with the performance of musical rhythm. "A lot of people with language deficits also have musical deficits," Brandt said.

More research could also shed light on rehabilitation for people who have suffered a stroke. "Music helps them reacquire language, because that may be how they acquired language in the first place," Brandt said.


More information: For the full text of the theory paper, visit www.frontiersin.org/Auditory_C … .2012.00327/abstract

