
New understanding of how the brain processes and stores words we hear

Sound wave. Credit: CC0 Public Domain

Georgetown University Medical Center neuroscientists say the brain's auditory lexicon, a catalog of verbal language, is actually located in the front of the primary auditory cortex, not behind it, a finding that upends a century-old understanding of this area of the brain. The new understanding matters because it may impact recovery and rehabilitation following a brain injury such as a stroke.

Riesenhuber's lab previously showed the existence of a lexicon for "written" words at the base of the brain in a region known as the visual word form area (VWFA), and subsequently determined that newly learned written words are added to the VWFA. This study sought to test whether a similar lexicon exists for "spoken" words in the so-called auditory word form area (AWFA), located anterior to the primary auditory cortex. The findings appeared in Neurobiology of Language.

"Since the early 1900s, scientists believed spoken word recognition took place behind the primary auditory cortex, but that model did not fit well with many observations from patients with speech recognition deficits, such as ," says Maximilian Riesenhuber, Ph.D., professor in the Department of Neuroscience at Georgetown University Medical Center and senior author of this study. "Our discovery of an auditory lexicon more towards the front of the brain provides a new target area to help us understand speech comprehension deficits."

In the study, led by Srikanth Damera, M.D., Ph.D., 26 volunteers underwent three rounds of functional magnetic resonance imaging (fMRI) scans to examine their spoken word processing. The technique used in this study, functional-MRI rapid adaptation (fMRI-RA), is more sensitive than conventional fMRI in assessing the representation of auditory words as well as the learning of new words.
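The inference behind fMRI-RA can be illustrated with a toy simulation. The Python sketch below is not the study's actual analysis pipeline; the function name, the adaptation magnitude, and the noise level are assumptions chosen for clarity. The underlying idea is that a brain region whose neurons encode individual words responds less to a repeated word than to a new one, so a "release from adaptation" for different-word pairs indicates that the region represents the two words distinctly.

```python
import numpy as np

rng = np.random.default_rng(0)

def bold_response(first_word, second_word, baseline=1.0, adaptation=0.4, noise=0.05):
    """Toy model of fMRI rapid adaptation (fMRI-RA).

    If a voxel population encodes individual words, its response to the
    second word of a pair drops when that word repeats the first one
    (neural adaptation). All parameter values here are illustrative.
    """
    repeat = first_word == second_word
    response = baseline - (adaptation if repeat else 0.0)
    return response + rng.normal(0.0, noise)

# Average responses over many trials for "same" vs. "different" word pairs.
same = np.mean([bold_response("cat", "cat") for _ in range(100)])
diff = np.mean([bold_response("cat", "dog") for _ in range(100)])

# A reliably larger response to different-word pairs (release from
# adaptation) is the signature fMRI-RA uses to infer that a region
# distinguishes the two words.
print(f"same-pair BOLD: {same:.3f}, different-pair BOLD: {diff:.3f}")
```

Because the comparison is made within the same region across paired stimuli, this design can detect whether word identities are represented even when overall activation levels would look identical in a conventional fMRI contrast.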

"In future studies, it will be interesting to investigate how interventions directed at the AWFA affect speech comprehension deficits in populations with different types of strokes or ," says Riesenhuber. "We are also trying to understand how the written and spoken word systems interact. Beyond that, we are using the same techniques to look for auditory lexica in other parts of the brain, such as those responsible for speech production."

Josef Rauschecker, Ph.D., DSc, professor in the Department of Neuroscience at Georgetown and co-author of the study, adds that many aspects of how the brain processes words, either written or verbal, remain unexplored.

"We know that when we learn to speak, we rely on our auditory system to tell us whether the sound we've produced accurately represents our intended word," he said. "We use that feedback to refine future attempts to say the word. However, the 's process for this remains poorly understood—both for learning to speak for the first time, but also for learning a second language."

More information: Srikanth R. Damera et al, Evidence for a Spoken Word Lexicon in the Auditory Ventral Stream, Neurobiology of Language (2023). DOI: 10.1162/nol_a_00108

