Biomedical engineer finds how brain encodes sounds

November 9, 2017 by Beth Miller, Washington University in St. Louis
Research at the School of Engineering & Applied Science has discovered that auditory cortex neurons may be encoding sounds differently than previously thought. Credit: Washington University in St. Louis

When you are out in the woods and hear a cracking sound, your brain needs to work out quickly whether the sound is coming from, say, a bear or a chipmunk. In new research published in PLOS Biology, a biomedical engineer at Washington University in St. Louis offers a new interpretation of an old observation, debunking an established theory in the process.

Dennis Barbour, MD, PhD, associate professor of biomedical engineering in the School of Engineering & Applied Science who studies neurophysiology, found in an animal model that the auditory cortex may encode sounds differently than previously thought. Sensory neurons, such as those in auditory cortex, on average respond relatively indiscriminately at the beginning of a new stimulus but rapidly become much more selective. The few neurons that keep responding for the duration of a stimulus were generally thought to encode its identity, while the many neurons responding at the beginning were thought to encode only its presence. This theory makes a prediction that had never been tested: that the indiscriminate initial responses would encode stimulus identity less accurately than the selective responses that persist over the sound's duration.

"At the beginning of a sound transition, things are diffusely encoded across the neuron population, but sound identity turns out to be more accurately encoded," Barbour said. "As a result, you can more rapidly identify sounds and act on that information. If you get about the same amount of information for each action potential spike of , as we found, then the more spikes you can put toward a problem, the faster you can decide what to do. Neural populations spike most and encode most accurately at the beginning of stimuli."

Barbour's study involved recording from individual neurons in the auditory cortex of an animal model. To make similar kinds of measurements of brain activity in humans, researchers must use noninvasive techniques that average many neurons together. Event-related potential (ERP) techniques record brain signals through electrodes on the scalp and reflect neural activity synchronized to the onset of a stimulus. Functional MRI (fMRI), on the other hand, reflects activity averaged over several seconds. If the brain were using fundamentally different encoding schemes for onsets versus sustained stimulus presence, the two methods would be expected to diverge in their findings. Both reveal the neural encoding of identity, however.
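As a rough picture of what those two measurement styles emphasize, the sketch below simulates an assumed population response, a brief dense burst at stimulus onset plus a smaller sustained component, and reads it out two ways: an onset-locked average (loosely ERP-like) and a seconds-long average (loosely fMRI-like). The response shape and all values are invented for illustration.

```python
# Toy sketch (assumed response shape): the same simulated population signal
# summarized by an onset-locked average and by a slow, seconds-long average.
import numpy as np

dt = 0.001                       # 1 ms resolution
t = np.arange(0.0, 2.0, dt)      # 2 s of simulated activity
stim_on = (t >= 0.5) & (t < 1.5)

# Hypothetical population rate: dense onset burst + smaller sustained response.
onset_burst = 80.0 * np.exp(-(t - 0.5) / 0.03) * (t >= 0.5)
sustained = 15.0 * stim_on
population_rate = 5.0 + onset_burst + sustained   # 5 Hz spontaneous baseline

# Onset-locked readout: average activity in the first 50 ms after stimulus onset.
erp_like = population_rate[(t >= 0.5) & (t < 0.55)].mean()

# Slow readout: average activity over the 1.5 s following stimulus onset.
fmri_like = population_rate[(t >= 0.5) & (t < 2.0)].mean()

print(f"onset-locked average: {erp_like:.1f}  (dominated by the dense onset burst)")
print(f"seconds-long average: {fmri_like:.1f}  (dominated by the sustained response)")
```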

"There has been a lot of debate for a very long time, but especially in the past couple of decades, about whether information representation in the brain is distributed or local," Barbour said.

"If function is localized, with small numbers of neurons bunched together doing similar things, that's consistent with sparse coding, high selectivity, and low population spiking rates. But if you have distributed activity, or lots of neurons contributing all over the place, that's consistent with dense coding, low selectivity and high population spiking rates. Depending on how the experiment is conducted, neuroscientists see both. Our evidence suggests that it might just be both, depending on which data you look at and how you analyze it."

Barbour said the research is the most fundamental work yet toward building a theory of how information might be encoded for sound processing, and it implies a novel sensory encoding principle potentially applicable to other sensory systems, such as how smells are processed and encoded.

Earlier this year, Barbour worked with Barani Raman, associate professor of biomedical engineering, to investigate how the presence and absence of an odor or a sound is processed. While the response times of the olfactory and auditory systems differ, the neurons respond in the same ways. The results of that research also provided strong evidence that there may be a stored set of signal-processing motifs that is potentially shared by different sensory systems and even different species.


More information: Wensheng Sun et al. Rate, not selectivity, determines neuronal population coding accuracy in auditory cortex, PLOS Biology (2017). DOI: 10.1371/journal.pbio.2002459
