Game technology teaches mice and men to hear better in noisy environments

June 9, 2014, Massachusetts Eye and Ear Infirmary
Daniel Polley, Ph.D., is director of the Mass. Eye and Ear's Amelia Peabody Neural Plasticity Unit of the Eaton-Peabody Laboratories and assistant professor of otology and laryngology at Harvard Medical School. Credit: Eric Antoniou

The ability to hear soft speech in a noisy environment is difficult for many and nearly impossible for the 48 million people in the United States living with hearing loss. Researchers from Massachusetts Eye and Ear, Harvard Medical School and Harvard University programmed a new type of game that trained both mice and humans to better discriminate soft sounds in noisy backgrounds. Their findings will be published in PNAS Online Early Edition the week of June 9-13, 2014.

In the experiment, adult humans and mice with normal hearing were trained on a rudimentary 'audiogame' inspired by sensory foraging behavior that required them to discriminate changes in the loudness of a tone presented in a moderate level of background noise. Their findings suggest new therapeutic options for clinical populations that receive little benefit from conventional sensory rehabilitation strategies.

"Like the children's game 'hot and cold', our game provided instantaneous auditory feedback that allowed our human and mouse subjects to hone in on the location of a hidden target," said senior author Daniel Polley, Ph.D., director of the Mass. Eye and Ear's Amelia Peabody Neural Plasticity Unit of the Eaton-Peabody Laboratories and assistant professor of otology and laryngology at Harvard Medical School. "Over the course of training, both species learned adaptive search strategies that allowed them to more efficiently convert noisy, dynamic audio cues into actionable information for finding the target. To our surprise, human subjects who mastered this simple game over the course of 30 minutes of daily training for one month exhibited a generalized improvement in their ability to understand speech in noisy background conditions. Comparable improvements in the processing of speech in high levels of background noise were not observed for control subjects who heard the sounds of the game but did not actually play the game."

The researchers recorded the electrical activity of neurons in auditory regions of the mouse cerebral cortex to gain some insight into how training might have boosted the ability of the brain to separate signal from noise. They found that training substantially altered the way the brain encoded sound.

In trained mice, many neurons became highly sensitive to faint sounds that signaled the location of the target in the game. Moreover, neurons displayed increased resistance to noise suppression; they retained an ability to encode faint sounds even under conditions of elevated background noise.

"Again, changes of this ilk were not observed in control mice that watched (and listened) to their counterparts play the game. Active participation in the training was required; passive listening was not enough," Dr. Polley said.

These findings illustrate the utility of brain training exercises that are inspired by careful neuroscience research. "When combined with conventional assistive devices such as hearing aids or cochlear implants, 'audiogames' of the type we describe here may be able to provide the hearing impaired with an improved ability to reconnect to the auditory world. Of particular interest is the finding that brain training improved speech processing in noisy backgrounds – a listening environment where conventional hearing aids offer limited benefit," concluded Dr. Jonathon Whitton, lead author on the paper. Dr. Whitton is a principal investigator at the Amelia Peabody Neural Plasticity Unit and affiliated with the Program in Speech and Hearing Bioscience and Technology, Harvard–Massachusetts Institute of Technology Division of Health Sciences and Technology.


More information: Immersive audiomotor game play enhances neural and perceptual salience of weak signals in noise, PNAS: www.pnas.org/cgi/doi/10.1073/pnas.1322184111
