What the brain saw

March 31, 2011
Spike distributions for neurons responding to two features can have shapes that are difficult to understand. Credit: Courtesy of Dr. Tatyana Sharpee, Salk Institute for Biological Studies

The moment we open our eyes, we perceive the world with apparent ease. But the question of how neurons in the retina encode what we "see" has been a tricky one. A key obstacle to understanding how our brain functions is that its components—neurons—respond in highly nonlinear ways to complex stimuli, making stimulus-response relationships extremely difficult to discern.

Now a team of physicists at the Salk Institute for Biological Studies has developed a general mathematical framework that makes optimal use of limited measurements, bringing them a step closer to deciphering the "language of the brain." The approach, described in the current issue of PLoS Computational Biology, reveals for the first time that only information about pairs of temporal stimulus patterns is relayed to the brain.

"We were surprised to find that higher-order stimulus combinations were not encoded, because they are so prevalent in our natural environment," says the study's leader Tatyana Sharpee, Ph.D., an assistant professor in the Computational Neurobiology Laboratory and holder of the Helen McLorraine Developmental Chair in Neurobiology. "Humans are quite sensitive to changes in higher-order combinations of spatial patterns. We found it not to be the case for temporal patterns. This highlights a fundamental difference in the spatial and temporal aspects of visual encoding."

This is an example of the flickering light stimulus presented during the experiment. Movie: Courtesy of Dr. Tatyana Sharpee, Salk Institute for Biological Studies

The human face is a perfect example of a higher-order combination of spatial patterns. All components—eyes, nose, mouth—have very specific spatial relationships with each other, and not even Picasso, in his Cubist period, could throw the rules completely overboard.

Our eyes take in the visual environment and transmit information about individual components, such as color, position, shape, motion and brightness, to the brain. Individual neurons in the retina are excited by certain features and respond with an electrical signal, or spike, which is passed on to visual centers in the brain, where information sent by neurons with different preferences is assembled and processed.

For simple sensory events, like turning on a light, brightness correlates well with the spike probability of a luminance-sensitive cell in the retina. "However, over the last decade or so, it has become apparent that neurons actually encode information about several features at the same time," says graduate student and first author Jeffrey D. Fitzgerald.

"Up to this point, most of the work has been focused on identifying the features the cell responds to," he says. "The question of what kind of information about these features the cell is encoding had been ignored. The direct measurements of stimulus-response relationships often yielded weird shapes [see Figure 1, for example], and people didn't have a mathematical framework for analyzing it."

To overcome those limitations, Fitzgerald and colleagues developed a so-called minimal model of the nonlinear relationships of information processing systems by maximizing a quantity referred to as noise entropy, which quantifies the uncertainty in a neuron's spiking response to a given stimulus.
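The idea can be sketched with a toy simulation. In a second-order maximum-noise-entropy model, the spike probability is logistic in the stimulus and in its pairwise products; all weights below (`h`, `J`, the offset `a`, and the stimulus dimensions) are hypothetical numbers chosen for illustration, not values from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stimulus: 200,000 samples of a 3-dimensional "temporal pattern"
S = rng.standard_normal((200_000, 3))

# Hypothetical neuron: spiking depends on a first-order (linear) term
# plus a second-order (pairwise) term. All weights are illustrative.
h = np.array([1.0, -0.5, 0.3])          # first-order weights (assumed)
J = np.array([[0.0,  0.4,  0.0],
              [0.4,  0.0, -0.2],
              [0.0, -0.2,  0.0]])       # symmetric pairwise weights (assumed)

def p_spike(S, a, h, J):
    """Second-order maximum-noise-entropy form:
    P(spike|s) is logistic in a + h.s + s^T J s."""
    z = a + S @ h + np.einsum('ni,ij,nj->n', S, J, S)
    return 1.0 / (1.0 + np.exp(-z))

p = p_spike(S, a=-2.0, h=h, J=J)
spikes = rng.random(len(p)) < p          # simulated binary spike train
print(f"mean firing probability: {spikes.mean():.3f}")
```

Constraining the model to match only first-order (or first- and second-order) stimulus correlations, while maximizing the noise entropy, yields the least-structured spike probability consistent with those measurements.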

When Fitzgerald applied this approach to recordings of visual neurons probed with flickering movies, recorded by co-authors Lawrence Sincich and Jonathan Horton at the University of California, San Francisco, he discovered that on average, first-order correlations accounted for 78 percent of the encoded information, while second-order correlations accounted for more than 92 percent. Thus, the brain received very little information about correlations that were higher than second order.
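Those percentages are fractions of the single-spike information captured by models of increasing order. A hedged toy illustration of the underlying information-per-spike calculation, with a made-up discrete stimulus and spike probabilities (none of these numbers come from the study):

```python
import numpy as np

# Toy stimulus: 8 equally likely states, with a hypothetical spike
# probability for each state (numbers are illustrative only).
p_s = np.full(8, 1 / 8)                              # P(s)
p_cond_full = np.array(
    [0.02, 0.05, 0.10, 0.20, 0.30, 0.15, 0.08, 0.04])  # P(spike|s)

p_bar = np.sum(p_s * p_cond_full)                    # mean spike probability

def info_per_spike(p_s, p_cond, p_bar):
    """Single-spike information in bits:
    sum over s of P(s) * (r/r_bar) * log2(r/r_bar)."""
    r = p_cond / p_bar
    return np.sum(p_s * r * np.log2(r))

I_full = info_per_spike(p_s, p_cond_full, p_bar)

# A coarser model that only resolves the stimulus into two groups,
# an analogue of a lower-order description, captures less information.
p_cond_coarse = np.repeat(
    [p_cond_full[:4].mean(), p_cond_full[4:].mean()], 4)
I_coarse = info_per_spike(p_s, p_cond_coarse, p_bar)

print(f"full model:   {I_full:.3f} bits/spike")
print(f"coarse model: {I_coarse:.3f} bits/spike "
      f"({100 * I_coarse / I_full:.0f}% of full)")
```

The study's 78 and 92 percent figures are ratios of this kind: information captured by the constrained model divided by the information in the measured response.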

"Biological systems across all scales, from molecules to ecosystems, can all be considered information processors that detect important events in their environment and transform them into actionable information," says Sharpee. "We therefore hope that this way of 'focusing' the data by identifying maximally informative, critical stimulus-response relationships will be useful in other areas of systems biology."
