The visual system as economist: Neural resource allocation in visual adaptation

Feature by Stuart Mason Dambrot
Results of experiment 1. (A) The circles and lines represent the sampled stimulus conditions. We measured "slices" of the spatiotemporal sensitivity function: at one spatial or one temporal frequency (one column or one row of circles), or at one speed (an oblique line). (B) Results of experiment 1 in one observer (O3). The contour plot is an estimate of the contrast sensitivity function obtained by fitting a standard model (3) to the estimates of sensitivity at conditions marked by circles in A. The white crosses mark conditions where sensitivity was maximal within the speeds marked by oblique lines in A. Copyright © PNAS, doi:10.1073/pnas.1204109110

(Medical Xpress)—It has long been held that in a new environment, visual adaptation should improve visual performance. However, evidence has contradicted this expectation: Adaptation sometimes not only decreases sensitivity for the adapting stimuli, but can also change sensitivity for stimuli very different from the adapting ones. Recently, scientists at the Salk Institute for Biological Studies and the Schepens Eye Research Institute formulated and tested the hypothesis that these results can be explained by a process that optimizes sensitivity for many stimuli, rather than changing sensitivity only for those stimuli whose statistics have changed. By manipulating stimulus statistics – that is, measuring visual sensitivity across a wide range of spatiotemporal luminance modulations while varying the distribution of stimulus speeds – the researchers demonstrated a large-scale reorganization of visual sensitivity. This reorganization formed an orderly pattern of sensitivity gains and losses predicted by a theory describing how visual systems can optimize the distribution of receptive field characteristics across stimuli.

Researchers Sergei Gepshtein, Luis A. Lesmes and Thomas D. Albright faced a variety of challenges in conducting their study. "It's well known that exposure to new stimuli changes our perception of those stimuli. However, understanding the nature of this adaptive process – that is, why it happens and what its goals are – has been elusive," Gepshtein tells Medical Xpress. "Previous visual adaptation studies produced puzzling results that did not agree with any simple explanation." More specifically, Gepshtein explains, visual adaptation would sometimes improve sensitivity to new stimuli, but sometimes sensitivity would decrease, or would change for stimuli that differed from the new stimuli.

"From our current perspective, previous results appeared to be inconsistent because adaptation was viewed as a local process. Rather," Gepshtein points out, "visual perception is mediated by multiple neuronal cells organized in a system in which each cell is responsive only to a small range of stimuli, but the system as a whole is responsive to the entire ensemble of stimuli."

From the system perspective, the question the scientists faced was how the system organizes sensitivity of its multiple cells across the full range of stimuli. "In the previous view, which we call the stimulus account of adaptation, the question was how changing stimulus frequency or persistence would change sensitivity to that stimulus," Gepshtein continues. "In our present system account of adaptation, however, we instead ask how adaptation affects sensitivity to the entire ensemble of potential stimuli." In other words, instead of a local approach (sensitivity changes to individual stimuli), the researchers adopted a global approach (how the distribution of sensitivity across all stimuli is affected by changes in the distribution of stimulation).

Interestingly, since the number of neuronal cells in the system is large but limited, the scientists view the visual system's organization of sensitivity as an economic process – that is, as the allocation of limited resources. "When stimulation changes," Gepshtein explains, "the visual system reorganizes its sensitivity by reallocating neural resources. Because the resources are limited, increasing sensitivity to some stimulus must be accompanied by decreasing sensitivity to some other stimulus. It is therefore expected that sensory adaptation creates a pattern of gains and losses in sensitivity."
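The fixed-budget logic Gepshtein describes can be sketched in a few lines. This is a toy illustration with invented numbers, not the model fitted in the paper: a fixed pool of resources is spread over stimulus bins, and boosting the allocation to one bin necessarily drains all the others.

```python
import numpy as np

def reallocate(weights, favored, boost):
    """Toy reallocation under a fixed resource budget.

    `weights` holds the fraction of neural resources assigned to each
    stimulus bin. Boosting one bin is paid for by renormalizing, so
    every other bin loses a little. Illustrative only.
    """
    w = np.asarray(weights, dtype=float).copy()
    w[favored] += boost
    return w / w.sum()        # total budget stays fixed at 1.0

w0 = np.full(5, 0.2)          # equal allocation across 5 stimulus bins
w1 = reallocate(w0, favored=2, boost=0.3)
# the gain at bin 2 is offset by losses at every other bin
```

Because the budget is conserved, any sensitivity gain in this sketch is automatically accompanied by losses elsewhere, mirroring the pattern of gains and losses Gepshtein predicts.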

Gepshtein cites a theoretical study of neural resource allocation in the visual system that suggested how sensitivity would change if adaptation were to cause a (re)allocation of resources across the entire range of stimuli1. In particular, he summarizes, the study suggested that the shape of the distribution of sensitivity in human vision is consistent with predictions of efficient allocation of the limited neural resources, and that changes in stimulation would cause a shift of the sensitivity function. This shift would entail a characteristic pattern of local gains and losses in sensitivity.
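The predicted pattern can be illustrated with a toy one-dimensional example (an invented Gaussian curve, not the paper's fitted sensitivity function): shifting the curve's peak along the speed axis automatically produces gains on one side and losses on the other.

```python
import numpy as np

# Toy illustration: shifting a 1-D Gaussian "sensitivity function"
# along a log-speed axis yields gains on the side the peak moves
# toward and losses behind it -- a sign-reversing pattern.
speeds = np.linspace(-3, 3, 301)          # log-speed axis, arbitrary units
before = np.exp(-(speeds - 0.0) ** 2)     # sensitivity before adaptation
after = np.exp(-(speeds - 0.5) ** 2)      # peak shifted toward higher speeds
change = after - before                   # pattern of gains and losses

gain_at_high = change[speeds > 0.25].max() > 0   # gains at higher speeds
loss_at_low = change[speeds < 0.25].min() < 0    # losses at lower speeds
```

The point of the sketch is that a mere shift of a fixed-shape curve already entails a characteristic large-scale pattern of local gains and losses, with no stimulus-by-stimulus tuning required.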

Essentially, Gepshtein continues, their theory is based on the fact that measurements by different cells are characterized by different spatial and temporal intervals of measurement called receptive fields. This fact entails different measurement uncertainty – that is, different expected precision – for different cells, meaning that neural cells with receptive fields of different sizes are expected to be differentially useful for measuring different stimuli. "The theoretical study specifies how such cells should be allocated to stimuli by showing that the most efficient allocation of cells to stimuli results in a sensitivity function similar to the well-known spatiotemporal contrast sensitivity function, and also how changes in speed distribution should cause a shift of the function."
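The intuition that receptive fields of different sizes suit different stimuli can be sketched with a made-up cost function (the paper's actual derivation differs): positional error grows with the product of stimulus speed and window size, while frequency error grows as the window shrinks, so the most useful window size falls as speed rises.

```python
import numpy as np

def best_window(speed, sigmas):
    """Pick the measurement window minimizing a toy cost for a given speed.

    Hypothetical cost, for illustration only: positional error grows
    like speed * sigma, frequency error like 1 / sigma (the Gabor
    trade-off). The minimizing sigma is roughly 1 / sqrt(speed).
    """
    cost = speed * sigmas + 1.0 / sigmas
    return sigmas[np.argmin(cost)]

sigmas = np.linspace(0.05, 5.0, 1000)     # candidate window sizes
slow_pref = best_window(0.5, sigmas)      # preferred window for a slow stimulus
fast_pref = best_window(8.0, sigmas)      # preferred window for a fast stimulus
# faster stimuli are best served by smaller (briefer) windows
```

Under this invented cost, cells with large receptive fields are assigned to slow stimuli and cells with small ones to fast stimuli, which is the qualitative shape of the allocation the theory describes.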

In greater detail, Gepshtein adds, the theory is concerned with how measurements by individual cells can be organized in the visual system to attain efficient performance of the system as a whole – that is, across the full range of visual stimuli. "Again, each cell can only measure a limited range of stimuli – but the capacity of every cell is limited by another fundamental constraint." The kind of information that can be obtained from a single cell is limited because of the uncertainty principle of measurement2.

Results of experiment 2. (A) Contrast sensitivity functions measured in the two stimulus contexts for one observer (O1). A standard model of contrast sensitivity was fitted to the estimates of sensitivity in high-speed (Upper) and low-speed (Lower) contexts. The warm and cool colors represent high and low sensitivities. Sensitivity functions for all observers are displayed in Fig. S2. (B) The change map on the bottom summarizes how sensitivity changed from the low-speed to high-speed stimulus contexts for all stimulus conditions (Eq. 1). The shades of red and blue represent the gains and losses of sensitivity, and the white regions represent no change. Above the map, samples of sensitivity changes for two speeds demonstrate that the pattern of gains and losses of sensitivity is reversed across speeds, similar to the prediction illustrated in Fig. 2C. Change maps for all observers are displayed in Fig. 5. Copyright © PNAS, doi:10.1073/pnas.1204109110

In the field of sensory perception, the principle is associated with Dennis Gabor, a brilliant engineer and inventor, who in 1946 formulated the principle and studied its consequences for auditory perception. In physics the same principle is associated with the name of Werner Heisenberg – one of the founders of quantum mechanics. "In either formulation," Gepshtein notes, "the principle captures a limit to the precision with which certain pairs of physical properties can be measured at the same time – in our case, stimulus location and frequency content." Stimulus location concerns where or when the stimulus occurs in space or time, respectively; stimulus frequency content concerns the ability to identify the stimulus.
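The Gabor limit can be checked numerically. The sketch below assumes nothing beyond standard Fourier analysis: it estimates the RMS duration and RMS bandwidth of a Gaussian window and recovers the minimum uncertainty product, 1/(4π).

```python
import numpy as np

# Numerical check of the Gabor/Heisenberg limit: for a Gaussian
# window, the product of RMS duration and RMS bandwidth equals
# 1/(4*pi), the smallest value any signal can achieve. Sharper
# localization in time forces a broader spread in frequency.
t = np.linspace(-40.0, 40.0, 65536)
dt = t[1] - t[0]
g = np.exp(-t**2 / 2.0)                     # Gaussian window, sigma = 1

p = g**2 / np.sum(g**2 * dt)                # energy density in time
spread_t = np.sqrt(np.sum(t**2 * p * dt))   # RMS duration

G = np.fft.fftshift(np.fft.fft(g))
f = np.fft.fftshift(np.fft.fftfreq(len(t), d=dt))
df = f[1] - f[0]
q = np.abs(G)**2 / np.sum(np.abs(G)**2 * df)
spread_f = np.sqrt(np.sum(f**2 * q * df))   # RMS bandwidth

product = spread_t * spread_f               # approaches 1 / (4 * pi)
```

A narrower window (smaller sigma) would shrink `spread_t` but inflate `spread_f` by the same factor, leaving the product pinned at the limit – the trade-off that makes cells with different receptive-field sizes useful for different stimuli.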

"To test predictions of this theory," Gepshtein says, "we varied the distribution of stimuli instead of inducing adaptation by a single stimulus, and measured changes of sensitivity across a broad range of stimuli. We implemented changes of stimulus speed by using the same range of speed in all experiments, while sampling different speeds from this range more or less often." The researchers found that manipulating stimulus speeds caused a large-scale pattern of sensitivity changes and that the adaptive changes added up to an orderly pattern – a pattern similar to that predicted by the theory of efficient allocation.
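The sampling manipulation Gepshtein describes can be sketched as follows; the speeds and probabilities below are hypothetical, chosen only to show the idea of drawing different speeds from the same fixed range more or less often.

```python
import numpy as np

# Sketch of the stimulus-context manipulation: both contexts draw
# from the SAME set of speeds, but with different probabilities.
# The specific speeds and weights here are made up for illustration.
rng = np.random.default_rng(0)
speeds = np.array([1.0, 2.0, 4.0, 8.0])   # deg/s; same range in both contexts

low_speed_context = rng.choice(speeds, size=10_000, p=[0.4, 0.3, 0.2, 0.1])
high_speed_context = rng.choice(speeds, size=10_000, p=[0.1, 0.2, 0.3, 0.4])
# same range of speeds, different emphasis across the two contexts
```

Because every speed occurs in both contexts, any difference in measured sensitivity reflects the change in the distribution of stimulation, not a change in which stimuli the observer ever sees.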

Despite the extent and complexities of the challenges the researchers faced, Gepshtein says that their study was made possible by two key factors: the theoretical insight discussed above, and the new rapid measurement methods developed by Dr. Lesmes and his colleagues. "Previously," Gepshtein explains, "sensitivity had to be measured separately for many stimuli, which took a prohibitively long time. Dr. Lesmes' new methods allowed us to measure parameters of the sensitivity function directly, rather than fitting the sensitivity function to results of multiple separate sensitivity measurements, thereby optimizing the measurements for estimating sensitivity functions."

Moving forward, the scientists are planning the next steps in their research. "This work is one of the first demonstrations of how previously puzzling results can make sense from the perspective of efficient allocation of limited neural resources," Gepshtein points out. "We're at the beginning of a large series of studies inspired by this approach." For example, so far the team has studied motion perception using only one spatial dimension – but to study motion direction, at least two spatial dimensions must be included. "We'll generalize the theoretical framework and the measurement procedures to study how motion sensitivity is controlled across both speed and direction," Gepshtein adds. "This will allow us to use our approach to investigate perception of natural stimuli, such as movies that capture motion in natural visual scenes."

In addition, the scientists have been concerned with stimuli at a single spatial location at a time – but Gepshtein points out that the economic view suggests that neural resources can be (re)allocated across spatial locations, just as they are (re)allocated across stimulus speeds. "It's an obvious extension of our present approach, and it's one of the steps we'll have to take in order to develop a complete understanding of motion adaptation."

It's also important, he adds, to better understand connections between their present results and those of previous adaptation studies, which, as discussed, found that adaptation can cause gains or losses of sensitivity. "Now we've found that gains and losses can be special cases of a large-scale pattern of sensitivity changes," Gepshtein stresses. "For the sake of completeness, it's important to show how previous results are consistent with our new results."

That said, Gepshtein cautions that the connection is not as simple as showing that the previous (local) results add up to the newly found large-scale pattern, because the local and global studies use different distributions (narrow and broad, respectively) of adapting stimuli. "From our present perspective, there's an interesting paradox here, in that the process of measurement changes the object of measurement – that is, the visual system. By nature of adaptation, different distributions cause different patterns of adaptation. This is one of the questions we're pursuing at the moment – and we're doing so by tracing adaptation effects as we change the distribution of adapting stimuli from narrow to broad."

Looking further ahead, Gepshtein says that the team is very interested in how large-scale sensitivity transformation is implemented in neural circuits. "We approach this question in two ways," he explains. "Firstly, we perform simulations of neuronal plasticity in the circuits that control receptive field size, and trace the effects of this plasticity to changes in the sensitivity function." (This work is done in collaboration with a group of researchers led by computer scientist Peter Jurica3 at the RIKEN Brain Science Institute.)

"Secondly," he continues, "we're beginning to investigate how neural circuits and neural cells change their preferences in response to the manipulation of speed distribution employed in the present study." This physiological project is led by Prof. Albright, who directs the Salk Institute Vision Center Laboratory.

Gepshtein says that the applications of their research are very broad. "They concern technologies in which motion sensing and compression of dynamic visual signals are involved. To illustrate, detection of motion using modern sensors is fast and inexpensive. Finding the meaning of motion signals – for example, discovering the identity of moving objects – is slow and requires considerable computational resources, because information from multiple local sensors has to be integrated and analyzed. Our studies reveal how this integration is implemented in biological vision."

The emerging picture, Gepshtein notes, is that biological visual systems improve their efficiency, including processing speed, by rapidly reallocating their computational resources to important stimuli. "This reallocation is graded," he concludes. "It doesn't leave the system unprepared for perception of stimuli that are less important at the moment. Rather, all stimuli are monitored, albeit with different quality, so the resources can be rapidly moved to the newly important aspects of stimulation."

More information: Sensory adaptation as optimal resource allocation, PNAS March 12, 2013 vol. 110 no. 11 4368-4373, doi:10.1073/pnas.1204109110 [Updated version with Corrections marked]


1. The economics of motion perception and invariants of visual sensitivity, Journal of Vision, June 21, 2007, vol. 7, no. 8, article 8, doi:10.1167/7.8.8
2. Two psychologies of perception and the prospect of their synthesis (Section 4, pp. 247–263)
3. Unsupervised adaptive optimization of motion-sensitive systems guided by measurement uncertainty, Proceedings of the Third International Conference on Intelligent Sensors, Sensor Networks and Information Processing (ISSNIP 2007), pp. 179–184
