Fig. 1. Flies faced with panoramic reverse-phi motion exhibit reverse-optomotor responses. (A) A schematic model for fly motion vision consists of two stages: (i) local motion is computed by columnar circuits within the lamina and medulla, followed by (ii) global integration of local motion signals in the lobula plate tangential cells (LPTCs). The output of the LPTCs is thought to control optomotor behavior. (B) The fly is suspended within a virtual flight arena where the amplitude of each wing-beat is tracked by an optical detector. The difference between the two wing-beat amplitudes (left minus right) is proportional to yaw torque (24). For example, when the amplitude of the left wing-beat is greater than the right, the fly is attempting to steer to the right with clockwise torque. (C) Space-time depictions of motion stimuli used in rotation experiments; all three are square-wave patterns moving from the top left to the bottom right (Movie S1). (D) Mean turning behavior of 10 flies (±SEM) in response to open-loop rotation of standard (Top), reverse-phi (Middle), and reverse-phi out-of-phase (Bottom) square-wave gratings (λ = 30°). The reverse-phi out-of-phase stimuli moved at one-half the speed of the standard and reverse-phi stimuli, because motion occurred only in every second frame (see the space-time plot in C). Flies were presented with motion in both directions (CW and CCW), but responses are combined and plotted for CW rotation (SI Text has a complete description of data treatment). © PNAS, doi: 10.1073/pnas.1100062108

(Medical Xpress) -- We experience an interesting phenomenon when the contrast of an image flickers as it moves across our visual field: an illusory reversal in the direction of motion. Moreover, this reverse-phi illusion occurs in a surprisingly wide range of species, indicating that it is a common evolutionary adaptation. Recently, researchers at the Howard Hughes Medical Institute's Janelia Farm Research Campus demonstrated that motion-sensitive neurons in the brain of the ubiquitous fruit fly Drosophila melanogaster respond to the reverse-phi illusion and that the illusion changes the fly's flight behavior.

The paper, authored by Prof. Michael Reiser, graduate student John Tuthill, and postdoctoral fellow Eugenia Chiappe, focuses primarily on experiments performed on tethered flies in a virtual flight simulator. While behavioral experiments with tethered flying flies have been conducted for over 50 years and are considered routine and reliable, aspects of this particular study posed some obstacles.

“The main technical obstacle was imaging from the fly brain while the animal was walking,” notes Tuthill. “When flies are stationary, their visual system is in a state of relative quiescence. Their visual neuronal responses are amplified only when they move, and looking at visual neuron activity in actively behaving flies has until recently been impossible.” The team addressed this challenge by adopting techniques for 2-photon imaging in walking flies developed by their collaborators in Vivek Jayaraman’s lab at Janelia.

In addition, Tuthill adds, “this was one of the first applications of tools that have been developed for measuring calcium signals in the brains of awake, behaving flies. Now that these very difficult techniques are possible, we can ask specific questions about how information is processed in the brains of animals as they interact with the environment.”

Demonstration of the three basic visual stimuli used in Fig. 1: standard, reverse-phi, and reverse-phi out-of-phase motion. All stimuli move from left to right at 8 frames/s and correspond to a flattened view of the entire cylindrical display. © PNAS, doi: 10.1073/pnas.1100062108
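For readers curious how such stimuli are built, the following is a minimal sketch, not the authors' stimulus code: it constructs space-time patterns in the spirit of Fig. 1C, with a square-wave grating that steps in one direction (standard), the same grating with its contrast inverted on every frame (reverse-phi), and a variant in which the displacement occurs only on every second frame, which is why it moves at half the nominal speed (reverse-phi out-of-phase). The 30° wavelength and 8 frames/s are taken from the captions; the step size, resolution, and the exact interleaving of displacement and contrast reversal in the out-of-phase variant are illustrative assumptions.

    # Illustrative sketch only (not the authors' stimulus code): constructing
    # space-time patterns like those in Fig. 1C and Movie S1 with NumPy.
    # The 30-degree wavelength and 8 frames/s come from the captions; the step
    # size, resolution, and out-of-phase interleaving are assumptions.
    import numpy as np

    WAVELENGTH_DEG = 30      # spatial wavelength of the square-wave grating
    PIXELS_PER_DEG = 2       # assumed display resolution
    STEP_DEG = 3.75          # assumed displacement per motion step
    N_FRAMES = 16            # at 8 frames/s this spans 2 s

    def square_wave(width_deg=360):
        """One row of a square-wave grating: +1 (bright) and -1 (dark) bars."""
        x = np.arange(width_deg * PIXELS_PER_DEG) / PIXELS_PER_DEG
        return np.where((x % WAVELENGTH_DEG) < WAVELENGTH_DEG / 2, 1.0, -1.0)

    def stimulus(kind):
        """Stack frames into a space-time array (frame index x azimuth)."""
        row = square_wave()
        frames = []
        for t in range(N_FRAMES):
            if kind == "standard":
                shift = t * STEP_DEG                 # displace on every frame
                frame = np.roll(row, int(round(shift * PIXELS_PER_DEG)))
            elif kind == "reverse-phi":
                shift = t * STEP_DEG
                frame = np.roll(row, int(round(shift * PIXELS_PER_DEG)))
                frame = frame * (-1) ** t            # invert contrast every frame
            elif kind == "reverse-phi-out-of-phase":
                # Assumption: displacement only on every second frame, which is
                # why this stimulus moves at half the nominal speed.
                shift = (t // 2) * STEP_DEG
                frame = np.roll(row, int(round(shift * PIXELS_PER_DEG)))
                frame = frame * (-1) ** t
            else:
                raise ValueError(kind)
            frames.append(frame)
        return np.stack(frames)

    if __name__ == "__main__":
        for kind in ("standard", "reverse-phi", "reverse-phi-out-of-phase"):
            print(kind, stimulus(kind).shape)        # rows = frames, columns = azimuth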

This is not to say that the team isn’t continually innovating. Currently, for example, they’re developing tools to record from visual neurons in both walking and flying flies to help them understand how a fly’s visual system works as it moves through its environment. “The ideal case would be to combine behavior, electrophysiology, and imaging in a single experiment,” Tuthill envisions. “Instead of doing separate tethered flight and physiology experiments, we would measure the behavior of the fly as it flies through a virtual landscape, while simultaneously recording the activity of neurons in the visual system.” They are also interested in combining 2-photon imaging (which has relatively poor temporal resolution) with whole-cell patch clamp electrophysiology, the goal being an improved ability to record signals from neurons in the fly’s peripheral visual system that respond with a very short latency.

Another extremely challenging area is the vast array of unique cell types in the visual system between the photoreceptors and the motion-sensitive neurons that the team recorded from. Somewhere in this dense region, which consists of two neuropils called the lamina and medulla, neurons implement the fundamental computation of motion detection. “There are many ways in which this computation could be implemented, and some have been described mathematically,” Tuthill explains, “but we don’t understand how information is processed in these intermediate circuits.” The good news is that their results support a scheme in which neural motion detection correlates light signals across space and time using a fixed temporal delay, a finding that should help guide research into the cellular basis of these computations. “The primary task now is to dive into this neural jungle with electrodes and microscopes and find the neurons involved,” he adds. “Only then will we know how motion detection is computationally implemented in the fly visual system.”
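The delay-and-correlate scheme Tuthill alludes to is commonly formalized as a Hassenstein-Reichardt-type correlator: each local detector multiplies the signal at one point in space with a delayed copy of the signal from a neighboring point, and subtracts the mirror-symmetric product. The sketch below is an illustrative toy model, with assumed time constants and a two-pixel stimulus rather than anything from the paper, but it shows why such a detector also accounts for reverse-phi: inverting the contrast of the second sample flips the sign of the correlation, so the detector signals motion in the opposite direction.

    # Minimal sketch of a delay-and-correlate (Hassenstein-Reichardt-type)
    # motion detector, to make the "fixed temporal delay" idea concrete.
    # Time constants and the toy two-pixel stimuli below are illustrative
    # assumptions, not values from the paper.
    import numpy as np

    def lowpass(signal, tau, dt=1.0):
        """First-order low-pass filter, acting as the fixed temporal delay."""
        out = np.zeros_like(signal)
        alpha = dt / (tau + dt)
        for t in range(1, len(signal)):
            out[t] = out[t - 1] + alpha * (signal[t] - out[t - 1])
        return out

    def reichardt(left, right, tau=2.0):
        """Opponent correlator: delayed-left x right minus delayed-right x left.
        Positive output = rightward motion, negative = leftward motion."""
        return np.sum(lowpass(left, tau) * right - lowpass(right, tau) * left)

    # Two neighbouring "photoreceptors" see a bright bar at the left point,
    # then at the right point one time step later: rightward apparent motion.
    t = np.arange(20)
    left = (t == 5).astype(float)
    right = (t == 6).astype(float)
    print("standard motion:", reichardt(left, right))       # > 0, i.e. rightward

    # Reverse-phi: the second sample appears with inverted contrast.
    # The correlation flips sign, so the detector reports leftward motion.
    print("reverse-phi    :", reichardt(left, -right))

Which lamina and medulla neurons implement the delay, the multiplication, and the opponent subtraction is exactly the open question Tuthill describes.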

Going forward, Tuthill is interested in genetic tools that are becoming specific enough to allow researchers to assign precise functional roles to the neuronal cell types involved in visually guided behaviors. “Once a behavioral role is identified for a neuron type, one can then use calcium imaging or electrophysiology to understand how that neuron fulfills a certain behavioral function and operates within the dense network of the brain. This is the approach that we and others are taking to try to understand the neural mechanisms of computations like motion detection.”

Moreover, Tuthill notes that while the genetic techniques used in this paper (the GAL4-UAS expression system) are likely not relevant for treating humans because they require genetic engineering across multiple generations of flies, “because many of the same genetic pathways underlie the development of the nervous system across the animal kingdom, elucidation of developmental genes and signaling pathways in the fly will continue to have an impact on the treatment of visual disorders. We hope – although we are still a long way from knowing this – that many principles of neural computation will also be shared across species, and that someday we can apply the knowledge we have obtained in animals like flies and mice to understanding other brains, including those of humans. Therapeutic intervention at the level of individual neurons is probably a long way off given our current lack of basic knowledge about the brain, but someday it may be possible – and perhaps less far off for the retina.”

In terms of other potentially promising near-term and future applications, Tuthill points out that the basic computational principles they discover in the visual system could someday spur innovations in machine vision – although he acknowledges that it will be some time before they understand enough about how these neural circuits operate to make a deep contribution. “One interesting aspect of this particular study is that we show that the reverse-phi illusion is perceived by flies much as it is by humans. There is no particular reason why this has to be true—there are many motion detection algorithms that would not detect reverse-phi. Perhaps there is some fundamental reason that many animals perceive this illusion, and understanding why this is true could lead to improved machine vision algorithms for detecting and processing image motion in the face of a noisy and far from ideal visual world.”

More information:

-- Neural correlates of illusory motion perception in Drosophila, PNAS, June 7, 2011, vol. 108, no. 23, pp. 9685-9690; published online before print May 17, 2011; doi: 10.1073/pnas.1100062108

-- Howard Hughes Medical Institute’s Janelia Farm Research Campus

-- Prof. Michael Reiser's page

-- John Tuthill's page

-- Vivek Jayaraman’s lab