The brain learns completely differently than we've assumed since the 20th century

March 23, 2018, Bar-Ilan University
Image representing the old synaptic (red) and new dendritic (green) learning scenarios of the brain. In the center a neuron with two dendritic trees collects incoming signals via many thousands of tiny adjustable learning parameters, the synapses, represented by red valves. In the new dendritic learning scenario (right) only two adjustable red valves are located in close proximity to the computational element, the neuron. The scale is such that if a neuron collecting its incoming signals is represented by a person's faraway fingers, the length of its hands would be as tall as a skyscraper (left). Credit: Prof. Ido Kanter

The brain is a complex network containing billions of neurons, where each of these neurons communicates simultaneously with thousands of others via their synapses (links). However, the neuron actually collects its many incoming synaptic signals through only a few extremely long, ramified "arms," called dendritic trees.

In 1949, Donald Hebb's pioneering work suggested that learning occurs in the brain by modifying the strength of the synapses, whereas neurons function as the computational elements in the brain. This has remained the common assumption until today.

Using new theoretical results and experiments on neuronal cultures, a group of scientists led by Prof. Ido Kanter, of the Department of Physics and the Gonda (Goldschmied) Multidisciplinary Brain Research Center at Bar-Ilan University, has demonstrated that the central assumption of nearly 70 years, that learning occurs only in the synapses, is mistaken.

In an article published today in the journal Scientific Reports, the researchers go against conventional wisdom to show that learning is actually done by several dendrites, similar to the slow learning mechanism currently attributed to the synapses.

"The newly discovered process of learning in the dendrites occurs at a much faster rate than in the old suggesting that learning occurs solely in the synapses. In this new dendritic learning process, there are a few adaptive parameters per neuron, in comparison to thousands of tiny and sensitive ones in the synaptic learning scenario," said Prof. Kanter, whose research team includes Shira Sardi, Roni Vardi, Anton Sheinin, Amir Goldental and Herut Uzan.

The newly suggested learning scenario indicates that learning occurs in a few dendrites that are in much closer proximity to the neuron, as opposed to the previous notion. "Does it make sense to measure the quality of the air we breathe via many tiny, distant satellite sensors at the elevation of a skyscraper, or by using one or several sensors in close proximity to the nose? Similarly, it is more efficient for the neuron to estimate its incoming signals close to its computational unit, the neuron," says Kanter. Hebb's theory has been so deeply rooted in the scientific world for 70 years that no one has ever proposed such a different approach. Moreover, synapses and dendrites are connected to the neuron in series, so the exact localized site of the learning process seemed irrelevant.

Another important finding of the study is that weak synapses, previously assumed to be insignificant even though they comprise the majority of the brain, play an important role in its dynamics. They induce oscillations of the learning parameters rather than pushing them to unrealistic fixed extremes, as suggested in the current synaptic learning scenario.
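A minimal caricature of this difference (again an assumption-laden sketch, not the authors' equations: the update rule, learning rate, and signal values are invented for illustration): a strong, consistent input drives a clipped learning parameter to a fixed extreme, while a weak input of alternating sign keeps it oscillating around mid-range.

```python
def update(w, signal, eta=0.2, lo=0.0, hi=1.0):
    """Clipped Hebbian-style update: push w up for positive signals,
    down for negative ones, bounded to [lo, hi]."""
    return max(lo, min(hi, w + eta * signal))

# Strong, consistent input saturates the parameter at an extreme.
w_strong = 0.5
for _ in range(50):
    w_strong = update(w_strong, +1.0)

# Weak input of alternating sign induces a bounded oscillation instead.
w_weak, trace = 0.5, []
for t in range(50):
    w_weak = update(w_weak, +0.1 if t % 2 == 0 else -0.1)
    trace.append(w_weak)

print("strong input:", w_strong)                 # pinned at the extreme, 1.0
print("weak input range:", min(trace), max(trace))
```

In this toy picture, the weak signals never commit the parameter to either extreme, which is the qualitative behavior the study attributes to the majority of (weak) synapses.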

The new learning scenario occurs in different sites of the brain and therefore calls for a reevaluation of current treatments for disordered brain functionality. Hence, the popular phrase "neurons that fire together wire together," summarizing Donald Hebb's 70-year-old hypothesis, must now be rephrased. In addition, the learning mechanism is at the basis of recent advanced machine learning and deep learning achievements. The change in the learning paradigm opens new horizons for different types of deep learning algorithms and artificial intelligence-based applications that imitate brain functions, but with advanced features and at much faster speeds.


More information: Scientific Reports (2018). DOI: 10.1038/s41598-018-23471-7





4 / 5 (1) Mar 23, 2018
It's probably going to be a bit of both, with synapses doing the fine grained learning and dendrites doing the coarse work. After all, the brain had to evolve somehow - it goes from simple to complex, and evolution doesn't ditch working solutions unless they are harmful.

Fast coarse selection by dendrite trees is likely to be the robust solution, and then the weights of the synapses are modified to allow for more detailed work, like responding to a person's hair color rather than just the fact that they have a head.
2.3 / 5 (3) Mar 23, 2018
This was settled science that turned out to be not so settled. The consensus is now being questioned.
not rated yet Mar 23, 2018
Looking at the picture it is easy to imagine the neuron's dendrite "arms" going up together. But now picture them as they really are wired in 3 dimensions to a cluster of other brain cells in close proximity.

This is indeed a novel way of looking at constructing back-propagation-type networks (in the broadest sense of the term).

I could see how such a network could work, but I am at a total loss how to make one run reasonably on anything but a supercomputer. Considering the price of hardware... you might be able to buy a secretarial assistant for 700 bucks from your local computer store... in 20 years.
5 / 5 (1) Mar 24, 2018
This is big. This could improve deep learning by a large margin. I've read a (relatively new) paper indicating much better properties for such neural networks. Very exciting. It would probably require modification of existing neural hardware chips.
not rated yet Mar 25, 2018
yes there are neural network chips -- but by far most neural networks are software based and this can be easily tested once a model is created. But the question is what type of learning does this facilitate. As not all learning is the same and the structure of the network helps to determine what is easily learned.
not rated yet Mar 26, 2018
This was settled science that turned out to be not so settled. The consensus is now being questioned.

A consensus exists relative to a question. Nobody thought to question the original assumption because it seemed to work fine, and still does.

The "settled science" isn't actually being questioned here, but simply added on.
not rated yet Mar 26, 2018
I've looked at this paper and it looks promising. We'll have to wait for wider adoption, of course. You can get a pretty good idea of how this new dendritic deep learning works in code (which is my interest).

