The brain learns completely differently than we've assumed since the 20th century

March 23, 2018, Bar-Ilan University
Image representing the old synaptic (red) and new dendritic (green) learning scenarios of the brain. In the center a neuron with two dendritic trees collects incoming signals via many thousands of tiny adjustable learning parameters, the synapses, represented by red valves. In the new dendritic learning scenario (right) only two adjustable red valves are located in close proximity to the computational element, the neuron. The scale is such that if a neuron collecting its incoming signals is represented by a person's faraway fingers, the length of its hands would be as tall as a skyscraper (left). Credit: Prof. Ido Kanter

The brain is a complex network containing billions of neurons, each of which communicates simultaneously with thousands of others via its synapses (links). However, the neuron actually collects its many incoming synaptic signals through only a few extremely long, branched "arms," called dendritic trees.

In 1949, Donald Hebb's pioneering work suggested that learning occurs in the brain by modifying the strength of the synapses, whereas neurons function as the computational elements in the brain. This has remained the common assumption until today.

Using new theoretical results and experiments on neuronal cultures, a group of scientists led by Prof. Ido Kanter, of the Department of Physics and the Gonda (Goldschmied) Multidisciplinary Brain Research Center at Bar-Ilan University, has demonstrated that the central assumption of nearly 70 years, that learning occurs only in the synapses, is mistaken.

In an article published today in the journal Scientific Reports, the researchers go against conventional wisdom to show that learning is actually done by several dendrites, similar to the slow learning mechanism currently attributed to the synapses.

"The newly discovered process of learning in the dendrites occurs at a much faster rate than in the old scenario, which suggested that learning occurs solely in the synapses. In this new dendritic learning process, there are a few adaptive parameters per neuron, in comparison to thousands of tiny and sensitive ones in the synaptic learning scenario," said Prof. Kanter, whose research team includes Shira Sardi, Roni Vardi, Anton Sheinin, Amir Goldental and Herut Uzan.

The newly suggested learning scenario indicates that learning occurs in a few dendrites that are in much closer proximity to the neuron, as opposed to the previous notion. "Does it make sense to measure the quality of air we breathe via many tiny, distant satellite sensors at the elevation of a skyscraper, or by using one or several sensors in close proximity to the nose? Similarly, it is more efficient for the neuron to estimate its incoming signals close to its computational unit, the neuron," says Kanter. Hebb's theory has been so deeply rooted in the scientific world for 70 years that no one has ever proposed such a different approach. Moreover, synapses and dendrites are connected to the neuron in series, so the exact localized site of the learning process seemed irrelevant.
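The contrast in the number of adaptive parameters can be pictured with a toy sketch. This is purely illustrative and not the paper's model: the input size, the two-branch grouping, and the simple weighted sums are all assumptions chosen to mirror the article's "thousands of synaptic valves vs. a few dendritic valves" description.

```python
import numpy as np

rng = np.random.default_rng(0)

n_synapses = 5000   # illustrative count of inputs per neuron
n_dendrites = 2     # "a few" adaptive branches, as in the article's figure

x = rng.random(n_synapses)  # incoming signals

# Synaptic scenario: one adjustable weight ("valve") per synapse.
w_syn = rng.random(n_synapses)
out_synaptic = w_syn @ x                      # 5000 learned parameters

# Dendritic scenario: synaptic strengths are held fixed; each dendrite
# pools its branch's inputs, and only the per-dendrite gains are learned.
branches = np.array_split(x, n_dendrites)
w_dend = rng.random(n_dendrites)
out_dendritic = sum(w * b.sum() for w, b in zip(w_dend, branches))

print(len(w_syn), "vs", len(w_dend), "adaptive parameters")
```

Both scenarios produce a single scalar drive to the neuron; the difference is where, and over how many parameters, the adaptation happens.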

Another important finding of the study is that weak synapses, previously assumed to be insignificant even though they comprise the majority of our brain, play an important role in its dynamics. They induce oscillations of the learning parameters rather than pushing them to unrealistic fixed extremes, as the current synaptic learning scenario suggests.
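One way to picture "oscillation vs. saturation" is with a toy update rule. This is a caricature, not the study's dynamics: the multiplicative potentiation-only rule, the alternating weak drive, and all constants are assumptions made for illustration.

```python
# Toy picture: a potentiation-only multiplicative update drives a
# parameter to its ceiling (a fixed extreme), while an update in which
# weak inputs can depress as well as potentiate keeps it oscillating.

def saturating(w, steps=200, rate=0.1, ceiling=1.0):
    for _ in range(steps):
        w = min(w * (1 + rate), ceiling)   # potentiation only -> pinned at ceiling
    return w

def oscillating(w, steps=200, rate=0.4):
    history = []
    for t in range(steps):
        drive = 1.0 if t % 2 == 0 else -1.0   # alternating weak drive
        w += rate * drive
        history.append(w)
    return history

print(saturating(0.1))      # reaches the fixed extreme 1.0 and stays there
h = oscillating(0.5)
print(min(h), max(h))       # bounded swings, never saturates
```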

The new learning scenario occurs in different sites of the brain and therefore calls for a reevaluation of current treatments for disordered brain functionality. Hence, the popular phrase "neurons that fire together wire together," summarizing Donald Hebb's 70-year-old hypothesis, must now be rephrased. In addition, the learning mechanism is at the basis of recent advanced machine learning and deep learning achievements. The change in the learning paradigm opens new horizons for different types of deep learning algorithms and artificial intelligence-based applications that imitate our brain functions, but with advanced features and at much faster speeds.


More information: Scientific Reports (2018). DOI: 10.1038/s41598-018-23471-7, https://www.nature.com/articles/s41598-018-23471-7



7 comments

Eikka
4 / 5 (1) Mar 23, 2018
It's probably going to be a bit of both, with synapses doing the fine grained learning and dendrites doing the coarse work. After all, the brain had to evolve somehow - it goes from simple to complex, and evolution doesn't ditch working solutions unless they are harmful.

Fast coarse selection by dendrite trees is likely to be the robust solution, and then the weights of the synapses are modified to allow for more detailed work, like responding to a person's hair color rather than just the fact that they have a head.
philstacy9
2.3 / 5 (3) Mar 23, 2018
This was settled science that turned out to be not so settled. The consensus is now being questioned.
Parsec
not rated yet Mar 23, 2018
Looking at the picture it is easy to imagine the neuron's dendrite "arms" going up together. But now picture them as they really are wired in 3 dimensions to a cluster of other brain cells in close proximity.

This is indeed a novel way of looking at constructing back propagation type networks(in the broadest sense of them).

I could see how such a network could work, but I am at a total loss how to make one run reasonably on anything but a supercomputer. Considering the price of hardware... you might be able to buy a secretarial assistant for 700 bucks from your local computer store... in 20 years.
NeutronicallyRepulsive
5 / 5 (1) Mar 24, 2018
This is big. This could improve deep learning by a large margin. I've read a (relatively new) paper indicating much better properties for such neural networks. Very exciting. It would probably require modification of existing neuron hardware chip.
El_Nose
not rated yet Mar 25, 2018
yes there are neural network chips -- but by far most neural networks are software based and this can be easily tested once a model is created. But the question is what type of learning does this facilitate. As not all learning is the same and the structure of the network helps to determine what is easily learned.
Eikka
not rated yet Mar 26, 2018
"This was settled science that turned out to be not so settled. The consensus is now being questioned."

A consensus exists relative to a question. Nobody thought to question the original assumption because it seemed to work fine, and still does.

The "settled science" isn't actually being questioned here, but simply added on.
NeutronicallyRepulsive
not rated yet Mar 26, 2018
I've looked at this paper and it looks promising. We'll have to wait for wider adoption, of course. You can get a pretty good idea of how this new dendritic deep learning works in code (which is where my interest lies).

https://www.natur...-23471-7
