The unexpected power of baby math: Adults still think about numbers like kids

Children understand numbers differently than adults. For kids, one and two seem much further apart than 101 and 102, because two is twice as big as one, while 102 is just a little bigger than 101. It's only after years of schooling that we're persuaded to see the numbers in both pairs as exactly one integer apart on a number line.

Now Dror Dotan, a doctoral student at Tel Aviv University's School of Education and Sagol School of Neuroscience, and Prof. Stanislas Dehaene of the Collège de France, a leader in the field of numerical cognition, have found new evidence that educated adults retain traces of their childhood, or innate, number sense—and that it's more powerful than many scientists think.

"We were surprised when we saw that people never completely stop thinking about numbers as they did when they were children," said Dotan. "The innate human number sense has an impact, even on thinking about double-digit numbers." The findings, a significant step forward in understanding how people process numbers, could contribute to the development of methods to more effectively educate or treat children with learning disabilities and people with brain injuries.

Digital proof of a primal sense

Educated adults understand numbers "linearly," based on the familiar number line from 0 to infinity. But children and uneducated adults, like tribespeople in the Amazon, understand numbers "logarithmically"—in terms of what percentage one number is of another. To analyze how educated adults process numbers in real time, Dotan and Dehaene asked the participants in their study to place numbers on a number line displayed on an iPad using a finger.

Previous studies showed that people who understand numbers linearly perform the task differently than people who understand numbers logarithmically. For example, linear thinkers place the number 20 in the middle of a number line marked from 0 to 40. But logarithmic thinkers may place the number 6 in the middle of the number line, because 1 is about the same percentage of 6 as 6 is of 40.
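The "6 in the middle" claim is easy to check with a little arithmetic. A minimal sketch (substituting the endpoint 1 for 0, since ratios involving 0 are undefined; this is an illustration, not the study's own computation):

```python
import math

# A linear thinker splits a 0-40 line in half:
linear_mid = (0 + 40) / 2
print(linear_mid)  # 20.0

# A ratio-based "midpoint" is the geometric mean of the endpoints,
# so that each endpoint stands in the same ratio to it:
log_mid = math.sqrt(1 * 40)
print(round(log_mid, 1))  # 6.3
```

The geometric mean of 1 and 40 is about 6.3, so 1 is to 6.3 roughly as 6.3 is to 40, matching the "6 near the middle" behavior described above.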

On the iPad used in the study, the participants were shown a number line marked only with "0" on one end and "40" on the other. Numbers popped up one at a time at the top of the iPad screen, and the participants dragged a finger from the middle of the screen down to the place on the number line where they thought each number belonged. Software tracked the path the finger took.

Changing course

Statistical analysis of the results showed that the participants placed the numbers on the number line in a linear way, as expected. But surprisingly—for only a few hundred milliseconds—they appeared to be influenced by their innate number sense. In the case of 20, for example, the participants drifted slightly rightward with their finger – toward where 20 would belong on a ratio-based number line – and then quickly corrected course. The results provide some of the most direct evidence to date that the innate number sense remains active, even if largely dormant, in educated adults.
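The direction of the drift is consistent with what a ratio-based scale predicts. A rough check, under the simplifying assumption that the innate sense places a number at its fractional position on a logarithmic 1-to-40 scale (a modeling assumption for illustration, not the study's actual analysis):

```python
import math

def log_position(n, top=40):
    # Fraction of the way along a logarithmic 1..top line.
    return math.log(n) / math.log(top)

print(0.5)                          # linear position of 20 on a 0-40 line
print(round(log_position(20), 2))   # ~0.81: right of center
```

On a log scale, 20 sits about 81% of the way along the line, well to the right of its linear position at 50%, so a momentary pull from the innate system would show up as exactly the rightward drift the study observed.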

"It really looks like the two systems in the brain compete with each other," said Dotan.

Significantly, the drift effect was found with two-digit as well as one-digit numbers. Many researchers believe that people can only convert two-digit numbers into quantities using the learned linear numerical system, which processes the quantity of each digit separately – for example, 34 is processed as 3 tens plus 4 ones. But Dotan and Dehaene's research showed that the innate sense is, in fact, capable of handling the complexity of two-digit numbers as well.
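The two candidate routes for a number like 34 can be sketched side by side (the helper names below are hypothetical, chosen for illustration; the study makes no implementation claim):

```python
def compositional(n):
    # Learned, linear route: read off decimal digits and
    # recombine their place values separately.
    tens, ones = divmod(n, 10)
    return tens * 10 + ones  # 3 tens + 4 ones for n = 34

def holistic_ratio(a, b):
    # Innate route: treat each number as a whole quantity and
    # compare by ratio, with no digit-by-digit decomposition.
    return a / b

print(compositional(34))                 # 34
print(round(holistic_ratio(34, 40), 2))  # 0.85
```

The finding is that the second, ratio-based route apparently handles 34 as a single quantity (34 is 85% of 40) rather than requiring the digit-by-digit decomposition many researchers assumed was necessary.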

User comments

MrVibrating
4.5 / 5 (2) Jan 22, 2014
There's a related dynamic at play in how we process auditory pitch - a piano keyboard for example gives an impression of a linear progression of equally-divided spatial increments; all the keys are the same width with the same gaps between them, whereas pitch perception is, obviously, logarithmic in nature.

I believe this relationship is not mere coincidence, and that there's a common cognitive principle by which we represent magnitudes generally, universally across faculties - and it's logarithmic because it's concerned with the complexity of the differences between component stimuli - specifically with regards to the complexity of the processing required to resolve them.

The most straightforward function for representing this simplicity bias is in terms of integer factors - where components lying in a factor-of-1 relationship to one another have zero difference, i.e. max. simplicity and so min. informational entropy; factors of 2 have the absolute minimum of complexity, and so on.
Whydening Gyre
1 / 5 (3) Jan 22, 2014
It's the way the Universe counts...
This should be EXTREMELY important to the way we consider quantum functions...
MrVibrating
4.5 / 5 (2) Jan 22, 2014
Sorry to double post, can't squeeze this much further.. the pressure to process information this way is principally thermodynamic, and the form of the information in question is metadata - it's information about the entropy of the difference between component values, with respect to a network orchestrated by efficiency.

So yeah.. soz to prattle on, but it's an interesting finding in an area that's long held fascination for me...
Eikka
3.7 / 5 (3) Jan 23, 2014
This is not a problem of "childish" thought, but of vague language. The real question is, what is this "difference" between numbers you're talking about?

If given just two numbers, should the difference be measured against an invisible number line that is simply assumed to be there, or should the difference be measured as a proportionality between the two numbers. And why?

The number line answer is simply a culturally conditioned convention. Both are actually equally valid answers to the question, yet somehow the answer that makes less implicit assumptions is considered "baby math".

Preposterous.
Eikka
3.8 / 5 (5) Jan 23, 2014
Naturally, if something is measured in 1s and 2s, having one more would make a larger difference than having a hundred and adding one. This is so obvious and common-sensical that it baffles the mind to think anyone would find it a peculiar and "non-adult" way of thinking about numbers.

Unless of course you've been thoroughly indoctrinated through your education to -not- think like that, and instead consider numbers as mere abstract entities governed by arbitrary rules instead of pertaining to something practical and real. Like I imagine a mathematician would.

MrVibrating
not rated yet Jan 23, 2014
@Eikka - as i said, the 'difference' is the complexity of the network activity needed to resolve it - it's the entropy with respect to the network's equilibrium state, where 'zero' difference has zero informational entropy, i.e. lying in a factor-of-1 relationship (1:1 = zero difference / information).. factors of two have a minimum of difference, factors of three slightly more so, etc.

These integer factors are just the simplest way of expressing the 'exponent gradient' of this underlying logarithmic processing scheme we seem to be using.

Factors of 1 require no maths. Factors of two require a minimum of calculating, factors of three moreso, and so on...
Whydening Gyre
not rated yet Jan 24, 2014
It may also have to do with the fact that our visual cortex functions as a difference engine and that we "visualize" numbers (in that cortex) when presented with the described situation. Babies don't see 1 and 2. They see 2 "ones" of different size.
Surly
not rated yet Jan 24, 2014
Why do they take drifting right as evidence of logarithmic thought? If the subject drifted left, they could take that as evidence of the same kind of thought process that leads to putting 6 as the middle number between 1 and 40. This is weak evidence.
MrVibrating
not rated yet Jan 26, 2014
@WG - ah but do they see two squiggles or one pair of squiggles? ;) You raise a good point though - there's two magnitudes to consider, firstly the number of components (here, two), and then the values they represent. There's difference in both senses - two components is more than one; and then two is also one more than one. Perhaps this might be broken down into first-order metadata, then second-order and so on... the information about the number of components is discrete from the information these metaphors represent.

I'd venture to suggest that the area of interest here is the second-order metadata - our concept of numerical value, in the most abstract and non-specific sense. It's about how we represent or map magnitude differences internally, irrespective of the source fields, and this is why it's apparently universal - it's more fundamental than any given faculty of modality, rather than a dependency of them.

MrVibrating
4 / 5 (1) Jan 26, 2014
@Surly - the rightward drift is a key characteristic of this internal exponent gradient, which manifests in many disparate ways (underlining its intrinsic nature) - eg. some right-to-left languages such as Farsi still represent numbers left-to-right like us. Blindfolded subjects tend to over/under-estimate values depending on their R/L direction of lean.

More fundamentally this gradient correlates to a right-left spatiotemporal bias between hemispheres - while our split-brain morphology is a developmental consequence of how the blastocyst folds out into a zygote, rather than a selected development in its own right, we nonetheless seem to have capitalised on the resulting co-processing architecture to accommodate simultaneous processing of spatial and temporal components in opposite hemispheres..

However, all the information we process is actually temporal; we assign and derive 'spatial' info via temporal integration windows, and it is these TIWs that are sized asymmetrically R/L.
MandoZink
5 / 5 (1) Jan 26, 2014
Quickly comprehending the magnitude of a difference is hardly "baby math". It's an essential observation.

100% larger vs. 0.99% larger.

It's important enough even a child recognizes that.