What the pupils tell us about language

June 15, 2017, CNRS
When confronted with a word, the pupils begin by dilating (0 – 0.5 s), following the general activation of the brain. When this initial activation has passed, the pupils constrict again (0.5 – 2 s). But pupil size is also determined by the luminosity evoked by the word's meaning: when we read a word associated with brightness, the pupils become smaller than when we read a word associated with darkness (1 – 3 s). Credit: Sebastiaan Mathot, University of Groningen
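As a purely illustrative sketch (not the authors' analysis code), the short Python snippet below simulates the qualitative time course described in the caption: an initial dilation, a later constriction, and a word-meaning effect emerging in the 1 – 3 s window. Every function shape and numerical value here is an assumption chosen only to make the described pattern visible.

import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 3, 301)  # time after word onset, in seconds

def pupil_trace(word_luminance):
    # Toy pupil-size trace: initial dilation (0-0.5 s), constriction (0.5-2 s),
    # and a word-meaning effect that grows from about 1 s onward.
    # word_luminance: +1 for "bright" words, -1 for "dark" words.
    dilation = 0.3 * np.exp(-((t - 0.5) ** 2) / 0.05)                 # transient activation response
    constriction = -0.2 / (1 + np.exp(-(t - 1.0) * 6))                # slow constriction below baseline
    meaning = -0.1 * word_luminance * np.clip((t - 1.0) / 2.0, 0, 1)  # bright words -> smaller pupil
    return dilation + constriction + meaning

plt.plot(t, pupil_trace(+1), label='bright word (e.g. "sun")')
plt.plot(t, pupil_trace(-1), label='dark word (e.g. "night")')
plt.xlabel("time after word onset (s)")
plt.ylabel("pupil size (arbitrary units, 0 = baseline)")
plt.legend()
plt.show()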

The meaning of a word is enough to trigger a reaction in our pupils: when we read or hear a word with a meaning associated with luminosity ("sun," "shine," etc.), our pupils contract as they would if they were actually exposed to greater luminosity. And the opposite occurs with a word associated with darkness ("night," "gloom," etc.). These results, published on 14 June 2017 in Psychological Science by researchers from the Laboratoire de psychologie cognitive (CNRS/AMU), the Laboratoire parole et langage (CNRS/AMU) and the University of Groningen (Netherlands), open up a new avenue for better understanding how our brain processes language.

The researchers demonstrate here that the size of the pupils does not depend simply on the luminance of the objects we observe, but also on the luminance evoked by the words we read or hear. They suggest that our brain automatically creates mental images of the words we read or hear, such as a bright ball in the sky for the word "sun." It is thought that this mental image is the reason why the pupils become smaller, as if we really did have the sun in our eyes.

This new study raises important questions. Are these mental images necessary to understand the meaning of words? Or, on the contrary, are they merely an indirect consequence of language processing in our brain, as though our nervous system were preparing, as a reflex, for the situation evoked by the word heard or read? To answer these questions, the researchers plan to continue their experiments by varying language parameters, for example by testing their hypothesis in other languages.

More information: Embodiment as preparation: Pupillary responses to words that convey a sense of brightness or darkness. PeerJ Preprints 4:e1795v1 DOI: 10.7287/peerj.preprints.1795v1
