Measured -- The time it takes us to find the words we need

November 23, 2009

(PhysOrg.com) -- The time it takes for our brains to search for and retrieve the word we want to say has been measured for the first time. The discovery is reported in a paper published in the Proceedings of the National Academy of Sciences of the USA today.

Most people think that words and meanings are two sides of the same coin, that the form of a word is the same as its meaning, or at least that word and meaning cannot be split. However, this is not the case: word forms have an existence of their own in the human mind, disconnected from meaning, at least for a fraction of a second.

Until now, in the field of language production, it was not known exactly when a word form is retrieved by the human brain when, for instance, people have to name a picture.

As Professor Guillaume Thierry of Bangor University, one of the paper's authors, explains:

"If you have to say the word apple upon seeing the picture of an apple, the brain does not access the word form "a-p-p-l-e" instantly, it takes time, and until now, it was unknown exactly how much time it took. Along with colleagues at Pompeau Fabra and Barcelona universities, we measured exactly when word forms are retrieved by the brain. That happens about one fifth of a second after a picture is shown."

Thierry continues: "This is a very short time, but it makes a lot of sense if one considers that the average normal speech rate is about 5 words per second. Surely, if we can produce five words per second in normal speech, it means that we can dig each and every word out of memory in about one fifth of a second."

Thierry and colleagues hope to understand every stage of word production: analysis of meaning, word access, word retrieval and programming of speech. They also intend to do the same for comprehension, to reach a full understanding of the stages the human mind goes through to understand and produce language.

Their experiment combined picture naming with a technique that measures the electrical activity produced by the brain at the scalp (event-related brain potentials). It also pioneered the recording of scalp brain activity while participants spoke out loud, a technical challenge because mouth movements produce electrical noise far stronger than the signals generated by the brain.
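For readers curious what this kind of measurement looks like in practice, the sketch below (plain NumPy; every number is invented for illustration and not taken from the paper) mimics the general logic of a picture-naming ERP study: simulate scalp recordings time-locked to picture onset, discard trials swamped by mouth-movement artifacts, average the rest into an event-related potential, and read off the latency of the resulting deflection. Averaging works because trial-to-trial noise is random while the retrieval-related response is time-locked to the picture, so only the latter survives the average.

# Illustrative sketch only -- not the authors' actual analysis pipeline.
# All numbers (sampling rate, amplitudes, thresholds) are assumptions.
import numpy as np

rng = np.random.default_rng(0)

FS = 250                        # sampling rate in Hz (assumed)
N_TRIALS, N_SAMPLES = 60, 250   # 60 naming trials, 1 s of data per trial
t = np.arange(N_SAMPLES) / FS   # time in seconds; picture appears at t = 0

# Simulated brain response: a small deflection peaking ~200 ms after
# picture onset (the word-retrieval latency reported in the article),
# buried in ongoing EEG noise.
signal = 5e-6 * np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2))
epochs = signal + rng.normal(0.0, 2e-6, size=(N_TRIALS, N_SAMPLES))

# Overt speech adds muscle artifacts far larger than the brain signal;
# contaminate a subset of trials to imitate that problem.
artifact_trials = rng.choice(N_TRIALS, size=10, replace=False)
epochs[artifact_trials] += rng.normal(0.0, 40e-6, size=(10, N_SAMPLES))

# Simple artifact rejection: drop any trial whose peak-to-peak amplitude
# exceeds a threshold (a crude stand-in for speech-artifact handling).
peak_to_peak = epochs.max(axis=1) - epochs.min(axis=1)
clean = epochs[peak_to_peak < 30e-6]

# Average the surviving trials; the noise cancels and the deflection
# linked to word-form retrieval emerges, peaking near 200 ms.
erp = clean.mean(axis=0)
peak_ms = t[np.argmax(erp)] * 1000
print(f"Kept {len(clean)}/{N_TRIALS} trials; ERP peak at ~{peak_ms:.0f} ms")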

The research is the fruit of a collaboration between language laboratories at Barcelona, Pompeu Fabra and Bangor universities.

More information: The time course of word retrieval revealed by event-related brain potentials during overt speech. Albert Costa et al., PNAS Online Early Edition, November 23-27, 2009.

Provided by Bangor University


1 comment


frajo
Nov 25, 2009
This model is too simple. The picture of a (real-world) object acts as an input signal which triggers not just one (the correct) word, but a whole lot of associations of several degrees (associations of associations), which multiply for every language the subject is acquainted with. Out of this (seething) pile of words and partial words the brain somehow manages to filter the one with the highest weight. Most of the time, that is. In a fifth of a second.

Of course, computers can be faster. But they don't have to parse the universe of associations a middle-aged human being has acquired. They can't even translate a poem.
