Cognitive scientists ID new mechanism at heart of early childhood learning and social behavior

November 13, 2013, Indiana University
Wearing head-mounted eye-tracking technology, a child and his mother engage in free play. Credit: Indiana University

Shifting the emphasis from gaze to hand, a study by Indiana University cognitive scientists provides compelling evidence for a new and possibly dominant way for social partners—in this case, 1-year-olds and their parents—to coordinate the process of joint attention, a key component of parent-child communication and early language learning.

Previous research on joint attention between parents and toddlers has focused exclusively on each partner's ability to follow the other's gaze. In "Joint Attention Without Gaze Following: Human Infants and Their Parents Coordinate Visual Attention to Objects Through Eye-Hand Coordination," published in the online journal PLOS ONE, the researchers show that coordinating attention by following the other's hands is much more common, and that parent and toddler interact as equals rather than one or the other taking the lead.

The findings open up new questions about language learning and the teaching of language. They could also have major implications for the treatment of children with early social-communication impairment, such as autism, where joint caregiver-child attention with respect to objects and events is a key issue.

"Currently, interventions consist of training children to look at the other's face and gaze," said Chen Yu, associate professor in the Department of Psychological and Brain Sciences at IU Bloomington. "Now we know that typically developing children achieve joint attention with caregivers less through gaze following and more often through following the other's hands. The daily lives of toddlers are filled with social contexts in which objects are handled, such as mealtime, toy play and getting dressed. In those contexts, it appears we need to look more at another's hands to follow the other's lead, not just gaze."

Dual eye-tracking in parent-child free-flowing play

The new account addresses some of the shortcomings of the gaze-following theory. Gaze-following can be imprecise in the natural, cluttered environments outside the laboratory: when several objects sit close together, it is hard to tell exactly which one someone is looking at. Following someone's hands is easier and more precise. In other situations, following the other's gaze may still be more useful.

"Each of these pathways can be useful," Yu said. "A multi-pathway solution creates more options and gives us more robust solutions."

A researcher putting a head-mounted eye-tracker on an infant

Using innovative head-mounted eye-tracking technology, which, like Google Glass, records the view of the person wearing it and which had never before been used with infants, the researchers recorded moment-to-moment, high-density data on what both parent and child visually attend to as they play together in the lab. They then applied advanced data-mining techniques to uncover fine-grained eye, head and hand movement patterns in the rich multimodal dataset. The reported results are based on 17 parent-infant pairs, but over the course of a few years Yu and Smith have studied more than 100 children, and those data confirm the findings.
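The article does not describe the analysis pipeline in detail, but a minimal sketch gives a sense of what mining such dual eye-tracking streams could involve. The sketch below assumes per-frame annotations of which object each partner's gaze falls on; the variable names, threshold and logic are illustrative assumptions, not the authors' actual method.

# Illustrative sketch (assumed data format, not the authors' pipeline):
# given synchronized per-frame gaze targets for parent and infant, find
# "joint attention" episodes where both look at the same object for a
# sustained stretch of frames.

from typing import List, Optional, Tuple

# Hypothetical frame-by-frame annotations: the object ID each partner's
# gaze falls on at each frame, or None if gaze is on no coded object.
GazeStream = List[Optional[str]]


def joint_attention_episodes(
    parent: GazeStream,
    infant: GazeStream,
    min_frames: int = 15,  # e.g. at least 0.5 s at 30 fps counts as an episode
) -> List[Tuple[int, int, str]]:
    """Return (start_frame, end_frame, object_id) spans where both
    partners attend to the same object for at least `min_frames`."""
    episodes = []
    start = None
    current = None
    for i, (p, c) in enumerate(zip(parent, infant)):
        if p is not None and p == c:           # both on the same object
            if current != p:                   # a new shared target begins
                if current is not None and i - start >= min_frames:
                    episodes.append((start, i, current))
                start, current = i, p
        else:                                  # shared attention broken
            if current is not None and i - start >= min_frames:
                episodes.append((start, i, current))
            start, current = None, None
    if current is not None and len(parent) - start >= min_frames:
        episodes.append((start, len(parent), current))
    return episodes


if __name__ == "__main__":
    # Toy example: 'toy_car' is jointly attended for 20 frames.
    parent = ["toy_car"] * 25 + ["cup"] * 10
    infant = [None] * 5 + ["toy_car"] * 20 + ["block"] * 10
    print(joint_attention_episodes(parent, infant))   # [(5, 25, 'toy_car')]

In practice, the study's analyses drew on richer eye, head and hand measures than this toy example captures; the sketch only shows the basic idea of aligning two attention streams and extracting shared episodes.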

"This really offers a new way to understand and teach joint attention skills," said co-author Linda Smith, Distinguished Professor in the Department of Psychological and Brain Sciences. Smith is well known for her pioneering research and theoretical work in the development of human cognition, particularly as it relates to children ages 1 to 3 acquiring their first language. "We know that although young children can follow , it is not precise, cueing attention only generally to the left or right. Hand actions are spatially precise, so hand-following might actually teach more precise gaze-following."
