'Read my lips'—it's easier when they're your own

People can lip-read themselves better than they can lip-read others, according to a new study by Nancy Tye-Murray and colleagues from Washington University. Their work, which explores the link between speech perception and speech production, is published online in Springer's Psychonomic Bulletin & Review.

Most people cannot read lips: just try watching television with the sound turned off and see how much of a news item you understand. If you watch someone speak a sentence without the accompanying sound, you are unlikely to recognize many words.

Tye-Murray and her team developed simple, nonsensical sentences from word boards (e.g., "The duck watched the boy" and "The snail watched the goose") so that participants could easily identify and recognize individual words. Twenty adults recorded the sentences and, after several weeks, lip-read silent video clips of sentences spoken both by themselves and by nine other participants.

Participants were able to lip-read video clips of themselves consistently more accurately than video clips of others. These findings suggest that seeing someone speak activates processes that link 'seen' words to 'actual' words in the mental lexicon, and the activation is particularly strong when you see yourself speak.

The authors conclude: "This study is one of the first to show that not only can people recognize their own actions from those of others, but they can better interpret their own actions. A strong link may exist between how we perform actions and how we perceive actions; that is, we may activate some of the very same mental representations when performing and when perceiving. These findings have important implications for understanding how we learn new actions and, particularly, for how we learn to recognize and produce speech."

More information: Tye-Murray, N., et al. (2012). Reading your own lips: Common-coding theory and visual speech perception. Psychonomic Bulletin & Review. DOI: 10.3758/s13423-012-0328-5
