To dispel myths, redirect the belief, study says


Beliefs can be hard to change, even if they are scientifically wrong. But those on the fence about an idea can be swayed after hearing facts related to the misinformation, according to a study led by Princeton University.

After conducting an experimental study, the researchers found that listening to a speaker repeat a belief does, in fact, increase the believability of the statement, especially if the person somewhat believes it already. But for those who haven't committed to particular beliefs, hearing correct information can override the myths.

For example, if a policymaker wants people to forget the inaccurate belief that "Reading in dim light can damage children's eyes," they could instead repeatedly say, "Children who spend less time outdoors are at greater risk of developing nearsightedness." After repeatedly hearing the correct information, those on the fence are more likely to remember it and, more importantly, less likely to remember the misinformation. People with entrenched beliefs are unlikely to be swayed either way.

The sample was not nationally representative, so the researchers urge caution when extrapolating the findings to the general population, but they believe the findings would replicate on a larger scale. The findings, published in the academic journal Cognition, have the potential to guide interventions aimed at correcting misinformation in vulnerable communities.

"In today's informational environment, where and beliefs are widespread, policymakers would be well served by learning strategies to prevent the entrenchment of these beliefs at a population level," said study co-author Alin Coman, assistant professor of psychology at Princeton's Woodrow Wilson School of Public and International Affairs and Department of Psychology.

Coman and Madalina Vlasceanu, a graduate student at Princeton, conducted a main study, with a final total of 58 participants, and a replication study, with 88 participants.

In the main study, a set of 24 statements was distributed to participants. These statements, which contained eight myths and 16 correct pieces of information in total, fell into four categories: nutrition, allergies, vision and health.

Myths consisted of statements commonly endorsed by people as true, but that are actually false, such as "Crying helps babies' lungs develop." The correct, related piece of information would be: "Pneumonia is the prime cause of death in children."

First, the participants were asked to carefully read these statements, which were described as statements "frequently encountered on the internet." After reading, participants rated how much they believed each statement was true on a scale from one ("not at all") to seven ("very much so"). Next, they listened to an audio recording of a person remembering some of the beliefs the participants had read initially. In the recording, the speaker spoke naturally, as someone would when recalling information. The listeners were asked to determine whether the speaker was accurately remembering the original content. Each participant listened to an audio recording containing two of the correct statements from each of two categories.

Participants were then given the category name—nutrition, allergies, vision, or health—and were instructed to recall the statements they first read. Finally, they were presented with the initial statements and asked to rate them again based on accuracy and scientific support.
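For readers who want the structure of the procedure at a glance, below is a minimal sketch, in Python, of how the statement pool and the selective-practice recording could be represented. The category labels and overall counts come from the article; the function names, field names, and the even split of myths and facts across categories are illustrative assumptions, not the study's actual materials.

```python
import random

# Counts described in the article: 24 statements total, 8 myths and 16 correct,
# across four categories (the even per-category split is assumed here).
CATEGORIES = ["nutrition", "allergies", "vision", "health"]
MYTHS_PER_CATEGORY = 2        # 8 myths / 4 categories (assumed split)
CORRECT_PER_CATEGORY = 4      # 16 correct statements / 4 categories (assumed split)

def build_statement_pool():
    """Build the 24-statement pool: each statement keeps its category
    and whether it is a myth or a correct piece of information."""
    pool = []
    for cat in CATEGORIES:
        pool += [{"category": cat, "accurate": False, "id": f"{cat}-myth-{i}"}
                 for i in range(MYTHS_PER_CATEGORY)]
        pool += [{"category": cat, "accurate": True, "id": f"{cat}-fact-{i}"}
                 for i in range(CORRECT_PER_CATEGORY)]
    return pool

def selective_practice_recording(pool):
    """Pick the statements the recorded speaker 'remembers':
    two correct statements from each of two categories, per the article."""
    practiced_categories = random.sample(CATEGORIES, 2)
    recording = []
    for cat in practiced_categories:
        correct = [s for s in pool if s["category"] == cat and s["accurate"]]
        recording += random.sample(correct, 2)
    return practiced_categories, recording

if __name__ == "__main__":
    pool = build_statement_pool()
    cats, recording = selective_practice_recording(pool)
    print("Practiced categories:", cats)
    print("Statements in the recording:", [s["id"] for s in recording])
```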

The researchers found that listeners do experience changes in their beliefs after listening to information shared by another person. In particular, the ease with which a belief comes to mind affects its believability.

If a belief was mentioned by the person in the audio, it was remembered better and believed more by the listener. If, however, a belief was from the same category as the mentioned belief (but not mentioned itself), it was more likely to be forgotten and believed less by the listener. These effects of forgetting and believing occur for both accurate and inaccurate beliefs.
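To make that pattern concrete, here is a small follow-on sketch, continuing the hypothetical design sketch above, of how each statement could be binned for analysis; the condition labels are illustrative, not the paper's terminology.

```python
def label_condition(statement, practiced_categories, recording_ids):
    """Assign a statement to an analysis condition:
    - 'practiced': mentioned by the speaker in the recording
      (remembered better and believed more, per the article)
    - 'related_unpracticed': same category as a mentioned statement,
      but not mentioned itself (more likely forgotten and believed less)
    - 'unrelated': from a category the speaker never touched (baseline)
    """
    if statement["id"] in recording_ids:
        return "practiced"
    if statement["category"] in practiced_categories:
        return "related_unpracticed"
    return "unrelated"

# Example usage, continuing from the sketch above:
# recording_ids = {s["id"] for s in recording}
# conditions = {s["id"]: label_condition(s, cats, recording_ids) for s in pool}
```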

The results are particularly meaningful for policymakers interested in having an impact at a community level, especially for health-relevant inaccurate beliefs. Coman and his collaborators are currently expanding upon this study, looking at 12-member groups where people are exchanging information in a lab-created social network.



More information: The paper, "Mnemonic accessibility affects statement believability: The effect of listening to others selectively practicing beliefs," first appeared online in Cognition.

Citation: To dispel myths, redirect the belief, study says (2018, September 24) retrieved 17 October 2019 from https://medicalxpress.com/news/2018-09-dispel-myths-redirect-belief.html

User comments

Sep 24, 2018
The problem with this idea is that one assumes that beliefs are like building blocks that are interchangeable with something true. Most of the time a belief fills a very weak spot in a very complicated structure. The truth likely will not be able to replace the belief without much more falling apart.

Sep 25, 2018
Part 1/3: The driver of the belief in a myth determines whether it can be overturned with logic. If the myth has been derived from faulty logic and/or consideration of what turns out to be misinformation, then correcting the logic and/or the information will suffice. But if the belief is driven by emotion, then correcting the logic or information will only result in the individual changing the logic and/or information upon which the belief is based.

I mentioned this in an essay that included the concept of 'Retrospective Justification'. In what we consider to be the normal process, a person is presented with information, usually via their own observation of newspapers, web content, books and other people's opinions, and stitches it together in their own mind to arrive at a theory or model representing that information. In 'Retrospective Justification' we start with the model and then find observations and logic to match it.

Sep 25, 2018
Part 2/3: To the observer seeing such a belief, it will present *as if* observation and logical deduction were the means by which the conclusion was arrived at, when actually the order was reversed.

Alternative paths to the conclusion include innate motivations and drives, or drives developed in, for instance, childhood, or ones that have developed as a result of a person's current living conditions, becoming attached to a particular belief so that the belief feels real, feels right, has all the hallmarks of being true, will probably have been arrived at in an epiphany, and so on.

If one's life or the life of loved ones is bad for no apparent reason there is still a drive to find the cause and this presents to the conscious mind as an unspecified force in the background controlling the condition. A person, in the normal course, would then attempt to discover what this thing is.

Sep 25, 2018
Part 3/3: In our hunter-gatherer past we might discover a predator lurking in the woods that is sneaking into the camp and taking children, for instance, so the motivation compels tribal members to discover and kill it or drive it away. But if it is disease that is taking people, the same underlying drive occurs anyway; and even if it is economic loss, the same drive is triggered, and we get the impression that there is, lurking just out of view, a vicious predator, now in human form. And if we are surrounded by people anyway, then the imagery takes the form of a conspiracy, a thought hidden in the jungle of minds just beyond our encampment.

Sep 25, 2018
" In 'Retrospective Justification' we start with the model and then find observations and logic to match it."


All beliefs are like that. Whether you start from one end or the other - from belief to observation, or observation to belief - you turn around and do the reverse, and then turn around again. It's a thinking paradigm, where logic justifies a conclusion, which then prompts the search for further logic to justify the same conclusion while ignoring information to the contrary. It doesn't matter where you start in this process, because the end result is the same. Beliefs tend to be intellectual echo chambers.

Where the article goes wrong is suggesting that you should replace one belief with another (that you deem true based on your thought paradigm), like "dim light" vs. "not spending time outdoors" - both are just bare assertions that actually lack any logic or information as to why they are true.


Sep 25, 2018
If instead you say "children who don't see sufficient amounts of daylight are at risk of developing nearsightedness because the eyeball's growth is regulated by the cells' reception of bright blue light", you have an assertion that contains the information by which you can judge it to be true, or at least plausible.

That's why it's called knowledge instead of belief.

Sep 25, 2018
Since "I don't know" wasn't an option, the studies conclusion about "beliefs" seems dubious.
