Binary bias distorts how we integrate information


When we evaluate and compare a range of data points—whether that data is related to health outcomes, head counts, or menu prices—we tend to neglect the relative strength of the evidence and treat it as simply binary, according to research published in Psychological Science, a journal of the Association for Psychological Science.

"People show a strong tendency to dichotomize data distributions and ignore differences in the degree to which instances differ from an explicit or inferred midpoint," says psychological scientist Matthew Fisher of Carnegie Mellon University, first author on the research. "This tendency is remarkably widespread across a diverse range of information formats and content domains, and our research is the first to demonstrate this general tendency."

In a series of six studies, Fisher and coauthor Frank C. Keil of Yale University examined how people tend to reduce a continuous range of data points into just two categories.

"Especially in the Internet age, people have access to an overwhelming amount of information," says Fisher. "We have been interested in how people make sense of all the data at their fingertips."

Fisher and Keil hypothesized that people would implicitly create an "imbalance score," analyzing the difference in data points that fall on one side of a given boundary and those that fall on the other side. If people are evaluating data from different studies investigating the relationship between caffeine and health, for example, they would quickly categorize data as either showing an effect or not, regardless of the relative strength of the evidence.

In one online study, Fisher and Keil randomly assigned a total of 605 participants to consider a specific topic related to either scientific reports, eyewitness testimonies, social judgments, or consumer reviews. They saw a series of 17 claims about the relationship between two variables, such as taking a certain medication and experiencing feelings of hunger (e.g., "One group of scientists found that the new medication makes feeling hungry 2 times more likely," "One group of scientists found that the new medication makes feeling hungry 4 times less likely").

After viewing the claims, participants then summarized the evidence, choosing the rating that best captured their overall impression.

As hypothesized, the imbalance score—the number of positive evidence claims (strong and weak alike) minus the number of negative evidence claims—was associated with participants' summary judgments. Their summary judgments were also influenced by the first piece of evidence they saw.
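The imbalance score described above can be sketched in a few lines of code. This is a hypothetical illustration, not the researchers' actual analysis: claims are encoded as signed effect sizes, and the score deliberately discards magnitude, which is exactly the simplification the binary bias describes.

```python
def imbalance_score(claims):
    """Count positive claims minus negative claims, ignoring strength.

    claims: list of signed effect sizes, e.g. +2.0 for a claim that an
    outcome is "2 times more likely", -4.0 for "4 times less likely".
    Only the sign of each claim matters; a strong negative claim counts
    exactly as much as a weak positive one.
    """
    positives = sum(1 for c in claims if c > 0)
    negatives = sum(1 for c in claims if c < 0)
    return positives - negatives


# Two weak positive claims outweigh one strong negative claim:
claims = [2.0, 1.5, -4.0]
print(imbalance_score(claims))  # 2 positives - 1 negative = 1
```

Note that a reader weighing evidence by strength would reach the opposite conclusion here (the -4.0 claim dominates), which is the distortion the studies measured.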

Further evidence for the impact of the imbalance score on participants' estimates emerged in two additional online studies, in which people saw data presented in various forms, including vertical and horizontal bar charts, pie charts, verbal descriptions with or without percentages, and dot plots.

The binary bias even appeared in the context of real-world decision making: Participants seemed to collapse data into two categories whether they were evaluating menu prices or determining which factories had higher carbon dioxide output. In both of these domains, participants' judgments were influenced by the imbalance score implied by the data.

"We were surprised by the pervasiveness of the effect across contexts and content domains," says Fisher. "The binary bias influenced how people interpret sequences of information and a wide variety of graphical displays."

The fact that the bias is so pervasive suggests that it is not due to a specific feature of visualization or statistical information but is instead a general cognitive illusion. Fisher and Keil suspect that this cognitive distortion may offer a cognitive shortcut that allows us to process large amounts of information relatively efficiently.

"Our work suggests the bias is a basic processing mechanism which is applied across many contexts, including health, financial and public-policy decisions," the researchers conclude.



More information: Matthew Fisher et al, The Binary Bias: A Systematic Distortion in the Integration of Information, Psychological Science (2018). DOI: 10.1177/0956797618792256
Journal information: Psychological Science

Citation: Binary bias distorts how we integrate information (2018, October 25) retrieved 23 September 2019 from https://medicalxpress.com/news/2018-10-binary-bias-distorts.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.


User comments

Oct 25, 2018
Considering that the average IQ is 100, just smart enough for a few basic language and social skills, this finding is not surprising. Those who are not intellectually lazy or dumb are unlikely to be so blind as to see the whole world in Black and White only...

Oct 26, 2018
Binary 'Black-and-White' thinking is not necessarily an intellectual thing, people can simply choose to see the world this way. The scale that is appropriate here is wisdom with binary thinking marking the low end point and an ability to see all sides and understand all motivations, at least in principle, as being on the high side.

The wise have a broader view than the narrow-minded world view that sees everything from only their own perspective, and do not reduce complex issues to binary choices. Intellectual laziness and wisdom are mutually exclusive properties.

Oct 27, 2018
@RobertKarlStonjek, IQ averages at 100, simply because it is a Q test. When the population becomes more intelligent (which it is at an astonishing rate), the average IQ remains 100 and forever will.
