How do we balance self-interest with fairness for others?
Human beings, like all living things, are driven by an innate sense of self-preservation. But humans have also built cities and governments, complex industries and lasting cultural institutions—none of which would be possible without long-term cooperation.
That sort of cooperation hinges not only on a desire for individual benefit, but also on a sense of communal fairness and what economists call social preferences.
A new study from University of Chicago cognitive neuroscientists offers a glimpse of how people weigh self-interest against fairness for others—and provides evidence that the former takes precedence over the latter.
"In everyday life, the decisions that you make affect not only yourself but other people," said Keith Yoder, a UChicago postdoctoral scholar and first author of the study. "How do we balance those concerns?"
Using machine learning to analyze brain scans and electrophysiological signals, Yoder and Prof. Jean Decety, a leading scholar of moral psychology, found that multiple cortical networks are dedicated to processing decisions that benefit the self—and that self-interest dominates early stages of decision-making. Moreover, fairness for self and fairness for others had non-overlapping patterns of neural activation.
The results suggest that people make self-interested choices more readily than they make choices that uphold fairness for others, but also reveal that fairness is an important consideration once individual needs are met.
The scans were done while individuals participated in a three-party ultimatum game. Participants who received fair monetary offers, the study found, were more likely to also accept offers that were fair to a third party.
"If people already have enough for themselves, they're going to be more fair to others," said Decety, the Irving B. Harris Distinguished Service Professor of Psychology and Psychiatry. "To survive, you have to care about yourself first. It doesn't mean that you don't care for others, but you have to survive first."
Published in the journal Neuropsychologia, the study set up an experiment with an anonymous "proposer," who would divide $12 between themselves, the participant and a neutral observer. Over dozens of exchanges, the proposer made one of four possible offers: one that kept nearly all the money for themselves; one that divided the money equally into three parts; and two that shared the money with either the participant or the observer—leaving the other with just one dollar.
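The offer structure described above can be sketched as a simple data model. The exact dollar splits below are illustrative reconstructions from the article's description, not figures reported by the study:

```python
# A minimal sketch of the three-party ultimatum game's offer structure.
# The specific splits are assumptions consistent with the description
# above, not the study's actual values.

TOTAL = 12  # dollars divided by the anonymous proposer

# Each offer: (proposer, participant, observer) shares summing to $12.
OFFERS = {
    "selfish":             (10, 1, 1),  # proposer keeps nearly everything
    "equal":               (4, 4, 4),   # three-way even split
    "participant_favored": (6, 5, 1),   # shares with participant, $1 to observer
    "observer_favored":    (6, 1, 5),   # shares with observer, $1 to participant
}

def is_fair_to(offer, party, threshold=TOTAL / 3):
    """Treat an offer as 'fair' to a party if their share meets an even split."""
    return offer[party] >= threshold

for name, split in OFFERS.items():
    assert sum(split) == TOTAL
    print(name, split,
          "fair to participant:", is_fair_to(split, 1),
          "fair to observer:", is_fair_to(split, 2))
```

The design lets the researchers separate two questions on every trial: was the offer fair to the participant themselves, and was it fair to the uninvolved third party.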
Because the offers were hypothetical, the participants' decisions to accept or reject were not driven by actual financial incentives. "The task really taps into low-level, genuine preferences for fairness," said Yoder.
The researchers implemented the game with two groups: 32 people were scanned with functional MRI, while 40 underwent high-density electroencephalography (EEG), which tracks electrical activity in the brain. The differences in EEG data were especially striking: Machine learning analysis could predict self-interested decisions within 200 milliseconds, but required nearly 600 milliseconds to predict decisions that ensured fairness for others.
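The timing result comes from time-resolved decoding: a classifier is trained at each moment of the EEG epoch, and the question is when decoding accuracy first rises above chance. The toy sketch below illustrates the idea on synthetic data with a simple nearest-centroid classifier; the study's actual pipeline, classifiers, and data are not reproduced here:

```python
# Toy illustration of time-resolved EEG decoding (NOT the study's pipeline).
# Synthetic epochs carry a class-discriminative signal starting at 200 ms;
# a per-timepoint classifier recovers roughly when decoding becomes possible.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 32, 100
times_ms = np.arange(n_times) * 10            # 0 .. 990 ms in 10 ms steps

labels = rng.integers(0, 2, n_trials)         # 0 = reject, 1 = accept
X = rng.normal(size=(n_trials, n_channels, n_times))
onset = np.searchsorted(times_ms, 200)
X[labels == 1, 0, onset:] += 1.5              # channel 0 carries the effect

def nearest_centroid_accuracy(X_t, y):
    """Leave-one-out nearest-centroid decoding at a single time point."""
    correct = 0
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        c0 = X_t[mask & (y == 0)].mean(axis=0)
        c1 = X_t[mask & (y == 1)].mean(axis=0)
        pred = int(np.linalg.norm(X_t[i] - c1) < np.linalg.norm(X_t[i] - c0))
        correct += pred == y[i]
    return correct / len(y)

accuracy = np.array([nearest_centroid_accuracy(X[:, :, t], labels)
                     for t in range(n_times)])
first_decodable = times_ms[np.argmax(accuracy > 0.6)]
print(f"decoding exceeds 60% accuracy from ~{first_decodable} ms")
```

In the same spirit, the study's finding is that the "self-interest" signal becomes decodable much earlier in the epoch (around 200 ms) than the "fairness for others" signal (around 600 ms).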
The results, Yoder said, help provide insight into the basic neural mechanisms that undergird more complex decisions.
"We have to figure out how to allocate resources fairly," Yoder said. "Understanding how people make those decisions is very important. These calculations can get incredibly complex, and people tend to rely on heuristics, where they'll use shortcuts. And it seems like rejecting unfair offers functions as a heuristic—people can make this decision very, very quickly."