New study shows how your moral behavior may change depending on the context


When it comes to making moral decisions, we often think of the golden rule: do unto others as you would have them do unto you. Yet, why we make such decisions has been widely debated. Are we motivated by feelings of guilt, where we don't want to feel bad for letting the other person down? Or by fairness, where we want to avoid unequal outcomes? Some people may rely on principles of both guilt and fairness and may switch their moral rule depending on the circumstances, according to a Radboud University—Dartmouth College study on moral decision-making and cooperation. The findings challenge prior research in economics, psychology and neuroscience, which is often based on the premise that people are motivated by one moral principle, which remains constant over time. The study was published recently in Nature Communications.

"Our study demonstrates that people may not in fact always stick to the golden rule. While most people tend to exhibit some concern for others, others may demonstrate what we have called 'moral opportunism,' where they still want to look moral but want to maximize their own benefit," said lead author Jeroen van Baar, a postdoctoral research associate in the department of cognitive, linguistic and psychological sciences at Brown University, who started this research when he was a scholar at Dartmouth visiting from the Donders Institute for Brain, Cognition and Behavior at Radboud University.

"In everyday life, we may not notice that our morals are context-dependent since our contexts tend to stay the same daily. However, under new circumstances, we may find that the moral rules we thought we'd always follow are actually quite malleable," explained co-author Luke J. Chang, an assistant professor of psychological and brain sciences and director of the Computational Social Affective Neuroscience Laboratory (Cosan Lab) at Dartmouth. "This has tremendous ramifications if one considers how our moral behavior could change under new contexts, such as during war," he added.

To examine moral decision-making within the context of reciprocity, the researchers designed a modified trust game called the Hidden Multiplier Trust Game, which allowed them to classify decisions in reciprocating trust as a function of an individual's moral strategy. With this method, the team could determine which type of moral strategy a study participant was using: inequity aversion (where people reciprocate because they want fairness in outcomes), guilt aversion (where people reciprocate because they want to avoid feeling guilty), greed, or moral opportunism (a new strategy that the team identified, where people switch between inequity aversion and guilt aversion depending on what will serve their interests best). The researchers also developed a computational moral-strategy model that could be used to explain how people behave in the game, and examined the brain activity patterns associated with the moral strategies.
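To see how a hidden multiplier can pull these strategies apart, here is a minimal illustrative sketch (not the authors' actual model or code; the endowment, believed multiplier, and expectation rule are simplifying assumptions). The investor believes the investment is always multiplied by 4, while the trustee sees the true multiplier. A fairness-driven (inequity-averse) trustee equalizes the real payoffs; a guilt-averse trustee returns what the investor expects based on the believed multiplier. The two prescriptions only diverge when the true multiplier differs from the believed one, which is what lets the game classify a trustee's strategy:

```python
# Illustrative sketch of trust-game moral strategies (assumed parameters).
BELIEVED_MULTIPLIER = 4  # what the investor thinks happens to the money

def inequity_averse_return(investment, true_multiplier, endowment=10):
    """Amount that equalizes the two players' final payoffs.

    Investor ends with: endowment - investment + returned
    Trustee ends with:  investment * true_multiplier - returned
    Setting the two equal and solving for `returned` gives the value below.
    """
    pot = investment * true_multiplier
    return (pot - (endowment - investment)) / 2

def guilt_averse_return(investment):
    """Amount the investor *expects* back: half of the pot the investor
    believes exists (investment * BELIEVED_MULTIPLIER)."""
    return investment * BELIEVED_MULTIPLIER / 2

full_invest = 10
# True multiplier matches belief: both strategies prescribe the same transfer.
print(inequity_averse_return(full_invest, true_multiplier=4))  # 20.0
print(guilt_averse_return(full_invest))                        # 20.0
# Hidden multiplier is lower than believed: the strategies come apart.
print(inequity_averse_return(full_invest, true_multiplier=2))  # 10.0
print(guilt_averse_return(full_invest))                        # 20.0
```

A morally opportunistic trustee, in this toy framing, would simply pick whichever of the two rules yields the smaller return in each round.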

The findings reveal for the first time that distinct patterns of brain activity underlie the inequity-aversion and guilt-aversion strategies, even when the strategies yield the same behavior. For the participants who were morally opportunistic, the researchers observed that brain patterns switched between the two moral strategies across different contexts. "Our results demonstrate that people may use different moral principles to make their decisions, and that some people are much more flexible and will apply different principles depending on the situation," explained Chang. "This may explain why people that we like and respect occasionally do things that we find morally objectionable."


More information: Jeroen M. van Baar et al, The computational and neural substrates of moral strategies in social decision-making, Nature Communications (2019). DOI: 10.1038/s41467-019-09161-6

Provided by Dartmouth College
Citation: New study shows how your moral behavior may change depending on the context (2019, April 19), retrieved 14 October 2019


User comments

Apr 21, 2019
It has long been suggested that people develop through a hierarchy of moral reasoning that runs from the blind egoism of a baby through instrumental egoism and social conformity, to moral relativity and universal principled morality.

At each higher stage, the person uses more nuanced reasoning about morality and gains the ability to switch between and compare strategies. Seen from the lower levels, however, this looks like opportunism, because the observer is judging it from either an egoistic or a social-conformist point of view, and neither permits switching strategies.

That is, at the higher levels the person may acknowledge that there's nothing that says they shouldn't make choices that benefit themselves as long as it's not contradicting anything else. The person allows themselves to be egoist, yet this is easily confused with the egoism of the lower levels where the starting point and the aim of the moral reasoning is to find an excuse or avoid a punishment.

Apr 21, 2019
It's also been suggested that most people never make it up the hierarchy much past half-way, to somewhere between social conformity (peer pressure) and legal conformity (belief in authorities).

Higher order moral arguments are then assimilated through these points of view and treated as moral absolutes at that level. They may understand for example the concept of a social contract, but then turn back and assume it means whatever the church or state, or their neighbors have been saying, and the term becomes just another name for the existing social paradigm. Someone arguing that morality can change according to need is either seen as a hypocritical rulemaker, or a dangerous maverick, and therefore misguided, immoral, or evil.

(Though there are others who draw the opposite conclusion because they find the new proposals to their liking. For example, by arguing moral relativity at the level of simple egoism.)
