Mirror neuron activity predicts people's decision-making in moral dilemmas, study finds

January 5, 2018 by Leigh Hopper, University of California, Los Angeles
Researchers found that the brain's inferior frontal cortex is more active in people who are more averse to harming others when facing moral dilemmas. Credit: UCLA Health

It is wartime. You and your fellow refugees are hiding from enemy soldiers, when a baby begins to cry. You cover her mouth to block the sound. If you remove your hand, her crying will draw the attention of the soldiers, who will kill everyone. If you smother the child, you'll save yourself and the others.

If you were in that situation, which was dramatized in the final episode of the '70s and '80s TV series "M*A*S*H," what would you do?

The results of a new UCLA study suggest that scientists could make a good guess based on how the brain responds when people watch someone else experience pain. The study found that those responses predict whether people will be inclined to avoid causing harm to others when facing moral dilemmas.

"The findings give us a glimpse into what is the nature of morality," said Dr. Marco Iacoboni, director of the Neuromodulation Lab at UCLA's Ahmanson-Lovelace Brain Mapping Center and the study's senior author. "This is a foundational question to understand ourselves, and to understand how the brain shapes our own nature."

In the study, which was published in Frontiers in Integrative Neuroscience, Iacoboni and colleagues analyzed mirror neurons, brain cells that respond equally when someone performs an action or simply watches someone else perform the same action. Mirror neurons play a vital role in how people learn through mimicry and feel empathy for others.

When you wince while seeing someone experience pain—a phenomenon called "neural resonance"—mirror neurons are responsible.

Iacoboni wondered if neural resonance might play a role in how people navigate complicated problems that require both conscious deliberation and consideration of another's feelings.

To find out, researchers showed 19 volunteers two videos: one of a hypodermic needle piercing a hand, and another of a hand being gently touched by a cotton swab. During both, the scientists used a functional MRI machine to measure activity in the volunteers' brains.

Researchers later asked the participants how they would behave in a variety of moral dilemmas, including the scenario involving the crying baby during wartime, the prospect of torturing another person to prevent a bomb from killing several other people, and whether to harm research animals in order to cure AIDS.

Participants also responded to scenarios in which causing harm would make the world worse—inflicting harm on another person in order to avoid two weeks of hard labor, for example—to gauge their willingness to cause harm for moral reasons and for less-noble motives.

Iacoboni and his colleagues hypothesized that people who had greater neural resonance than the other participants while watching the hand-piercing video would also be less likely to choose to silence the baby in the hypothetical dilemma, and that proved to be true. Indeed, people with stronger activity in the inferior frontal cortex, a part of the brain essential for empathy and imitation, were less willing to cause direct harm, such as silencing the baby.

But the researchers found no correlation between people's brain activity and their willingness to hypothetically harm one person in the interest of the greater good—such as silencing the baby to save more lives. Those decisions are thought to stem from more cognitive, deliberative processes.
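
To make the logic of the analysis concrete, here is a minimal, purely illustrative sketch in Python of a brain-behavior correlation of the kind the study reports. All variable names and numbers here are hypothetical stand-ins (the real analysis used fMRI contrast estimates and the participants' actual dilemma responses), but the simulated pattern mirrors the reported result: neural resonance tracks aversion to direct harm, not utilitarian judgments.

```python
# Illustrative sketch only -- not the authors' actual pipeline.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects = 19  # sample size reported in the article

# Hypothetical per-subject measures:
#   ifc_contrast: inferior frontal cortex activation, needle video
#                 minus cotton-swab video ("neural resonance")
#   direct_harm_aversion: rated unwillingness to cause direct harm
#                 (e.g., refusing to silence the baby)
#   utilitarian_score: willingness to harm one for the greater good
ifc_contrast = rng.normal(size=n_subjects)
direct_harm_aversion = 0.6 * ifc_contrast + rng.normal(size=n_subjects)
utilitarian_score = rng.normal(size=n_subjects)  # unrelated by construction

# The reported pattern: resonance correlates with direct-harm aversion...
r1, p1 = stats.pearsonr(ifc_contrast, direct_harm_aversion)
# ...but shows no reliable relation to utilitarian judgments.
r2, p2 = stats.pearsonr(ifc_contrast, utilitarian_score)

print(f"resonance vs. direct-harm aversion: r={r1:+.2f}, p={p1:.3f}")
print(f"resonance vs. utilitarian score:    r={r2:+.2f}, p={p2:.3f}")
```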

The study confirms that genuine concern for others' pain plays a causal role in moral dilemma judgments, Iacoboni said. In other words, a person's refusal to silence the baby is due to concern for the baby, not just the person's own discomfort in taking that action.

Iacoboni's next project will explore whether a person's decision-making in moral dilemmas can be influenced by decreasing or enhancing activity in the areas of the brain that were targeted in the current study.

"It would be fascinating to see if we can use brain stimulation to change complex moral decisions through impacting the amount of concern people experience for others' pain," Iacoboni said. "It could provide a new method for increasing concern for others' well-being."

The research could point to a way to help people with mental disorders such as schizophrenia that make interpersonal communication difficult, Iacoboni said.

More information: Leonardo Christov-Moore et al. Deontological Dilemma Response Tendencies and Sensorimotor Representations of Harm to Others, Frontiers in Integrative Neuroscience (2017). DOI: 10.3389/fnint.2017.00034

10 comments

Eikka
3.8 / 5 (4) Jan 05, 2018
Questions like this are moral false dilemmas, because the question is always set up in a way to force one to act unethically, and if you should find a loophole - like sticking your finger in the baby's mouth for a pacifier - they always find some excuse why you couldn't solve it that way.

So in essence, the people put under the question are not making free decisions - the researchers are deciding for them. By suppressing free thought, the researchers are distorting the decision-making process by forcing the subject to think unethically, when in reality they might be trying to solve the problem in an ethical way. Once primed in this way, no true behaviour can result.

This problem is illustrated by the short fictional story about Mr. Feynman and the Lightbulb: "What would Feynman do?"
https://blogs.msd...nman-do/
Eikka
4.5 / 5 (2) Jan 05, 2018
In other words, when one is primed to think unethically, "I must kill one of them", then the question becomes about how to choose, and that question also involves evaluating the social effects of the choice, i.e. "which choice reflects the best on me". People with stronger social affinity are more likely to go for the socially acceptable choice, like "save the baby" as an intuition.

Empathy involves reasoning about other minds and externalizing your own self, so the observation doesn't actually tell whether the people are feeling empathy for the baby, or empathy for themselves in avoiding social shame.

To test that, you'd have to isolate the social aspect by ensuring that nobody will know whether one chose to kill the baby.
Ojorf
1 / 5 (1) Jan 06, 2018
Questions like this are moral false dilemmas, because the question is always set up in a way to force one to act unethically, and if you should find a loophole - like sticking your finger in the baby's mouth for a pacifier - they always find some excuse why you couldn't solve it that way.

I don't understand you; that would defeat the purpose of the study.
It's not meant to be a study in creative problem solving; they explicitly want to force you to choose the least bad (in your opinion) of only two rotten choices. How else can you study morals? Only ask pleasant questions?
And it works.
Would you kill an innocent person to save two other innocents or 5, 10, 100, 1000?

Different people will give different answers; that's how you study these things.
TheGhostofOtto1923
not rated yet Jan 07, 2018
"Researchers found that the brain's inferior frontal cortex (circled) is more active in people who are more averse to harming others... The study confirms that genuine concern for others' pain plays a causal role in moral dilemma judgments"

-Perhaps they've found the location of the psychopath defect. Next will be to examine psychopath brains for evidence.
Eikka
not rated yet Jan 07, 2018
It's not meant to be a study in creative problem solving

Most people are trying to be good, and suppressing that reaction by force has the effect of fudging up the result.

It no longer represents the person's morality, but their behaviour in an entirely artificial situation where they are being primed to act against their moral instincts. In such situations, people's moral compass tends to go hang.

Unfortunately it's no longer ethically permissible to do tests like the Milgram experiment, or the electric shock test.
TheGhostofOtto1923
not rated yet Jan 08, 2018
Unfortunately it's no longer ethically permissible to do tests like the Milgram experiment, or the electric shock test
Sure it is.

"Haggard and his colleagues wanted to find out what participants were feeling. They designed a study in which volunteers knowingly inflicted real pain on each other...

"In his experiments [2016], the volunteers (all were female, as were the experimenters, to avoid gender effects) were given £20 (US$29). In pairs, they sat facing each other across a table, with a keyboard between them. A participant designated the 'agent' could press one of two keys; one did nothing. But for some pairs, the other key would transfer 5p to the agent from the other participant, designated the 'victim'; for others, the key would also deliver a painful but bearable electric shock to the victim's arm."
Eikka
not rated yet Jan 08, 2018
Sure it is.

Not really. In the Milgram experiment, the test subjects were intentionally made to believe they were seriously harming another person, and told to continue. They were subjected to a genuinely traumatic experience.

The Stanford experiment in turn quickly spiraled down to cruelty and physical violence, with one "inmate" suffering a mental breakdown.

Then there were other questionable studies, like the one where they gathered a church full of priests and gave them all LSD without them knowing they'd been drugged - to see what would happen.
TheGhostofOtto1923
not rated yet Jan 08, 2018
Then there were other questionable studies, like the one where they gathered a church full of priests and gave them all LSD without them knowing they'd been drugged - to see what would happen
- And black soldiers were infected with syphilis. So what?

What do any other immoral experiments have to do with this one?
The Stanford experiment in turn quicky spiraled down to cruelty and physical violence, with one "inmate" suffering a mental breakdown
Just a suggestion... you really ought to post refs and excerpts rather than ad libbing.
rrwillsj
1 / 5 (1) Jan 08, 2018
This article is another step up the technological ladder that could have grave consequences for our society. Devices to read human thought are being researched and developed.

Which leads to several issues we, as a society, will need to address within this century.

Compulsory or voluntary? With all the privately owned firearms available... How do you convince the gun-toters that they should obediently allow themselves to be tested? And what if they decide not to abide by the 'scientific' testing results?

How will we confirm that the results are correct? And not a manipulation by some special-interest group?

When it goes wrong and an innocent party comes to harm, whether by failing to detect a criminal psychopath or by persecuting a sane person with accusations of being a threat to society, how will you determine the error and make amends?
Eikka
not rated yet Jan 19, 2018
So what?

The point is, after the Stanford experiment, ethical standards were raised to exclude such studies. It is no longer permissible to perform them.

It is not possible, or permissible, anymore to design a test where a person is genuinely convinced that they're in command of another person's life or death - so it is not possible to conduct an experiment that would reveal how they really would react to such a moral dilemma.

So these tests devolve into social expectations. The mirror neurons activate when people think of what other people would think of them, reasoning about which choice would reflect best on them from a social point of view, rather than feeling empathy for the potential victims.
