Friday, April 2, 2010

Magnets Can Alter Moral Judgement By Changing Brain Activity

US scientists have discovered that applying a magnetic field to a particular place on the scalp can alter people's moral judgement by interfering with activity in the right temporo-parietal junction (TPJ) of the brain. They said their finding helps us better understand how the brain constructs morality.

You can read about the study, led by researchers from the Massachusetts Institute of Technology (MIT) in Cambridge, Massachusetts, in the 29 March online issue of the Proceedings of the National Academy of Sciences (PNAS). The research was led by Dr Rebecca Saxe, assistant professor of brain and cognitive sciences at MIT.

Lead author Dr Liane Young, a postdoctoral associate in Saxe's department, told the media that because people are normally very confident and consistent in making moral judgements, it comes as a surprise to learn that their ability to do so can be altered like this.

"You think of morality as being a really high-level behavior. To be able to apply (a magnetic field) to a specific brain region and change people's moral judgments is really astonishing," said Young in a statement.

She said the study reveals "striking evidence" that the right TPJ, which sits on the surface of the brain, above and behind the right ear, plays a crucial role in making moral judgements.

When we make moral judgements about other people we often need to infer their intentions. For instance, when a hunter shoots a fellow hunter on a hunting trip, did he mistake his colleague for prey, or did he act out of secret jealousy?

This ability has been termed "theory of mind", that is the ability to attribute mental states such as beliefs, intentions, and other qualities to oneself and others, and also to understand that other people's mental states can be different to one's own.

Saxe identified the role of the TPJ in theory of mind and wrote about it in her 2003 PhD thesis. Since then she has been using functional magnetic resonance imaging (fMRI) to show that the right TPJ is active when people are asked to make moral judgements that require them to think about the intentions of others.

Other studies have also shown that the TPJ is highly active when we think about other people's intentions, their beliefs and their thoughts.

For this study, Saxe, Young and colleagues wanted to investigate what might happen if they could actually disrupt activity in the right TPJ.

In this case, instead of the usual fMRI, they carried out two sets of experiments using a non-invasive method called transcranial magnetic stimulation (TMS), which applies a magnetic field through a small area of the scalp, inducing weak electric currents that temporarily stop nearby brain cells from firing normally.

They found that this was enough to impair subjects' ability to make moral judgements that involve an understanding of other people's intentions: for example, a failed murder attempt.

In the first set of "offline stimulation" experiments, they exposed volunteers to the TMS method for 25 minutes and then asked them to take a test where they read about several scenarios and then had to judge the actions of the characters portrayed on a scale of one to seven (from "absolutely forbidden" to "absolutely permissible").

For example, in one scenario they were asked to judge how permissible it would be for a man to allow his girlfriend to walk across a bridge he knew to be unsafe, even if she does eventually cross it safely. In such a scenario, judging the man solely on the outcome would hold him blameless, even though he apparently intended harm.

In the second set of "online stimulation" experiments, the volunteers underwent a 500-millisecond burst of TMS at the point when they were asked to make a moral judgement.

In both experiments, Saxe, Young and colleagues found that disrupting the right TPJ resulted in volunteers being more likely to judge failed attempts to harm as morally permissible.

They suggested this was because the volunteers were relying more on information about the outcome than on inferences about intention, since the process that normally supplied them with intention information was disrupted by the electric currents induced by the TMS.

"It doesn't completely reverse people's moral judgments, it just biases them," explained Saxe.

The researchers also found that when they applied TMS to a brain region near the right TPJ, the volunteers' judgements were nearly identical to those of volunteers who received no TMS at all.

They concluded that:

"Relative to TMS to a control site, TMS to the RTPJ caused participants to judge attempted harms as less morally forbidden and more morally permissible. Thus, interfering with activity in the RTPJ disrupts the capacity to use mental states in moral judgment, especially in the case of attempted harms."

When we judge other people, understanding their intentions is just one aspect of what we take into account. We also assess things like their previous record, what we understand about their desires, and what constraints they might be under. We are also guided by our own ideas about loyalty, fairness and integrity, said Saxe.

Moral judgement is not a single process, even though it might feel like it, explained Saxe, who described it as more a mixture of "competing and conflicting judgments, all of which get jumbled into what we call moral judgment".

Dr Walter Sinnott-Armstrong, professor of philosophy at Duke University, who was not involved in this research, said that by going beyond fMRI, the study marks a major step forward for the field of moral neuroscience:

"Recent fMRI studies of moral judgment find fascinating correlations, but Young et al usher in a new era by moving beyond correlation to causation," said Sinnott-Armstrong.

The National Center for Research Resources, the MIND Institute, the Athinoula A. Martinos Center for Biomedical Imaging, the Simons Foundation and the David and Lucile Packard Foundation funded the study.
