
EMOTIONS IN MORAL JUDGMENT

When the heart rules the head

by Rowan Hooper

Are we at the mercy of emotional centers in the brain when we make moral decisions, or can we override them? Is there a “hard-wired,” physiological component to emotions, or are they cultural products, gradually emerging as a result of our upbringing and experience?

The latest research, using functional magnetic resonance imaging, suggests that brain regions associated with emotion are more active when we make personal moral decisions than when we make other types of decisions. In other words, emotions influence how our brains work (a convenient excuse if men want to cry at the end of “Titanic”).

Moral philosophers are fond of using thought experiments to look at ethics and the role of emotions in making moral judgments. A typical dilemma goes like this: A runaway train is heading for five people, who will be killed if it proceeds on its present course. You can save them, but only by diverting the train onto an alternate track, where a single person will be killed. Most people agree that it is morally acceptable to sacrifice the one to save the five.

But what about this: You are standing next to a stranger on a footbridge. Only you can save the five — by pushing the stranger onto the tracks to block the train. This time, most people say they would not do it. Philosophers have scratched their beards over this for years. Why is it OK to sacrifice one for five in the first case, but not in the second?

Researchers led by Joshua Greene at Princeton University have now used FMRI technology to look at these dilemmas. What they find is not a philosophical answer to the puzzle — they don’t say why our morals direct us differently in different situations — but a neurological one. The brain processes the two dilemmas in different ways.

The first situation, the runaway train, engaged regions associated with memory and information processing. The second, the footbridge dilemma, engaged areas of the brain responsible for emotion.

Functional magnetic resonance imaging is carried out by placing the subject in a strong magnetic field. This enables the operator to see which areas of the brain are active by tracking changes in blood flow — and thus oxygenation — within the brain.

Using the technique, researchers found that emotion was “engaged” in the brain only for a certain set of moral dilemmas. Similar to the footbridge dilemma, these included a case of using another person’s organs to save the lives of five others, but causing the death of the donor; and a case involving a lifeboat that will sink unless some passengers are thrown into the sea. Such cases are up-close and personal, or, in philosopher-talk, they deal with “moral-personal conditions.”

Moral-impersonal conditions included the runaway train dilemma; deciding what to do when finding a wallet full of money in the street; and voting for a policy likely to cause more deaths than its alternatives. When subjects considered these, emotion centers in the brain were less active, and short-term memory-processing areas were more active.

We can override our emotions, but we have to work at it. In the footbridge dilemma, people who decided to push the stranger in front of the train took much longer to reach their decision than those who decided against it. “This says that emotion is not just incidental,” said Greene, “but really exerts a force on people’s judgments.”

FMRI has already revealed things about the workings of the brain that, until recently, were thought of as beyond scientific analysis. “FMRI has been important,” said Greene in an e-mail interview, “but the best is yet to come. In the coming years and decades we’re going to see truly novel results, particularly in the social domain.”

Novel results are already coming. In today’s Nature, scientists use FMRI to explain the tingly feeling you get when you meet the eyes of a stranger across a crowded bar.

Knut Kampe and colleagues at University College London’s Institute of Cognitive Neuroscience showed volunteers photographs of different faces, measured changes in brain activity using FMRI, and asked the subjects to rate the photos for attractiveness. When the gaze of a face of the opposite sex was directed at the observer, the “babe value” given by the volunteer correlated positively with activity in the ventral striatum, a brain area linked to the anticipation of reward. Hence the tingly feeling.

When the eyes of the photo did not meet those of the observer, the correlation was reversed.

“Returned eye gaze from an attractive face represents a more favorable result than expected,” the researchers write. “Likewise, missing eye contact with an unattractive face may be a relief.”

Scientists seem to be covering the sort of ground once monopolized by romantic pulp novels loved by grandmothers. Eyes meeting across the bar? We can explain it, say the scientists.

But, say Princeton’s brain team, the study raises a more general question: Will a greater understanding of the mechanisms that give rise to our moral judgments alter our attitudes toward the moral judgments we make?

Philosophers cannot yet answer. “Science by itself makes no value judgments,” emphasized Greene. “It does not tell us what is good or bad, what is worth caring about or casting aside. The people who find science ‘dehumanizing’ only do so because they have a particular view about what humans are supposed to be like. My advice to them? Keep an open mind. Chances are they’ll find that what they really care about remains after science has had its say.”

Even philosophers would surely agree, however, that crying at “Titanic” — whether male or female — is only morally acceptable if someone is chopping an onion under your nose.