Sunday, October 2, 2011

The essence of moral reasoning, part 2: making moral decisions

In my previous post on morality, I argued that simply saying that people are inclined to be "evil" fails as an explanation for human cruelty – it not only fails to take into account certain facts about our biological inclination toward empathy, but it leaves important questions about how we make moral decisions – and how our empathetic hard-wiring can be eroded – unanswered. I touched briefly on the fact that many of our "moral" decisions are made impulsively, and we reason about them retroactively. In this post, I want to examine more thoroughly some of the processes involved in making moral decisions – including how our empathetic nature can both influence us and be eroded.


Empathy on the brain


In his excellent book The Science of Evil, Cambridge psychologist Simon Baron-Cohen discusses the many regions of the brain that are associated with our ability to empathize with others. What's fascinating is that damage to specific regions of the brain causes empathy to malfunction in highly specific ways. Damage one part of the brain, and we can lose our ability to feel any empathy at all (sociopathy); damage a different part, and we may be able to feel empathy but be confused about how to react appropriately; damage yet another part, and we may have difficulty reading facial expressions and accurately inferring others' emotional states.

The list goes on, but what this shows is that the physical state of our brain has a powerful effect on our ability to make moral decisions. If we can't feel empathy or accurately understand others' emotional states, for example, then our decisions will be more utilitarian and egocentric. This is not speculation or hypothesis, either – it is well documented in the behavior of those suffering from personality disorders.

The causes of personality disorders appear to be combinations of nature and nurture – something which may seem obvious, but is now well substantiated by research. Abuse and neglect during childhood can have an irreversible effect on the development of the brain, one which manifests as antisocial behavior during adolescence and adulthood. This has important implications: the deck is not stacked the same for everyone. Empathy exists on a bell curve, with individuals at one extreme being highly extroverted, sensitive, and compassionate, and individuals at the other extreme being literally incapable of feeling empathy. It's beyond dispute: our biology has a powerful influence on how we make moral decisions.


The brain in the environment

Of course, our brains don't just sit in vats all day; they're constantly bombarded with sensory input from our environment. We live in an environment in which competition over limited resources influences our moral reasoning. Much like soldiers trained to dehumanize their enemy, we can be conditioned to overcome our basic human empathy. And even if we fall in the normal range of the empathy bell curve, desperate situations may lead us to temporarily repress our empathy for others.

Our moral decisions take two forms: the impulsive, emotionally driven reactions I mentioned previously; and moral reasoning, in which we attempt to dispassionately judge what is fair. The famous Trolley Problem illuminates the conflict between these two modes of moral decision-making:

In the first scenario, subjects are presented with a trolley on course to kill five people; with a flip of a switch, the trolley can be re-directed to another track, where one person will be killed. Most individuals immediately judge that it is better to throw the switch and kill one instead of five.

In the second scenario, there is again a trolley headed toward five unfortunate souls. This time, you are on a bridge above the track, where a large man is standing beside you. If you push him off the bridge onto the track, his body will stop the trolley and save the five, but he will be killed.

In the second scenario, most respondents hesitate – and it is precisely that hesitation which betrays the fact that our moral decisions are not entirely rational (quite the contrary, in fact). In utilitarian terms, the two situations are identical; but the second scenario requires us to directly harm a bystander, which causes the empathic circuitry in our brains to remind us that it is unfair to hurt another human being. Interestingly, people low on the empathy bell curve – whether from underdevelopment of or damage to the parts of the brain responsible for empathy (their "empathy circuitry") – do not hesitate, since for these individuals all moral decisions are primarily utilitarian.

So we now know that emotions heavily influence our moral decisions, and that individuals whose empathy circuitry has been adversely affected by genetics and/or their environment will not make the same moral decisions as most of us, simply because they do not feel a compulsion to nurture others, to ease or prevent their suffering, or to otherwise respond to their distress. They are unlikely to value fairness or self-sacrifice, as their inability to empathize with others creates an egocentric form of moral judgment. Since moral norms are concerned with how we ought to behave, it's vital that we recognize the pivotal role our biology plays in shaping our relationships with others, so that we may properly understand how moral prescriptions are derived.

In the final post in this series, I'll get away from the emotional component and talk about moral reasoning specifically, and how we can use information to make non-arbitrary moral judgments.
