Tuesday, May 18, 2010

Moral/ethical question


The John Templeton Foundation often posts moral/ethical questions. Here is one with a selected response by Christine M. Korsgaard. If you are interested in other responses, just ask.

Does moral action depend on reasoning?

Christine M. Korsgaard

[Christine M. Korsgaard is the Arthur Kingsley Porter Professor of Philosophy and the director of graduate studies in philosophy at Harvard University. Her books include The Sources of Normativity; Creating the Kingdom of Ends; The Constitution of Agency; and Self-Constitution: Agency, Identity, and Integrity.]

Yes, if...

that means that moral action depends on reason. I prefer to put it this way because we do not have to go through a process of reasoning in order to arrive at a view of what morality requires on every occasion. Often, we simply know. But moral action does not merely depend on reason. Moral action is rational action, because the moral law is a law of reason.

Two distinctions will help to clarify this claim. The first is between intelligence and reason. Intelligence is a power that looks outward, to the world around the intelligent animal. Speaking roughly, an intelligent animal is one who learns from his experiences, displays some awareness of what causes what, and can use that awareness to solve problems. Reason, by contrast, looks inward, to what is going on in the animal's own mind. A rational animal is aware of the grounds of her beliefs and actions, of the way in which perception tends to influence her beliefs or desire tends to influence her actions. She is able to ask herself whether the forces that incline her to believe or to do certain things amount to good reasons to believe or do those things, and then to determine what she believes and does accordingly.

Because we can make these assessments, rational animals can exert a kind of control over our beliefs and actions that other animals, even very intelligent ones, cannot. It is because of this control that human beings, probably alone among the animals, are responsible for what we do. Ultimately, reason is expressed in two activities that are probably uniquely human: the self-conscious construction of systems of belief, employing consciously held standards of the adequacy of evidence and the validity of arguments; and the self-conscious determination of our actions, employing consciously held standards of the goodness of ends and of what it is right or wrong to do.

What are those standards? This brings me to a second distinction: between "rationality" as the term is commonly used in the social sciences and "rationality" as the term is sometimes used by philosophers. Social scientists associate "practical rationality" with the idea of maximizing the satisfaction of your own interests or those of some larger group. This introduces a puzzling disanalogy between practical rationality and "theoretical rationality," which refers to thinking logically and consistently.

In contrast with social scientists, many philosophers believe that there are standards of practical reason that govern the determination of our actions in the way that standards of logic and evidence govern the determination of our beliefs. And many philosophers believe that among these standards is a principle of reason that resembles the Golden Rule. Variously formulated, reason requires that you act only on principles that would be acceptable if anyone acted on them, or that you act only on principles that you could use to justify your actions to others, or that you act only on principles that take into account the recognition that you are just one person among others whose concerns are just as important as your own. This kind of principle directs us to act in ways that are intuitively recognizable as moral, but it does so based on a view about what counts as a good reason for action—one that takes the analogous reasons of others into account.

I see no reason to believe that we have to possess free will in some impossible sense in order to be motivated by such a principle; that our actions must be uncaused, for instance. Rather, the special sort of freedom that human beings have is the result of the fact that we can be motivated by such principles. The other animals, I suppose, normally can resist an impulse to act only under the influence of a stronger impulse. But rational animals can resist the impulse to act under the influence of the thought that the action is wrong. That is still a causal influence, but it does not threaten our freedom. After all, why do we care about freedom if it is not because we wish to believe that we can be governed by our own conceptions of what we ought to do?

When I say that moral action is rational, then, I mean that it is action in which this extra level of conscious control, the deliberate regulation by rational principles, is exercised. To put it more simply, moral action is not mere behavior that is altruistic, cooperative, or fair. It is action governed by a conception of the way that you ought to act. To be a moral being is to be capable of being motivated to do what you ought to do because you believe you ought to do it. And that requires reason.

On this view, morality is a high standard, one that human beings are capable of meeting but that we often fail to meet. Does it follow from this that human beings are not really rational? Or that morality is grounded not in reason but in emotional and psychological forces beyond our control? It is also true that human beings believe many absurd, irrational, superstitious, and essentially magical things. Anyone who studies the history of science knows how, even in the midst of careful scientific practice, irrational forces can skew belief. Scientists of the past found firm "evidence" of the superiority of certain races over others, or of men over women. Scientists before Darwin and Wallace ignored the plainest evidence that different species are literally related to each other. Human beings also believe many things that are in fact true on grounds that are inadequate or irrelevant. But we do not conclude from this that our beliefs are never really the result of putting the standards of evidence and argument to work, or that science is not really a product of reasoning and emerges instead from emotional and psychological forces beyond our control. Instead we simply try to do better.

There is no more reason to doubt that human beings can take control of our actions with the aid of practical reason than there is to doubt that we can take control of our beliefs with the aid of theoretical reason. What there is every reason to believe, however, is that acting in a way that is practically rational or moral, like believing in a way that is theoretically rational or scientific, is something that it is very difficult to do.
