What is universal about morality is morality itself: cross-culturally, people think that some behaviors are wrong, and so deserve disapproval and punishment.6 However, while the capacity for moral judgment is universal, there is tremendous variability in which behaviors people think are immoral.5 We suspect that debates about moral universals are actually motivated by a different question: Can people reach a consensus about which actions should be morally prohibited?

Moral consensus is not only an abstract philosophical matter; people worry about it for good reason. When a community disagrees about the moral laws of the land, its members can no longer rely on the rules to settle conflicts. Disputes are more likely to escalate, with costly consequences for everyone.2,3 Moreover, rival coalitions struggle to impose their rules on those who disagree, further fomenting costly fighting.10

In this context, what is universal about morality is disagreement. In all societies, people disagree, often violently, about which actions are immoral. For instance, a prominent politician in India recently offered a $1.5 million bounty for the beheading of a popular actress who portrayed a Hindu queen in a way he considered morally offensive. Artistic expression or capital offense? Unfortunately, these kinds of moral disagreements are universal.

Human moral judgment allows virtually any action to become a prohibited and punishable offense. A key reason is that individuals need to keep track of the moral rules in a community so they can avoid crossing moral boundaries. Because moral rules are variable and changing, people need a flexible moral psychology that can moralize whichever actions are taboo in a given group.9 However, this does not mean that people only passively accept their group’s rules; they also actively advocate for the moral rules they prefer, especially when they can find supporters to join their cause.7

Many moral prohibitions have strategic consequences because they constrain some people more than others.4 When a particular action is punished – such as same-sex marriage, eating beef, practicing black magic, disobeying authority, or conducting stem cell research – the people who want to take that action are made worse off, while those who do not are unaffected or even gain a relative advantage. Given these strategic consequences, people tend to fight to sway the rules that affect them the most.11

This means that people’s efforts to persuade a community to adopt a moral rule – thou shalt not X – are essentially efforts to coerce a subset of the community into a moral regime they would rather not be in. In practice, then, a society’s morality creates a form of mob rule in which moral prohibitions are determined by the most powerful coalition, often the one backed by the more numerous faction. Majoritarian political regimes, whatever their virtues, allow majorities to coerce minorities with the sticks that moral rules afford them.

Amid all of this conflict, however, our moral psychology does have elements that can promote consensus. When almost everyone benefits from a moral prohibition, it generally becomes a matter of consensus because everyone ends up advocating for the same rules. This applies to the most universal prohibitions such as those against (unprovoked intentional) killing, harming, stealing, and lying. These agreeable morals can be leveraged to build consensus.

This idea underlies utilitarian philosophy, which holds that a rule should be adopted if it yields net benefits to society.1 Utilitarianism essentially attempts to build consensus around the concept of welfare while setting aside the many contentious moral rules about other matters, such as taboos surrounding food, sex, or supernatural beliefs.

We can find a path to moral consensus by focusing on our shared concern for people’s welfare rather than on contentious and divisive moral principles. All normal humans have at least some sense of compassion and concern for others’ welfare. Importantly, our sense of compassion is psychologically distinct from our moral principles and prohibitions.2 Contrary to traditional views, people do not need moral rules to care about others’ well-being. Instead, we should use our universal sense of compassion to guide the choice of moral prohibitions toward greater consensus.

This idea differs from what we typically see in politics, where politicians appeal to coalitions and moral principles, emphasizing who is right and who is wrong.8,11 In contrast, leaders who wish to build a broad consensus should emphasize how their policies will improve people’s welfare, especially by meeting people’s most pressing needs.

References

  1. Bentham, J. (1789). An Introduction to the Principles of Morals and Legislation. London: T. Payne and Son.
  2. DeScioli, P., & Kurzban, R. (2009). Mysteries of morality. Cognition, 112(2), 281-299.
  3. DeScioli, P., & Kurzban, R. (2013). A solution to the mysteries of morality. Psychological Bulletin, 139(2), 477-496.
  4. DeScioli, P., Massenkoff, M., Shaw, A., Petersen, M. B., & Kurzban, R. (2014). Equity or equality? Moral judgments follow the money. Proceedings of the Royal Society of London B: Biological Sciences, 281(1797), 20142112.
  5. Haidt, J. (2007). The new synthesis in moral psychology. Science, 316(5827), 998-1002.
  6. Hauser, M. (2006). Moral Minds: How Nature Designed Our Universal Sense of Right and Wrong. Ecco/HarperCollins Publishers.
  7. Kurzban, R., Dukes, A., & Weeden, J. (2010). Sex, drugs and moral goals: Reproductive strategies and views about recreational drugs. Proceedings of the Royal Society of London B: Biological Sciences, 277(1699), 3501-3508.
  8. Petersen, M. B. (2016). Evolutionary political psychology. In D. M. Buss (Ed.), The Handbook of Evolutionary Psychology (2nd ed., Vol. 2, pp. 1084-1102). Wiley.
  9. Rozin, P. (1999). The process of moralization. Psychological Science, 10(3), 218-221.
  10. Tooby, J., & Cosmides, L. (2010). Groups in mind: The coalitional roots of war and morality. In H. Høgh-Olesen (Ed.), Human Morality and Sociality: Evolutionary and Comparative Perspectives (pp. 191-234). New York: Palgrave Macmillan.
  11. Weeden, J., & Kurzban, R. (2014). The Hidden Agenda of the Political Mind: How Self-Interest Shapes Our Opinions and Why We Won't Admit It. Princeton, NJ: Princeton University Press.

This article is from TVOL's project titled “This View of Morality: Can an Evolutionary Perspective Reveal a Universal Morality?”