AIP 113 Everything You Know About Morality Is (Probably) Wrong
Growing up, morality was quite simple to me: I, with my enlightened liberal views, was right, and everyone else was simply misguided or on drugs.
I was raised in a small rural college town with two liberal parents. It was as if I were eating Liberal-O's for breakfast cereal every morning.
Coming to Cornell University, my liberal views only became stronger. The university's slogan is literally "any person, any study." I interacted with people of all races, cultures, and goals, but the environment still favored more liberal views.
When I did interact with conservatives or those outside the liberal sphere, it was a bizarre experience. My Christian friends would make statements like "life is sacred, even embryos" and "homosexuality isn't natural." When visiting southern states, I'd hear statements like, "If they come for my guns, I'll make 'em run" (usually they'd make a shooting motion with their hand). Interacting with these sentiments was like communicating with an alien race.
Even if I tried my best to understand, listening to the reasoning behind every argument they made, the fundamental value differences were so wide we might as well have been split by an entire galaxy.
My college years went on, and I made friends of all sorts of social identities, including people who were asexual, non-binary, Indian (yup, somehow they didn't exist in my grade back home), and more. In effect, I became more and more passionate about LGBTQ+ rights, stopping oppression overseas, and other liberal views.
The more liberal I became, the more it had this inevitable feel to it, as if all the moral reasons were coming together into a perfect prism, like solving a Rubik's Cube for the first time. That's why I was liberal, was it not? Because my reasoning was better.
Then I came across moral psychology researcher Jonathan Haidt's book, The Righteous Mind.
Haidt had a very different view of how morality worked. Throughout most of human history, we have assumed our moral judgments are largely a product of rational thought coming to a conclusion. We are born as blank slates and come to our moral beliefs through careful analysis over time. Philosophers like Plato, Kant, and Bentham saw reason as we see Keanu Reeves: the supreme form of our humanity.
Jonathan Haidt doesn't see it this way. In his book, he argues morality works through the social intuitionist model.
The Social Intuitionist Model
The social intuitionist model theorizes that whenever we come across a moral issue, we first form intuitions about how we feel regarding that issue, and then come up with post hoc moral reasoning for why we believe it.
In effect, reasoning supports intuition, not the other way around.
To demonstrate this, he came up with clever stories designed to trigger our moral intuitions while offering no easy rational justification for why they were wrong. For example, what do you think about this story:
"Julie and Mark are brother and sister. They are traveling together in France on summer vacation from college. One night, they are staying alone in a cabin near the beach. They decide that it would be interesting and fun if they tried making love. At the very least, it would be a new experience for each of them. Julie was already taking birth control pills, but Mark uses a condom, too, just to be safe. They both enjoy making love, but they decide never to do it again. They keep that night as a special secret, which makes them feel even closer to each other. What do you think about that? Was it okay for them to make love?"
If you're like most people, your response is "absolutely not." And yet, from a liberal standpoint, if you try to come up with a justification for that response, you'll struggle immensely.
There's no harm being done here. It explicitly says that in the scenario. There's no treading on fairness or liberty. Yet, for most people, something still feels wrong about it. We'll get to later what one could say is wrong about it, but for now, know that from a liberal perspective, it's very hard to argue against.
Haidt described this phenomenon as moral dumbfounding: our intuition tells us a moral issue is wrong, but we can't come up with post hoc reasons to justify that feeling. If our morality were primarily based on reason, this wouldn't happen. We'd also change our moral opinions much more often, but most of us don't.
If morality is more about intuition than reasoning, it becomes very important to ask: what makes up our intuition?
Part of it, thankfully, does come through the reasoning process, some more than others. The more we reason, the more our beliefs about the world get baked in and become intuitive reactions to moral problems. But two other parts make up a large chunk of our moral pie.
Firstly, our morality is built in huge part by what is moral to those around us: the social part of the social intuitionist model.
This is because a large purpose of morality is to come across as moral to those around us. We deploy our moral reasoning skills to support our team and to appear committed to it. This helps us make friends, benefit from reciprocity, and more.
To truly appreciate this point, consider how this was being argued as far back as the Ancient Greeks. Socrates believed people can be genuinely motivated by moral principles themselvesâthat we can come to value justice intrinsically through philosophical reasoning and understanding. Glaucon (Plato's brother), however, believed people were moral primarily for strategic reasonsâwe act justly because of social reputation and consequences, not because we truly value justice for its own sake. If you removed the consequences, many people would act selfishly.
Intuitively, I think we all want to believe Socrates. I like to think I'm a pretty good chap. But the reality is we're probably more like 75% Glaucon, 25% Socrates. Most of us hold strong enough values in some areas that we would still do the right thing in private. And yet, studies show we are vastly more likely to cheat on a test in dim lighting[1], and that we cheat less when there is a cartoonlike image of an eye nearby[2], or when the concept of God is activated in memory merely by asking people to unscramble sentences that include God-related words[3].
None of this would make sense if morality were rooted primarily within ourselves.
Secondly, a big part of our morality is innate.
Through millions of years of evolution, certain moral foundations have been built to help spread our genes to the next generation. This doesn't mean we can't shift those things. Nature provides a first draft, which experience then revises. "Built in" doesn't mean unmalleable. It means organized in advance of experience.
To be clear, I'm not saying arguments can't change people's minds; I was convinced eating peanut butter for every meal was actually not a healthy idea. We'll get to later how we can open our minds and change others in a world of social intuitionist morality. I'm saying that reasoning plays much less of a role in morality than many people realize. And appreciating this can help us not only open our minds but empathize more in our terribly polarized climate.
We can learn to value reasoning over intuition more. I like to think I've moved more toward reasoning than most people. I've convinced myself of several moral beliefs I didn't find intuitive. Perhaps I just reasoned my way into new intuitions, but at least that's better than going with my first gut instinct.
In any case, it's paramount for us to understand what these built-in moral foundations are as it will help us understand what our own moralities are constructed out of, why conservatives have an advantage in moral discussions, and ultimately how we can navigate a world of such drastic moral differences.
To look at what these foundations are built on, we must move to Moral Foundations Theory.
Moral Foundations Theory
Moral Foundations Theory, created by Jonathan Haidt and colleagues, proposes that human moral reasoning stems from several innate, intuitive foundations including care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, sanctity/degradation, and liberty/oppression, which explain differences in moral judgments across cultures and political ideologies. Let's explore the evolutionary underpinnings of each and how they manifest in our moral intuitions.
Care/Harm Foundation: evolved in large part through our need to help those in danger or sufferingâparticularly childrenâso our species wouldn't die out. It makes us care for others, especially those in need, including animals.
Fairness/Cheating Foundation: evolved in response to the adaptive challenge of reaping the rewards of cooperation without getting exploited. It makes us sensitive to indications that another person is likely to be a good (or bad) partner for collaboration and reciprocal altruism. It makes us shun or punish cheaters and reward proportionally to what was put in.
Loyalty/Betrayal Foundation: evolved in response to the adaptive challenge of forming and maintaining coalitions. It makes us sensitive to signs that another person is (or is not) a team player. It makes us trust and reward such people, and it makes us want to hurt, ostracize, or even kill those who betray us or our group.
Authority/Subversion Foundation: evolved in response to the adaptive challenge of forging beneficial relationships within social hierarchies. It makes us sensitive to signs of rank or status, leading us to show respect to higher-ranked individuals and to put more weight on their opinions. It also makes us sensitive to signs that other people are (or are not) behaving properly, given their position.
Sanctity/Degradation Foundation: evolved initially in response to the adaptive challenge of the omnivore's dilemma: we can eat almost any calorie source, but each unfamiliar food carries the risk of killing us. As society progressed, this adaptive challenge transformed into the broader challenge of living in a world of pathogens and parasites. It includes the behavioral immune system, which can make us wary of a diverse array of symbolic objects and threats. It makes it possible for people to invest objects with irrational and extreme values, both positive and negative, which are important for binding groups together.
Liberty/Oppression Foundation: evolved in response to the adaptive challenge of not being overly restricted by those higher up in social hierarchy or the systems they create. It makes us want to have equal rights and be free from infringements on our liberties.
With an understanding of the six moral foundations, we can turn back to my childhood.
Why I Found Conservative Morality So Dumbfounding
One of the most interesting findings from Haidt's research is that liberals tend to prioritize three of the moral foundations over the other three: care/harm, fairness/cheating, and liberty/oppression. Conservatives, in contrast, weigh all six foundations about equally.
Look at the graph below; the more conservative people rate themselves, the more evenly the foundations converge in relevance (this study was done before the sixth foundation, liberty/oppression, was defined, so it doesn't show on the graph).

The first evidence for Moral Foundations Theory. (Adapted with permission from Graham, Haidt, and Nosek 2009, p. 1033; published by the American Psychological Association.)
This was why I found conservative morality so dumbfounding. I was tasting the moral cuisine with three taste receptors, whereas they were using all six! Anytime they reasoned using one of the three receptors I couldn't taste, it was utterly mind-boggling. And since my intuitions were based largely on those three receptors, it was extremely difficult for me to reason my way into understanding the others.
It's essential to understand that these foundations aren't rigid categories; they're dimensional.
I can and do value a foundation like authority to some degree, but relative to the average American conservative, my valuation is so much lower that it can still feel like we're speaking a different language. Yet, in the other direction, some liberals are so skeptical of authority it can feel crazy to me.
A great example of this happened while I was teaching a backcountry cooking class for Cornell Outdoor Education. Two co-facilitators and eight students were camping out at the Fox Fire lean-to when we came across a dropped vape on the ground. I spotted a hunter with a rifle slung over his back walking farther down the path and told the rest of the group I was going to give it back to him.
To my great surprise, a large portion of the group thought it was a terrible idea. "He has a gun," they said. As a hunter myself, I found it utterly bizarre. What did they think was going to happen? I'd walk up to the hunter and say, "Hello sir, here's the vape you dropped." The hunter would turn around, thank me, and then shoot me point blank with his rifle.
This example highlights how relative the moral foundations are and how much they can make other people's beliefs feel so foreign.
It's a lack of appreciation for these differences that is creating so much polarization in our society. In Haidt's words, "morality binds and blinds. It binds us into ideological teams that fight each other as though the fate of the world depended on our side winning each battle. It blinds us to the fact that each team is composed of good people who have something important to say" (Haidt, 2013, p. 365).
Once I realized this, I became extremely interested in investigating how liberals and conservatives valued the foundations differently because it might help mend that polarization. Here's what I found.
Care/Harm Foundation: both liberals and conservatives value care/harm, but liberals simply care more.
Fairness/Cheating Foundation: both liberals and conservatives care about fairness/cheating, but conservatives put more emphasis on proportionality (people getting out what they put in), while liberals care more about equal outcomes and opportunity (depending on the liberal, of course).
Loyalty/Betrayal Foundation: liberals are ambivalent; conservatives value it a lot. They value loyalty to one's group, nation, and more.
Authority/Subversion Foundation: liberals are ambivalent; conservatives value it a lot. They value respect for one's elders, parents, leaders, and more.
Sanctity/Degradation Foundation: liberals are ambivalent (say it with me); conservatives value it a lot. They value purity of soul, mind, and more.
Liberty/Oppression Foundation: both liberals and conservatives care about liberty/oppression, but in different ways. Liberals care more about equal rights for everyone, especially oppressed groups and minorities. Conservatives care more about traditional notions of liberty, like the right to be left alone and freedom from government infringement.
Why Conservatives Have An Advantage In Moral Matters
As we see from the studies on moral reasoning, conservatives tend to use all six moral foundations, whereas liberals prioritize just three: care/harm, fairness/cheating, and liberty/oppression.
If each moral foundation were an ingredient in a cooking competition, conservatives would have access to many more flavors and recipes than liberals. And we see this effect play out in our politics. In Haidt's words, "Moral psychology can help to explain why the Democratic Party has had so much difficulty connecting with voters since 1980. Republicans understand the social intuitionist model better than Democrats do. They have a better grasp of Moral Foundations Theory; they trigger every single taste receptor" (Haidt, 2013, pp. 213–214).
What's going on with liberals? Why do they (and why did I, until recently) believe speaking to just these three moral foundations is the way to go?
For most of the last century, moral research has been done on WEIRD societies: Western, Educated, Industrialized, Rich, and Democratic. This specific group tends to be more left-leaning and to prioritize autonomy over the other moral domains. So, for much of the time morality has been taught in higher education (which is dominated by liberals), only those three moral foundations have been promoted. In effect, liberals get the perception that this is all there is to morality and are dumbfounded when other people think differently.
And other people do think differently. As a generalizable sample of how people see morality, WEIRD societies are among the worst you could use. Across the world, the six moral foundations are promoted in different societies in completely different ways. Just as the cuisines of the world differ profoundly, so do their moral palates.
To be clear, I'm still more on the left than the right. I believe the three moral foundations prized by liberals are more important than the other three, but I think liberals go too far in how little they value the others as a byproduct. Reading this book has made me appreciate the moral foundations underpinning conservatives and people of other political ideologies. I actually took the moral foundations test after reading the book, and this is what I got:

If you want to take the test yourself, you can do so here.
Unfortunately, I don't have my results from before reading the book, but I can make guesses. My values of purity, authority, and loyalty would all be low, and everything else would be about the same.
This leads us to our last question: how can we navigate a world of such differing morality with this in mind?
How Can We Navigate Morality With This In Mind?
Let's start with what you shouldn't do: argue senselessly.
Haidt uses the analogy of an emotional dog and its rational tail to describe why you shouldn't.
According to him, "The social intuitionist model offers an explanation of why moral and political arguments are so frustrating: because moral reasons are the tail wagged by the intuitive dog. A dog's tail wags to communicate. You can't make a dog happy by forcibly wagging its tail. And you can't change people's minds by utterly refuting their arguments." (Haidt, 2013, p. 56).
Again, arguments can change people's minds. But argumentation is most effective when we first make the emotional dog happy. There are three ways we can do this.
Firstly, build open-mindedness and curiosity.
A huge reason for our moral polarization is that we simply don't try to understand the other side. Often, even when we do, our side has created a bastardized straw-man version of what the other side believes, and we belittle that instead.
If you want to truly understand another group, follow the sacredness. What do they treat with the same reverent love a Christian reserves for the cross? As a first step, think about the six moral foundations, and try to figure out which one or two are carrying the most weight in a particular controversy.
Put yourself in their shoes. What might have led them to believe that? Why is that moral foundation important to them?
Secondly, empathize.
If you really want to open your mind, open your heart first.
Empathy is an antidote to righteousness, although it's very difficult to empathize across a moral divide. If you can have at least one friendly interaction with a member of the "other" group, you'll find it far easier to listen to what they're saying, and maybe even see a controversial issue in a new light. You may not agree, but you'll probably shift from Manichaean disagreement to a more respectful and constructive yin-yang disagreement.
You can really go for it with this step by making friends with people of differing morals. Of course, be mindful of your well-being while doing this. There's a huge difference between someone who holds different values and someone who acts those values out in harmful ways. For example, I have a number of great Christian friends because even though their values differ from mine, they don't act them out in ways that actively hurt me or my friends. Being friends with a fundamentalist Christian who routinely tells me I'm going to hell, or who consistently takes part in protests against abortion, would be much harder for me.
Despite these caveats, making friends with those of differing opinions is, in my experience, by far the most effective way to open your mind. One of the main reasons, if not the main reason, morality exists is its social function. Having a meaningful relationship with someone you care about who holds differing morals will do wonders in helping you understand where the other side is coming from.
Thirdly, create a group mutually interested in breaking each other's self-deceptions.
Individuals can't be expected to overcome their own biases and intuitions. We are much better at spotting other people's deceptions than our own. So, if your goal is to find the truth, a group of people bound together by a mutual interest in spotting and challenging each other's deceptions is best.
This single idea is the foundation of science, corporate brainstorming sessions, hiring committees, and more. This is why diversity is integral: people of vastly different beliefs and backgrounds will better spot each other's deceptions.
You can do this at the level of your friend group, seeking people of differing values and interests. You can do it at the level of your community seeking to join groups that will challenge each other. You can also do it at the level of your town, city, or nation, seeking to live in places with a culture of dialogue.
If you can do all three of these things, especially the first two, you're much more likely to sway someone with rational arguments. The emotional dog has been appeased, and arguments might get its tail wagging naturally.
Morality For A Better Society
We've walked a complex moral landscape together, yet the journey's most profound insight may be this: our deepest moral disagreements aren't battles between good and evil, but conversations between different moral languages. Each side speaks fluently in their native moral foundations while struggling to comprehend the others.
When we stop trying to forcibly wag the emotional dog's tail and instead connect heart-to-heart first, we might discover that those alien minds across the galaxy aren't so incomprehensible after all. They're just people with different taste receptors in their moral cuisineâpeople who, like us, are simply trying to create meaning in a complicated world.
The question isn't whether we can eliminate moral differences, but whether we can learn to navigate them with curiosity instead of contempt. In that space between understanding and disagreement lies our best hope for a society that bends toward both justice and unity.
What's the next step? Perhaps taking the moral questionnaire yourself, and then talking about your results with a friend, hopefully someone you think might have some differences. Just make sure to appease their intuitions first.
References
- Zhong, C.-B., Bohns, V. K., & Gino, F. (2010). Good lamps are the best police: Darkness increases dishonesty and self-interested behavior. Psychological Science, 21(3), 311–314. https://doi.org/10.1177/0956797609360754
- Haley, K. J., & Fessler, D. M. T. (2005). Nobody's watching? Subtle cues affect generosity in an anonymous economic game. Evolution and Human Behavior, 26(3), 245–256. https://doi.org/10.1016/j.evolhumbehav.2005.01.002
- Shariff, A. F., & Norenzayan, A. (2007). God is watching you: Priming God concepts increases prosocial behavior in an anonymous economic game. Psychological Science, 18(9), 803–809. https://doi.org/10.1111/j.1467-9280.2007.01983.x