In The Geek Way, Andrew McAfee quotes Dan Williams as writing,
A good argument for strong norms of free expression is not that it leads to truth, but that it’s a form of system design that protects against the harms produced by small but well-organised groups that impose self-serving orthodoxies and taboos on the broader population. When that happens, you can’t challenge the orthodoxies without risking social punishment. By upholding norms in favour of free expression, however, you lower the scope & costs of such punishment.
The quote is from a tweet, and it left me curious to read more of Williams. In an essay he posted called “Socially Adaptive Minds,” Williams wrote,
Unlike the merely natural world, the social world that we inhabit often rewards epistemic irrationality and punishes epistemic rationality.
If we did not live in a social world, it probably would pay to be rational. The survival value of seeing the world as it is would have inclined humans to evolve into pure truth-seekers.
But in the social world, there can be a payoff to steering away from the truth. Williams writes,
For example, there is what people routinely believe: from the religious and ideological myths, superstitions, and conspiracy theories that defy all evidence and reason to the more mundane self-serving and self-aggrandizing beliefs that people gravitate towards. These latter illusions result from a bizarre feature of the human mind: in addition to lying to others, we also lie to ourselves – a quirk of human psychology sustained by those psychological defense mechanisms that so pre-occupied the great psychoanalysts: denial, repression, deflection, and more.
…We form such beliefs not because they are best supported by the available evidence but because they enable us to achieve various social goals. They enable us to fit in and stand out. They signal our commitment to cooperative social norms and advertise our loyalty to our coalition. They enable us to persuade other agents of our social value. Of course, we, the believers, are blissfully oblivious to such social functions of our beliefs, not least because we always have a rationalisation at hand when somebody challenges them. According to this research, however, this obliviousness is itself a kind of self-deception.
…much of human reasoning is plausibly socially adaptive reasoning. According to an increasingly influential position in psychology, the self-serving and self-justifying character of human reason is a feature, not a bug, enabling us to achieve various social goals such as persuading our peers, defending our reputations, and cheering on our tribes that are often in tension with epistemic goals.
In short,
the chief thing that humans need to do is get along and get ahead in social games in which knowledge and rationality are often losing moves.
Daniel Kahneman influenced economists and others to think of irrationality in individual terms. He described a set of biases embedded in individual minds, implying that to become more rational we must overcome those biases.
Williams, by contrast, suggests that irrationality is a group phenomenon. To explain epistemic irrationality, we should look at the group layer of society, not at the individual layer.
At the group level, epistemic irrationality can serve as a loyalty test. Anyone can profess a true belief—doing so does not signal your membership in a group. But if you go along with a false belief, then you are more likely to be a loyal group member.
I like to say that we decide what to believe by deciding whom to believe. We are especially inclined to believe people who are close to us in the small-group layer of society—our friends and family. What we want from the group is to belong, and we adopt false beliefs that enable us to belong.
Suppose that a high school clique adopts the belief that ordinary heterosexual preferences are low status. As someone wanting to belong to that clique, you might want to deceive other members into thinking that you are some flavor of LGBTQ+. That deception will be even more effective if you can deceive yourself into such a belief.
In fact, that is my explanation for the growth in self-reported non-binary sexual identities.
Elsewhere, Williams introduces the term coalitional press secretary. This combines two features of our psychology. First, we long to belong to a team. Second, as many psychologists have argued, our mind acts as a press secretary, justifying and defending its beliefs and actions. In Julia Galef’s terminology, the press secretary operates in a biased soldier mindset, as opposed to a more neutral, truth-seeking scout mindset. Williams modifies the press-secretary metaphor to have it operate on behalf of one’s group.
Group-level irrationality clearly poses a threat to overall well-being. The antidote is norms and institutions that foster a competitive marketplace of ideas. We see such norms and institutions in science, adversarial legal proceedings, markets for goods and services, and in media.
But we have seen in recent decades that these institutional mechanisms, which Jonathan Rauch calls the Constitution of Knowledge, can be fragile. He sees them as under assault primarily by the Trump faction of the Republican party. I see the more dangerous assault as coming from the social justice faction on the left. Going back to the beginning of this essay, free speech norms are an important pillar that is under attack.
There is hardly any task more urgent than repelling the assault on the norms and institutions that counter group-level epistemic irrationality.
“The survival value of seeing the world as it is would have inclined humans to evolve into pure truth-seekers.” In all of nature, humans are the only irrational creature. Even where there is chaos in the universe, it isn’t irrational by choice. Imagine a flock of geese in which some “decide” to stay alert only for threats from the air (hawks), since threats on the ground are less prevalent. The goose that is no longer alert on the ground ends up a pile of feathers, punished for the kind of epistemic irrationality that, as Williams notes, the natural world does not reward. As humans drift further from the natural world (e.g., screen time), it becomes easier to pretend the world is something it isn’t, even if that imperils overall well-being.
Great piece. This is really interesting thinking. Believing in stories is how humans coordinate at the group level, so this behavior is baked into our DNA. I also think it is worse now because a large generation of younger people is slowly taking power from a large generation of older people holding onto it.