Political Psychology Links, 12/02/2025
Will Storr on the status game; Michael Magoon on radical ideologies; Dan Williams on elite gatekeeping
In a podcast with Russ Roberts, Will Storr says,
in order to make those groups functional, you need a system of incentive and punishment, rewards and punishment. And, that’s the status game. So, as I said earlier, we are deeply interested in: How useful am I being to other people? Because that’s what status is: it’s a score of our perceived value.
And
status leaks. And so, we want to be around higher-status people than us, because their status leaks out onto us.
It bothers me that people want to be around the rich or the powerful.
And
We identify with that feeling of, ‘I should have more status than I do.’ I think it is a very, very common feeling amongst humans.
It’s very easy to notice the guy who has higher status than you at work, even though you think you’re more valuable. It’s much harder to notice the guy who has lower status than you and may be more valuable.
Michael Magoon writes,
Radical ideologies are not truth-seeking systems. They are designed to change reality, not to understand it. This is why they always eventually collide with the material world.
…They are emotionally comforting but epistemically catastrophic. They offer:
Certainty over truth,
Moral identity over accuracy,
Moral posturing over material results.
and
They grow from psychological temperaments, neuroticism, and cognitive bias; they attract and organize mental disorders; and they weaponize emotion and identity to override the rational mind. Their founders often exhibit pronounced psychological disturbances, and their followers are frequently drawn from individuals who crave certainty, belonging, or moral purity.
and much later:
When an ideology demands outcomes that reality will never produce, its followers must choose between two paths:
admit the ideology is wrong, or
lie.
Radical movements always choose lying. Lying becomes a moral duty, a sign of loyalty, even an act of revolutionary virtue.
The essay is long, and it presents a grand theory of ideology that I find persuasive. But I recommend approaching it skeptically. Which statements are not falsifiable? Can you think of exceptions? Where are the grey areas between “ideology” and other beliefs?
Dan Williams writes,
Efforts to censor and de-amplify disfavoured views bred widespread anger and resentment among those who saw unaccountable elites exerting undemocratic control over the public conversation. … And it’s difficult to see how any top-down effort to control the information environment can work without merely exacerbating the anti-elite resentment that fuels the very content such efforts aim to address. …
Elite gatekeeping doesn’t just filter out the most egregious forms of misinformation. It also typically filters out legitimate grievances and reasonable challenges to establishment orthodoxies.
He says that instead of trying to shut down debate, elites should participate in it. The case for shutting down populist media, he says, amounts to
the idea that those who support populist or illiberal politics are persuadable—but only by bad ideas.
A world where some people resist rational argument but are persuaded by bad ideas is a world where gatekeeping is justified.
but research from social scientists like Alexander Coppock, Ben Tappin, and others consistently shows that rational persuasion is broadly effective at changing people’s minds.
Really?
I keep coming back to the maxim that people decide what to believe by deciding who to believe. I think that this takes us back to the topic of status. I suspect that Alan is inclined to believe Bob if Bob seems to boost Alan’s status and to disbelieve Bob if Bob seems to threaten Alan’s status. This leads to a suggestion, not original with me: try to get people to see their status as correlated with objective truth-seeking.
I think some people get stuck in the cognitive trap of not being persuadable on ideas related to what Eric Weinstein calls the gated institutional narrative. They've seen too much sketchy stuff (on climate change, gender ideology, corrupt academic disciplines like sociology and various "studies," etc) coming from these institutions to be willing to buy what they're selling anymore. In other words, they decide what to believe based on who they *disbelieve.* The problem, of course, is that the gated institutional narrative is not always wrong.
Re: "people decide what to believe by deciding who to believe".
Arnold highlights trust as the fundamental mechanism in belief-formation about complex issues.
There are other major causes of beliefs:
• Exposure to facts
• Exposure to specific causal theories
• Pressure to reckon with inconsistency among beliefs
• Myriad behind-the-scenes psychological mechanisms; for example, the availability heuristic, cognitive dissonance reduction, wishful thinking, counter-wishful thinking, social desirability bias, stubbornness, pride, and so on.
I would highlight two features of ideology:
• Backwards reasoning, from a preferred conclusion to reasons for it, i.e., rationalization.
• Neglect of (admittedly difficult) norms of rational belief-formation, namely, Bayesian updating, inference to the best explanation of evidence, and statistical inference.
What makes an ideology "radical"? Magoon's answer seems to be that radical ideologies are caused largely by mental disorders (mental illness). Is there a definition of radical ideology, distinct from its causes? Must any belief system critical of the establishment be largely an expression of mental illness? Is the definition of mental illness too expansive if it covers a quarter or half of the population (Magoon's figures)?