I have identified five common NPC species into which the majority of netizens fall. Analyzing the shortcuts they take is crucial to understanding the information landscape. Further, since you’ve likely been using at least one of these shortcuts yourself, considering them will help you to identify the flaws in your own belief-forming behaviors.
I claim that we decide what to believe by deciding who to believe, and Gurwinder’s descriptions are consistent with that. Each of his “NPC species” uses a different strategy for deciding who to believe.
Conformists trust the process by which society reaches consensus, so they accept the mainstream view on all things.
If I were writing to appeal to conformists, my essays would be filled with "Studies show..." and "This chart proves..."
Relying on a consensus view goes wrong when the consensus is formed artificially. There is the famous Asch conformity experiment: you show two lines on a screen, with line A longer than line B, to four people, one of whom is the subject while the other three are confederates of the investigator. If the three confederates announce that B is the longer line, many subjects will say the same. As Gurwinder comments on another experiment,
the more people agree, the less likely they are thinking for themselves.
The second strategy is contrarianism, which means always going against the consensus. The obvious drawback of this strategy is that the consensus is often correct. It is hard to see contrarianism as a viable truth-seeking strategy.
This leads to Gurwinder's third strategy:
contrarians will often be tempted to put all their trust in a single, charismatic, anti-establishment demagogue. In so doing, they devolve into the oldest NPC species: the disciple.
What would Balaji do? Or Vitalik? The crypto world attracts a lot of folks who adopt the disciple strategy.
The fourth strategy Gurwinder calls tribalist. In the early days of the pandemic, the progressive tribe treated those who worried about the virus as anti-Asian racists. But at some point the tribe shifted, and people who went maskless in public were considered sociopaths. The problem with tribalism is that your tribe can easily be wrong.
The fifth strategy he calls "the averager." This is the centrist who tries to split the difference. But sometimes the truth does not lie in the middle: in the proverbial dispute between two women over who is the mother of the baby, splitting the difference is not the solution.
Gurwinder ends up advocating intellectual humility.
Your brain will always try to save time when forming beliefs — it’s what it does — but the best way to save time is not to take a shortcut to “truth,” it’s to take no route at all. So if you want to stop being an NPC, simply say “I don’t know” on all the matters that don’t concern you. And this will give you the time to not be an NPC on all the matters that do.
If Robin Hanson were here, he would tell you that this sort of humility could be enforced if we had prediction markets. When spouting an uninformed opinion can cost you money, you are less likely to spout it.
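To make the incentive concrete, here is a minimal Python sketch (my own illustration, not anything from Hanson) of why an uninformed opinion is costly in a simple binary prediction market; the prices and probabilities are made up for the example.

```python
# Expected profit from buying a binary contract that pays $1 if an event occurs.
# Paying more than the event's true probability loses money on average --
# that is the disciplining effect of putting money behind an opinion.

def expected_profit(price: float, true_probability: float) -> float:
    """Expected profit per contract bought at `price` (in dollars)."""
    return true_probability * (1.0 - price) - (1.0 - true_probability) * price

# An overconfident pundit pays 80 cents for a claim that is really a coin flip:
print(expected_profit(price=0.80, true_probability=0.50))  # -0.30: loses 30 cents per contract
# A calibrated trader pays no more than the true probability:
print(expected_profit(price=0.50, true_probability=0.50))  # 0.00: breaks even
```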
Another Hansonian thought would be to treat Gurwinder’s five strategies not as strategies for truth-seeking but as strategies for status-seeking. The conformist seeks to curry favor with those in power. The contrarian and the disciple are making a bet that the power structure will be overthrown, and their status will soar if their views please the new elite. The tribalist is aiming for high status within the tribe. The averager is hoping to survive even as first one tribe is on top, and then the other.
There is probably an evolutionary dynamic in which all of these strategies survive. In a relatively stable environment, conformity and tribalism probably work best, and the equilibrium share of conformists and tribalists in the population will be high. When there is radical change, some contrarians and disciples will achieve success. Averagers, who hedge their bets, will be safer until the environment stabilizes again.
Gurwinder’s recommended approach of humility may be a way to avoid making cognitive errors. But it does not seem to be a path to high status. Certainly not in politics in the United States.
This essay is part of a series on human interdependence.
Let me add a 6th cognitive strategy. Let's call it *wise, limited deference.* Most people don't have the time or the inclination for it. Here are the steps:
a) Learn the rudiments of epistemology: for example, the criteria for 'inference to the best explanation of available evidence,' Bayes' Rule, J. S. Mill's several 'methods' of induction, and Kling's FITs criteria.
b) Keep an eye out for persons who apply such rudiments, and keep tabs on their performance (predictions, warranted 'I told you so,' and the like; a rough sketch of such bookkeeping appears after this list). Read them regularly. Note: A person who has different values from yours might nonetheless deserve wise cognitive deference. Focus on reliability about facts and mechanisms.
c) In novel situations, and whenever deference might be efficient, cautiously defer to this set of persons, who have demonstrated sound cognitive habits and good judgment.
d) Make a point of occasionally also reading smart people who disagree.
e) If your luminaries seem to get it wrong on an issue that you care to get right, then try to figure out the issue yourself if you can, and voice your reasoned disagreement.
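As a concrete illustration of steps (a) and (b), here is a minimal Python sketch, my own addition rather than part of the original comment, with made-up forecasts and numbers: a Bayes' Rule update, plus a Brier score, which is one standard way to keep tabs on a commentator's predictive performance.

```python
# Step (a): Bayes' Rule -- update belief in a hypothesis H after seeing evidence E.
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior P(H | E) via Bayes' Rule."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1.0 - prior))

# Step (b): Brier score -- mean squared error of probability forecasts.
# Lower is better; constant 50/50 guessing on binary events scores 0.25.
def brier_score(forecasts: list[float], outcomes: list[int]) -> float:
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Made-up track record: a commentator's stated probabilities and what happened.
forecasts = [0.9, 0.7, 0.2, 0.6]   # their probability calls
outcomes  = [1,   1,   0,   0]     # 1 = event occurred, 0 = it did not
print(brier_score(forecasts, outcomes))   # 0.125 -- better than chance
print(bayes_update(0.3, 0.8, 0.2))        # ~0.63, up from a prior of 0.3
```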
This cognitive strategy—wise, limited deference, plus occasional effort at reasoned disagreement—will maximize your chances of forming true beliefs, and might even gain you a modicum of status among truth-seekers.
Note: Leave plenty of room for the cognitive humility strategy, mentioned in Arnold's essay, even though it has low status.
All around, we have learned that there are fewer people we can trust for anything resembling fact.