Trust and Accountability
Are members of one society inherently more cooperative?
I dislike the phrase “high-trust society.” It suggests that you observe more cooperation in one society than another because the people in one society are just inherently more cooperative. It smacks of what anthropologists call “naive sociology,” when a group believes that other groups are inherently composed of bad people. It is sort of like the fundamental attribution error applied to an entire out-group.
I believe instead that you observe more cooperation where cooperation is likely to be rewarded and cheating is likely to be punished. Human cooperation is one of the most important aspects of society. It is very complex, and it grows more complex as the scale of social organization increases.
Because of this complexity, I expect that people with low intelligence will have difficulty developing and maintaining systems of cooperation. But that is not in any way to suggest that high intelligence is a sufficient condition for fostering cooperative behavior in an individual or in a society. Very intelligent people can be experts at cheating.
We should never lose sight of the game-theoretic aspects of cooperation. Cheaters will try to evade detection by pretending to be cooperators. Everyone needs to develop the ability to identify cheaters and protect ourselves from them.
We evolved to resent cheaters and also anyone who we see as not cooperating to punish cheaters. We remove from office the “soft on crime” district attorney.
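The game-theoretic point above can be sketched with an iterated prisoner's dilemma. This is a standard model, not one the essay specifies; the payoff values, round count, and strategy names below are my illustrative assumptions. The sketch shows why a naive cooperator invites cheating, while a cooperator who punishes defection leaves the cheater almost nothing to gain:

```python
# Illustrative payoffs: mutual cooperation beats mutual defection,
# but cheating an unconditional cooperator pays best of all.
PAYOFF = {  # (my move, their move) -> my payoff; C = cooperate, D = defect
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # sucker's payoff: I cooperated, they cheated
    ("D", "C"): 5,  # temptation: I cheated a cooperator
    ("D", "D"): 1,  # mutual defection
}

def always_cooperate(opponent_history):
    return "C"

def always_defect(opponent_history):
    return "D"

def tit_for_tat(opponent_history):
    # Cooperate first, then punish cheating by copying the opponent's last move.
    return "C" if not opponent_history else opponent_history[-1]

def play(strat_a, strat_b, rounds=100):
    """Return total payoffs (a_score, b_score) over an iterated game."""
    a_hist, b_hist = [], []  # each side's record of the *opponent's* moves
    a_score = b_score = 0
    for _ in range(rounds):
        a_move = strat_a(a_hist)
        b_move = strat_b(b_hist)
        a_score += PAYOFF[(a_move, b_move)]
        b_score += PAYOFF[(b_move, a_move)]
        a_hist.append(b_move)
        b_hist.append(a_move)
    return a_score, b_score

# A naive cooperator is exploited every round...
naive, cheater_vs_naive = play(always_cooperate, always_defect)
print(naive, cheater_vs_naive)        # 0 500

# ...while a punisher concedes only the first round.
punisher, cheater_vs_punisher = play(tit_for_tat, always_defect)
print(punisher, cheater_vs_punisher)  # 99 104
```

Against the punisher, cheating earns barely more than mutual cooperation would have; the incentive to cheat comes almost entirely from the presence of cooperators who never retaliate.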
Something that crypto fans should consider is a point made in an interview:
What most people want from a trusted service is something like, "This will work, and if someone tries to screw me over or steal my stuff, they will be stopped, or punished." Crypto doesn't only fail to provide that in most cases, it is literally built to make the provision of that service nearly impossible. Some of that is embedded in the technology itself — the more you work to fortify a decentralized technology from oversight and interference, the harder it becomes to reverse or police outrageous behavior.
There are people who worry that SBF will escape punishment, because of the way he cultivated the media and people in power. But at least the authorities might punish him. A process of punishment is not built into the crypto ecosystem. It’s supposed to be automatically immune from cheating. Except when it isn’t.
Advantages that people associate with a high-trust society stem from the fact that people do not fear being cheated. You walk around a city because you do not fear robbery. You respect public officials, because you do not fear that they will use their power to exploit you. You shop confidently, because you do not fear that merchants will rip you off. Merchants display their merchandise on shelves, because they do not fear that customers will steal. You put your savings in banks, because you do not fear that you will be unable to withdraw it when you need it. You believe mainstream journalists, because you do not fear that they will manipulate you and distort the truth. You send your children to public school, because you do not fear that teachers will be incompetent or cause harm.
My problem with the phrase “high-trust society” is that it seems to imply a society where people have just decided not to worry about cheating. Naive sociology assumes that the absence of cheating shows you something about the population’s genetic characteristics or cultural upbringing. It is as if there has been a spontaneous agreement to be nice to one another.
Instead, consider treating the absence of cheating as a sign that a society’s accountability systems are working. That is, a high-trust society is one that has an effective process in place for detecting and punishing cheaters. It has come up with generally reliable solutions to the game-theoretic challenges of cooperation.
I predict that you will only see high trust where you see effective accountability. I challenge you to point to an institution or segment of society that is accorded high trust with little accountability, or vice-versa.
The trust-accountability correlation helps us to understand the process of trust breaking down. Trust breaks down when accountability systems decay.
For example, the accountability system for Congress requires competitive elections. But the high rate of incumbent re-election and the preponderance of one-party districts weakens this system. The fact that there is no punishment whatsoever for Congressmen who violate the norm of balancing the budget has also greatly reduced accountability. People re-elect their own Congressman, but they don’t trust Congress.
Consider this:
In March 2022, the so-called Blockchain Eight―a group of four Republican and four Democratic House representatives―sought to stall a new SEC probe into FTX and several other crypto firms. Of those eight officials, five had received donations directly from FTX staff, including more than $500,000 to North Carolina’s Rep. Ted Budd from a super PAC started by a top deputy of SBF’s at FTX, Ryan Salame.
If you think Budd was going to guard the henhouse, you might want to ask why he took so much money from the fox. Has he been held accountable? Budd, a Republican, is no longer going to be in the House. He has moved up to the Senate.
In the realm of media, it is hard to see an accountability mechanism at work. No one will be fired for giving free PR to Sam Bankman-Fried, not even those he bribed for coverage.
Some outsiders who correctly challenged parts of the mainstream narrative on COVID were canceled. No member of the mainstream press will be punished for echoing and protecting that narrative. It makes sense that trust in media has collapsed.
At one time, a religious community could hold its members accountable for living up to moral standards. This ability has declined considerably in recent years, coinciding with or perhaps causing the drop in religious affiliation.
Going forward, I would like to hear less about a “high-trust society.” Instead, tell me about what makes for a high-accountability society. What sort of mechanisms help to ensure that cheaters are detected, caught, and punished? What sorts of developments can lead to the decay of those mechanisms?
This essay is part of a series on human interdependence.
Allowing people to vote with their feet is perhaps the best way of holding institutions accountable. The worst way to build accountability is with gatekeepers and censorship. Peer review illustrates these points. https://experimentalhistory.substack.com/p/the-rise-and-fall-of-peer-review
As a practical matter, you may want a different word than 'accountability'. The reason I like the expression 'high trust' and dislike 'high accountability' is that once people start talking about accountability, what we end up with is a system whereby the usual suspects are scapegoated and take the blame. This I know is not what you want. But the thing that erodes trust the most in my experience is discovering that people have lied to you, most especially if they did so as part of a cover-up. And when you investigate why cover-ups happen, you often find it is to hide the most minor of transgressions. People do not feel free to admit the most inconsequential of sins because they think the accountability police will unleash all sorts of hell upon them. The accountability police are disproportionately formed of people who love humiliating others and making them suffer; it is why they sign up for such things.
So, if you want real accountability, the first thing you may need to do is excuse such people from your process. This has turned out to be very difficult to implement in practice. Teaching people how to admit minor wrongdoing and have it not matter very much is a hard lesson, especially if you do not wish to instead teach that 'the well-connected can get away with anything', which is a different serious risk.