Seminar recap, session 4, 11/24
Examples of institutional breakdown
The purpose of this session was to discuss examples of institutional breakdown.
One example was a major corporate initiative where progress was tracked using color codes: green (on track), yellow (some problems), and red (big trouble). When a low-level manager coded something as red, his boss would find a reason to change it to yellow, and the next manager up the chain would find a reason to change it to green. That way, the top managers never knew how badly the project was going. When they finally found out, they had to cancel the project.
Halberstam’s description of the Vietnam War is a similar example. As reports of battle results were passed up the chain of command, bad news was filtered out. This was partly a product of internal bureaucratic politics: civilians and the military were at odds, with the military gaining bureaucratic power as the number of Americans in combat increased.
Halberstam also characterizes decision-makers as having narrow backgrounds. They understood strategy and international relations at a theoretical level, but they did not have combat experience or cultural knowledge of Vietnam itself.
A more recent example might be the media coverage of the Rittenhouse case, which prior to the trial created a number of impressions that were misleading, or even false. Here, there was no chain of command through which information flowed. It was closer to what Eric Weinstein calls the DISC, or Distributed Idea Suppression Complex: people who know better are inhibited from speaking out.
Another example is Walter Duranty’s reporting for the NYT on the Soviet Union. His misleading stories were taken at face value and earned a Pulitzer. Even when the truth came out there was little punishment or accountability at the paper.
As an example of institutional irrationality, consider the rule near the LA port that containers may not be stacked more than two high. Or consider zoning regulation in general.
The FDA and CDC during the pandemic are another example. They would not admit what they did not know, and instead pretended to know things that turned out to be false. They also lacked a sense of urgency, most recently in the slow approval of Paxlovid.
Speaking of officials jealous of their authority: an Italian heterodox scientist correctly warned of an earthquake, and he was chastised by the mainstream seismologists.
Our tax system is irrational, although this has been true for a long time.
Can we find examples outside of the West? Perhaps the Mughal emperor Aurangzeb, who alienated Hindus and others.
In China, the Ming emperor ended naval exploration in the fifteenth century, and later rulers closed the country off in general. Is today’s ruler repeating some of this behavior? Is Asian institutional failure due to excessive trust in central authority? Japan and MITI?
When banks, industry, and government are in harmony, this weakens price signals and hinders the market. Or when Wall Street assumes that the Fed and financial regulators have everything under control while the regulators assume that Wall Street is well managed, neither is sufficiently alert to risks.
The 2008 crisis can be viewed as a moral failure. That is, bank executives knowingly took excessive risks in the expectation of a bailout. But it also can be seen as a cognitive failure, in that they and their regulators were not aware of the extent of the risks that were accumulating.
In finance, you have to wonder how you are making a profit. It’s a competitive industry, so are you being extra clever, extra lucky, or extra imprudent? I worked for a CEO who would actually ask those sorts of questions as he dropped in on various departments. He wanted to make sure that the firm’s profits were sustainable. Could a CEO do that today, or would his concerns and doubts wind up posted on Twitter the next morning?
Finance, tech, and the military seem to produce many examples of institutional failure.
Until at least the late 19th century, doctors did more harm than good. Yet medicine as an institution persisted. Maybe we need the Robin Hanson explanation, that medical treatment is something you want others to have in order to show that you care.
What if one bad actor causes the problem? Should you blame the institution? I am inclined to say yes. For example, any bank should have institutional defenses against being ruined by a single rogue trader.
We still need to refine the topic of institutional irrationality. Maybe a Jeff Foxworthy approach to institutions: “If ___, then you might be an irrational institution.” It’s important to distinguish irrationality from a one-off failure or mistake. People see the George Floyd case and are ready to throw out the institution of policing. They see households having trouble paying bills and they want to break contracts (e.g., student loans, the eviction moratorium).
Institutional rationality may have a component related to learning and error correction. I would say that a rational institution will make good mistakes and learn from them. A good mistake is something that occurs even when the information flow within the institution works well. A good mistake is relatively minor and helps you learn.
Networks differ from hierarchies in terms of how information flows. The NYT is a hierarchy. “Journalism” seems more like a network. Do networks and hierarchies have different failure modes?