Arnold writes, "I would say that the “regime” could be the Chinese government, or it could be the Woke mob. If you want to demonstrate your power, sticking to the truth is not sufficient. Forcing people to accept lies is much more intimidating." To be sure, the "regime" includes elements from both extremes of the political spectrum. For example, book banners are active on the left ("decolonizing libraries") and on the right (school boards banning books that purportedly contain unconventional viewpoints, or requiring schoolbooks to promote certain political views).
Is there much scholarly treatment of the efficacy of such propaganda? Obviously it's bad, but I tend to expect it's not quite as bad as the worst expectations.
The biggest reason is that I think most people know when they're being intimidated and don't like it. People may be intimidated into professing false beliefs, but I'm much more skeptical about the ability to intimidate people into holding false beliefs.
My high school kids, for instance, have been exposed to plenty of woke nonsense. I don't think they've been successfully indoctrinated in any way.
Instead, like pretty much everything else that irritating biddies try to preach to teenagers, it's treated with humor and mockery. Maybe it's just because we're far outside the norm in critical thinking, but I think it's true of a lot of their peer group as well. The propaganda is still bad and still reduces the ability to search for truth, but I have the sense that, for most people, a lot of deprogramming isn't necessary.
Re: "where people are trying to intimidate you into holding false beliefs, it takes courage and effort to try to maintain the search for truth."
More often than not, lack of belief is the better part of wisdom. True, it takes a bit of courage and effort to say, "I don't know, it's complicated, I haven't looked into it carefully, it's outside my purview," when people are trying to intimidate you into holding false beliefs. But it takes less courage and effort than the search for truth.
Another alternative is wise deference in belief-formation. For example, if someone pressures me to believe X about an issue in economics (e.g., inflation), I might reply, "I trust Arnold Kling on such issues, but I acknowledge that it's complicated, since Tyler Cowen happens to disagree."
Occasionally, I put real independent effort into the search for truth about an issue in the world. But most people don't have the time, resources, or temperament to do so.