Re: "social epistemology has made progress in terms of *processes*. The Enlightenment norms, including freedom of speech and the scientific method, are important tools. So are institutions in which beliefs are contested, such as courtrooms, markets, elections, and academic journals and conferences".
It would be reasonable to ask about each of these institutions in which beliefs are contested: Has the institution decayed in terms of its process role in social epistemology? Has the institution become less reliable in getting at (or nearer) the truth?
The matter is complicated by the fact that a subset of these institutions contest beliefs *and values*, and do so in ways that entangle belief-formation and preference-expression. For example, about elections, Bryan Caplan likes to say that democracy is largely about "what *sounds* good". Individual voters have little weight in the electoral outcome, and therefore little incentive to seek truth about complex policy issues.
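Caplan's point about weak voter incentives can be put in back-of-the-envelope terms. A minimal sketch, with purely made-up illustrative numbers (none of these are estimates):

```python
# Toy expected-value model of a voter's incentive to get informed.
# All numbers are illustrative assumptions, not estimates.

p_pivotal = 1e-7       # chance that one vote decides the election
policy_stakes = 1e4    # personal dollar value of the better policy winning
research_cost = 100    # time cost (in dollars) of studying the issue

# Expected personal payoff from becoming an informed voter:
expected_benefit = p_pivotal * policy_stakes   # about $0.001
worth_researching = expected_benefit > research_cost

print(f"expected benefit: ${expected_benefit:.4f}, worth it: {worth_researching}")
```

Under these assumptions the expected benefit of getting informed is a fraction of a cent, so rational ignorance wins even when the policy stakes are large.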
One may reasonably ask: Have selection effects and treatment effects in academe in recent decades shifted the academic process towards orthodoxy and away from open and honest contestation of beliefs?
Here is a deal that no censor has ever taken or will ever take:
"I give you the power to censor what I write as long as you give me the power to censor what you write."
Yep, cuts both ways.
Have you had close colleagues and family members get answers from your clone? Because they are the ones I would expect to find flaws. I also wonder whether you can get a clone to say things like 'well, as a younger man I believed X, but as I grew wiser I modified my stance and now believe something somewhat different'? Does the clone have a sense of a personality that grows in knowledge and wisdom over time?
That’s a good point. I suspect a lot of people have things they wrote when they were younger and have since lost interest in, revising their thoughts and opinions but never writing a correction. I wonder how many people speaking with a clone of themselves based on their body of work would find themselves saying “When the hell did I say that?!… oh… right.”
I think "markets fail, use markets" is too simplistic. There really is no way for a market to develop in which those who benefit from emitting CO2 into the atmosphere (everyone) and those who suffer harm from the accumulation of CO2 in the atmosphere (everyone) can establish an optimal distribution of benefits and harms.
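To make the transaction-cost problem concrete, here is a minimal sketch with purely illustrative numbers: each emitter keeps the full private benefit of emitting, while the harm is spread across everyone, so emitting stays individually rational even when it destroys value overall, and each victim's stake is far too small to bargain over.

```python
# Toy diffuse-externality model; all numbers are illustrative assumptions.

n = 1_000_000           # people, each both an emitter and a victim
private_benefit = 10.0  # one person's gain from emitting one unit
total_harm = 15.0       # total harm from that unit, spread over all n people

harm_per_victim = total_harm / n                      # $0.000015 each
net_to_emitter = private_benefit - harm_per_victim    # still clearly positive

socially_efficient_to_emit = private_benefit > total_harm  # harm exceeds benefit
individually_rational_to_emit = net_to_emitter > 0         # but emitting still pays

print(socially_efficient_to_emit, individually_rational_to_emit)
```

The gap between the two booleans is the market failure: no victim would pay the bargaining cost to recover $0.000015, so the inefficient emissions go unpriced.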
Institutions matter. Some are better than others.
You are a very process-oriented guy in general. I think this connects to your opposition to philanthropy: you think "donate to nonprofits" is a bad rule to follow, so you oppose people donating regardless of the specific consequences of a given donation.
Although I disagree with you about philanthropy, I agree that social epistemology should be about fostering good processes and so I agree with you here.
I wonder if certain personality types or other traits make one more disposed to caring about process vs. outcomes. It strikes me that many people I have worked with struggle to think about business processes and just want to skip to the end, so maybe process-oriented thinking is just uncommon. But can it be taught, or are certain types of people predisposed to it? No idea, but it seems important.
This was exactly what I needed to clarify my views on this topic. Thanks for the well-reasoned and concise thoughts.
The outcome versus process viewpoint is on the money. Also, bad information has always been passed around by humans (e.g., gossip), and we seem to lap it up like dogs. We prefer drama and novelty and it's easy to hatch stories that meet this inclination. Getting to the truth is harder work and sometimes boring.
Well put.
Please authorize me to talk to your clone.
I’m sure it could hugely help you write a conversational-style book, though that seems not to be the kind of book you’d prefer to write. It would likely be more like a series of Current Thing posts than the carefully considered, semi-self-censored, thoughtful posts that avoid much offense to anybody.
Now I’m thinking of the quote:
There is more truth in wine than in most books.
Because tipsy folks are more willing to say what they actually feel, rather than only what they think they should feel. Humans are what they feel, what they think, and especially what they do, but also how they make other people feel.
Do you want the ai-ASK to be more like you as a professor, or you as an intellectual drinking buddy willing to speculate more about likely but uncertain truths? I’d prefer more ASK speculation.
I wouldn’t mind having a conversation with the clone about how to design a solution to a given market failure.
Or at least how to think through which principles apply to create the policy that would be most likely to make a positive difference to, say, aggregate GDP.
"Williams seems to say we should not do anything, and just be grateful for the true beliefs that people do hold. Against Williams are the people who want to explicitly intervene in social media to fight misinformation."
As is often the case, the right answer is somewhere in the middle. We should be grateful for true beliefs AND fight misinformation. But even that misses the mark a bit. Beliefs tend to be a mish-mash of facts and opinions with no clear right or wrong, and sometimes it can be hard to draw the line between fact and opinion.
Of course it would be great if nobody got the facts wrong, but personally, I'd rather people were a little less sure that their beliefs were facts rather than opinions.
A relevant article - https://www.palladiummag.com/2022/12/12/political-academia-with-stephen-hsu/
One thing I like to say: science is mostly wrong. What I mean is that the vast majority of hypotheses, throughout history, have been wrong. Being wrong is part of the search for truth. Stephen Hsu almost directly addresses this in the third-to-last question.
One thing I also find with the disinformation people is that they believe they already have the truth, so there is no need for debate or actual science. As you say, that is dogma.
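The "science is mostly wrong" point above is compatible with science still being our best truth-seeking process. A back-of-the-envelope Bayes calculation (the prior, power, and false-positive rate below are assumptions chosen for illustration) shows that when true hypotheses are rare, even a decent testing process leaves most positive findings false:

```python
# Back-of-the-envelope positive predictive value of "significant" findings.
# All rates are illustrative assumptions.

prior = 0.05    # fraction of tested hypotheses that are actually true
power = 0.80    # P(significant result | hypothesis true)
alpha = 0.05    # P(significant result | hypothesis false)

true_positives = prior * power           # 0.04 of all tests
false_positives = (1 - prior) * alpha    # 0.0475 of all tests
ppv = true_positives / (true_positives + false_positives)

print(round(ppv, 3))  # ~0.457: fewer than half of positive findings are true
```

On these numbers, being wrong most of the time is simply what a low base rate of true hypotheses guarantees; the remedy is replication and open contestation, not declaring the truth settled.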
Is "askASK.ai" too cute, and likely too obscure? I daresay.
It's also worth considering to what extent market bargaining undermines the truth-discovery process. In academia, grant-hunting is said to cause many irrelevant and manipulated studies. In the courts, civil settlements and plea bargains, along with sealed proceedings like arbitration and mediation, prevent public oversight of the judicial process; these practices also gravely pollute criminal-justice statistics in both directions. In the stock market, we are likewise happy to let markets decide prices, unless the political authority decides that prices have become too unstable; as long as the market produces correct-ish valuations, people are mostly pretty happy with it.
Yet these practices are the result of market actors pursuing the best possible outcomes for their own situations. If you optimized instead for increasing public understanding of the truth, you would not get good outcomes from the market's perspective -- in fact, it would be catastrophic. Society tends to be happy with "mostly correct" or at least "directionally correct," but sometimes these processes go berserk, as they have in academia, because the mechanisms for keeping them on track go unused.
The longer a system stays in place, the more people figure out ways to game it.
Against Williams are the people who want to explicitly intervene in social media to fight misinformation.
The disinformation fighters, perhaps including B.P.S., would argue that there is such a thing as truth. So we should censor falsehood. From an outcomes perspective, that makes sense.
What about the intermediate position: "fight" misinformation (because we believe in truth) but not "censor"?