We need a better way to evaluate experts over the long term. Some of the most convincing liars on the planet are high-functioning psychopaths. Everybody trusts them, even the psychologists who study them and know full well that this is precisely what they should not do. The psychopaths know how to make the trust happen. And anybody who thinks they can 'just tell' when they are being lied to is likely to fall for the psychopaths and con men every time. You have to go with the evidence, which is a toughie when there isn't a lot to be found (yet) and you have to make policy before you have enough.
One thing we would benefit from would be if experts -- anybody, really -- were required to state, when they publish an opinion, how certain they are that what they are stating is true and correct. Or, in some cases, 'how big are the error bars?' This gives the non-expert a fair shot at determining which things are settled, which are mostly settled, and which are still very much unknown. It also gives us a better chance of knowing which experts are honest and possessed of the basic humility we all want to find in those we trust. A certain level of 'got it wrong' is not only fine but essential when you are thinking your way through the unknown. But you cannot improve by getting things less wrong, by being what Nassim Taleb calls 'antifragile' and learning from your mistakes, if you aren't calling out the mistakes in the first place. A consistent pattern of getting things very wrong, and not learning from one's errors out of either malice or incompetence, is something we need to be able to recognize. And then we need to act on it, and strip these people of influence and trust.
The Spectator magazine in the UK ran a piece on 16 April 2020 entitled "Six questions that Neil Ferguson should be asked": https://www.spectator.co.uk/article/six-questions-that-neil-ferguson-should-be-asked They are good questions, and they demonstrate a pattern of being so spectacularly wrong that this is the 'expert opinion' everybody should have known never to trust. Unless our institutions do a much better job of judging which so-called experts are worth trusting, which are rubbish, and which fit somewhere in between, there is no hope for the rest of us. And unless they do this by deferring to the evidence, and not to some form of 'Oxford physicist. Big Credential. Must be True!', we will continue to trust the wrong people.
I also read Huemer's blog, but I did not bother to comment on this post.
What Huemer misses is the expertise on... experts. There is a pretty substantial literature on expert opinion, and it is not very supportive of the defer-to-experts heuristic.
I'm thinking of people like Kahneman and Tversky, Hastie and Dawes, and Tetlock.
Sure, if what you want from experts is recitation of non-controversial facts or calculations using the tools of their field, they're great. Also, if you have an expert who repeatedly makes predictions about the same kind of situation and then gets feedback on the outcomes, they're pretty good (the classic example is pathologists reading cancer biopsies).
But this is not in fact what most arguments about deferring to experts are about. Rather, they are about claims by experts concerning interventions (or implied interventions) in complicated situations, beyond a straightforward application of tools with well-established accuracy, and often extending across domains. Moreover, these claims are often framed in terms of certainty or near-certainty.
Just think about statements from well-credentialed economists about the economy. Do you think following their (implied) prescriptions based on their "well established" models is advisable?
If not, why would you think other fields are any different?
How did those predictions from epidemiologists and doctors on covid work out? The standard reading of the literature from people like Kahneman and Tversky, Hastie and Dawes, and Tetlock would have said this was exactly the type of situation in which "experts" would be unreliable. And they were.
Ironically, Huemer's epistemological model of expertise is oversimplified, so it gets the answer wrong. A common problem with experts in many fields.
What's particularly humorous (and I need to read Huemer's post; maybe he addresses it) is that in his older essay "In Defense of Passivity" he directly explains why not to trust experts. As you pointed out, he uses the results of experts on experts to dismiss most people who are thought of as experts! I'm curious to read this new post, as his original essay was highly influential on my worldview.
I am reminded of the book Range, which claims that generalists tend to outperform experts on novel issues. Experts tend to explain away every failed prediction and double down rather than change their minds, unlike a generalist, who doesn't have as much invested in any one subject and can pivot toward reality upon seeing that an original prediction is not correct.
"But, as Huemer points out, we cannot operate that way. There is no way that I can personally verify every hypothesis about the world. Instead, I decide which people to trust about particular topics."
It's not that hard to work out logically whether what most "experts" are claiming is likely, or actually, true or false. When it comes to more complex things, or things outside your ability to determine, you can almost always look at the track record for honesty and accuracy of individuals and institutions. If that's not available, history provides some guidance. What remains is the leading edge of research, which usually has very little bearing on your day-to-day life.
The problem isn't the ability to determine whether an institution or person is lying. It's the courage to face the fact that most of them are bald-faced liars, and what that means for you and those you care about.
Apparently, most people are gutless cowards, and will pay the price for that cowardice.
I don't need to be an expert on respiratory viruses to be skeptical that homemade masks stop the transmission of such viruses. And when I see the "experts" promote cloth masks to stop a respiratory virus, I can discern that the entirety of their advice is questionable.
Common sense is very helpful in navigating a complex world. But one needs to have the courage to use common sense, and not defer to the expert narrative.
'That's why they're called "experts".' They are called experts because they have failed upwards into positions of unassailable power and control in the area of their alleged expertise. They are at the top of the age range and have left learning behind. Their knowledge and ideas are outdated, but they see no need to update and are too lazy to do so. 'Young upstarts' dare not challenge them, because their jobs and futures depend on the experts who rule their profession. And these experts are the only people 'Governments' (such as they are) consult, because they are 'the experts', the High Priests of The Science™️. Which is why our societies and economies were ruined over a cold virus, why useless masks and anything-but-safe-and-effective experimental products were forced on us, why we have an energy supply shortage, and why we will ruin what's left of our societies and economies to 'save the Planet'. Don't forget that the most influential 'expert' on that last matter was a mentally disturbed, truant, Swedish teenage girl who in former times would have been sent to either a nunnery or a lunatic asylum - where, in my view, all 'experts' should be.
Richard Feynman also valued #4; he saw the ability to explain as strongly linked to depth of understanding.
Feynman immediately came to mind as I read #4. He was to science (and to physics more particularly), what Sowell is to economics.
I wonder if your loss of trust is because of a change in the people you used to trust, or a change in your knowledge of those people. I often feel the same and wonder about the change. We have had such an acceleration of communication that I may now know too much about people. I am not sure of the answer in my case.
Experts can be knowledgeable and helpful in their area of expertise but ignorant and unhelpful in many others. Consider Jim Cramer of CNBC. I believe he is an expert in understanding Wall Street. He is not an expert in giving good financial advice. Similarly with Paul Krugman: an expert in certain areas of economics, woefully incorrect in judging politics and policies.
I would argue society needs experts and also a system for weeding out non-experts. At the same time, for there to be freedom and liberty, people must be allowed great latitude in choosing what expertise they follow. Society governed by "experts" has been tried and has created great misery.
Huemer's gun control example is illustrative. Here is an issue with many self-proclaimed experts. And they disagree! But we also have data. In a better world, the data would inform society on the role and use of personal firearms. But most often we get rhetoric and emotional pleading, not facts, concerning gun control. In fact, I have no idea what is even meant by the label "gun control expert". Someone using that label is someone I would dismiss as a hack.
I would add, true experts do not try to exercise expertise outside their actual area. The worst recent example: the CDC making recommendations for how to slow the spread of COVID, rather than giving individuals and public officials the epidemiological benefits of different measures so that they could put those together with THEIR expertise on the costs of the measures and carry out a proper cost-benefit analysis (CBA).
There are multiple components around trust:
1) Is this person capable of understanding the truth?
2) If this person understands the truth, will they tell it to me?
3) If the answer to #1 and #2 is "sometimes", what factors impact that?
4) Can I trust this person in future novel situations I have not yet anticipated?
When we look at failures of experts, it's mostly that they:
1) Chose self-preservation over others' welfare
2) Cared about their team's welfare and not yours
Biden is president today because a black politician from South Carolina endorsed him. Immediately, all black people essentially converged on supporting Biden. And while black people are not a majority of Dem primary voters, the white vote was split amongst many candidates. By acting in unison and trusting a single individual, black people got their candidate. And Biden has always made sure to prioritize the desires of black people, and was the only one not dumb enough to endorse Defund the Police.
Re: deference to experts.
A cognitive quandary arises when experts disagree. For example, I respect Arnold Kling and Tyler Cowen, and defer here and there to each of them. But I found myself in a quandary when they disagreed about inflation. (Tyler stated that he was on "team transitory".) Each of them followed 'best practices of public experts' (Points 1, 2, 3, and 4, listed today in Arnold's blogpost). I chose to believe Arnold. But why? (It turned out that Arnold was correct.)
Alas, there are incentives in the public forum, around polarization, for experts to line up on opposite sides of many questions. Deference is complicated! Of course, a layperson can choose to throw up their hands when fair-minded experts disagree.
Tyler is pretty open about the idea that you should lie to people on utilitarian grounds (Straussian, "read the room"). Perhaps he was just lying about being on team transitory for social desirability reasons.
It's an idea already explicit in Plato (Republic, 389b-c):
But further we must surely prize truth most highly. For if we were right in what we were just saying and falsehood is in very deed useless to gods, but to men useful as a remedy or form of medicine, it is obvious that such a thing must be assigned to physicians, and laymen should have nothing to do with it. The rulers then of the city may, if anybody, fitly lie on account of enemies or citizens for the benefit of the state; no others may have anything to do with it, but for a layman to lie to rulers of that kind we shall affirm to be as great a sin, nay a greater, than it is for a patient not to tell his physician, or an athlete his trainer, the truth about his bodily condition, or for a man to deceive the pilot about the ship and the sailors as to the real condition of himself or a fellow-sailor, and how they fare.
It is tempting to use trust networks, but it is very dangerous. I can't count how many times I have seen things I thought were credible fall apart because of people relying on people who rely on people, and so on. The financial markets are a good example. The movie The Big Short shows how everyone believed in everyone else's expertise: if the housing market were rotten, someone in this chain of experts would have pointed it out. And everyone thinks everyone else has done the homework. Or watch the new Netflix show about Bernie Madoff: "It can't be a Ponzi scheme. The SEC has investigated him, and if there were something rotten they would have found it. And Madoff is at the top of this and that institution, like Nasdaq." When this many experts start believing other experts, you get the illusion of a reliable institution. I call it the belief in the goodness of power.
It is of course not possible to verify every hypothesis about the world, but there are areas, like politics, where the risks are higher. I often find it useful to ask whether a conclusion or hypothesis feels good. That's a red flag. If someone says, "If people just talked to each other more, there would be less antagonism" - that may very well be true, but it feels good, right? When something feels good, people accept it. But it is often wrong.
That an expert can explain their work is the most important part. To make an informed decision, we don't need the details of HOW he got to his conclusion; we just need to know the options, the possible results, and what to measure.
A Real Expert (tm) would tell you: if we do nothing, then by time t the worst-case scenario would be N0 and the best case would be N1, and by some earlier time t1 < t we should be able to tell where we are going to end up by looking at a set of variables S. Same for interventions X, X1, X2, and so on.
Anyone who can't or won't give that kind of advice is not an expert but a political player/con man.
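Purely as an illustration, the advice format this commenter describes can be sketched as a small data structure. All names and numbers below are hypothetical, not drawn from the comment; it is just one way to make the "worst case, best case, checkpoint, indicators" shape concrete:

```python
from dataclasses import dataclass, field

@dataclass
class Scenario:
    """One course of action and its projected outcome range (illustrative only)."""
    intervention: str        # e.g. "do nothing", "X1"
    horizon: float           # time t by which the outcome resolves
    worst_case: float        # N0: worst plausible outcome by time t
    best_case: float         # N1: best plausible outcome by time t
    checkpoint: float        # earlier time t1 < t at which to re-assess
    indicators: list = field(default_factory=list)  # variables S to watch at t1

    def is_well_formed(self) -> bool:
        # The commenter's minimal test of real expertise: a bounded outcome
        # range, an early checkpoint, and named, measurable indicators.
        return self.checkpoint < self.horizon and len(self.indicators) > 0

# Hypothetical advice: a "do nothing" baseline plus one intervention X1.
advice = [
    Scenario("do nothing", horizon=90, worst_case=50_000, best_case=5_000,
             checkpoint=30, indicators=["hospitalization rate", "test positivity"]),
    Scenario("X1", horizon=90, worst_case=20_000, best_case=1_000,
             checkpoint=30, indicators=["hospitalization rate"]),
]
assert all(s.is_well_formed() for s in advice)
```

Advice that fails `is_well_formed` is, on this view, unfalsifiable: it names no early checkpoint and no observable indicators, so it can never be scored against reality.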
Phil Tetlock has a compatible view, which is that experts as such are relatively inaccurate forecasters, while what he calls "Superforecasters" (educated lay people) acting in groups tend to be more accurate forecasters because they consider a wider variety of information sources and viewpoints and pool their perspectives into a consensus.
Frank Diebold (U. Penn.) has a similar view about judgmental forecasts, for example, economists forecasting next year's GDP growth. He has shown empirically and theoretically that consensus forecasts tend to be more accurate over time than any individual's forecasts.
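The statistical core of that consensus result is simple: if individual forecast errors are roughly independent and unbiased, averaging them cancels much of the noise. A toy simulation (all numbers invented, assuming Gaussian idiosyncratic errors) illustrates the effect:

```python
import random

random.seed(0)
truth = 2.5                      # hypothetical "true" GDP growth, in percent
n_forecasters, n_rounds = 20, 500

individual_err = consensus_err = 0.0
for _ in range(n_rounds):
    # Each forecaster sees the truth plus independent idiosyncratic noise.
    forecasts = [truth + random.gauss(0, 1.0) for _ in range(n_forecasters)]
    consensus = sum(forecasts) / n_forecasters
    individual_err += sum(abs(f - truth) for f in forecasts) / n_forecasters
    consensus_err += abs(consensus - truth)

individual_err /= n_rounds
consensus_err /= n_rounds
# Averaging n independent errors shrinks them by roughly sqrt(n).
assert consensus_err < individual_err
```

The caveat, which Diebold-style results depend on, is independence: if all forecasters share the same blind spot, averaging averages the blind spot too.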
Maybe what we should take from the last couple of years is that 1) the good experts still get it wrong a lot…particularly when life is moving fast. 2) trusting the experts you have, even when wrong, is better than abandoning the system. That is, a society that calmly follows its shamans in times of stress maybe is better off than one that self-destructs. Eh?
If political power continues to concentrate, the potential harm from expert opinion will increase. If political power lessens in concentration, the potential harm from expert opinion will lessen.