experts tend to be better than non-experts at correctly assessing difficult issues, due to their obvious cognitive advantages over lay people. E.g., the experts typically have greater than average intelligence, much greater knowledge about the issue in question, and have also spent more time thinking about the issue than you. That’s why they’re called “experts”.
He is arguing that we should look to experts rather than think purely for ourselves. That seems right. But I think that the advice to think for yourself is meant to steer people away from a heuristic of trusting high-status people who are not necessarily experts. Think of the athlete who appears in a commercial for a financial service.
Speaking of the default heuristic, in a podcast with Razib Khan, Oliver Traldi responds to the question of why people’s beliefs on issues shift when their political party leaders shift.
it could be that Democrats trust Obama and Republicans trust Trump. So when they hear that Obama said something, they say, Okay, I trust Obama. So I’m going to shift my view. . .And trusting other people is something that we have to do. . .We trust each other as experts or peers, or as people who are kind of stating things that they’ve come into contact with personally. So another explanation, rather than it being purely tribal, is that **it could be that people shift based on sort of getting signals which they trust from prominent members of the party that they support**. . .
Note the sentence that I put in bold. My way of putting it is that we decide what to believe by deciding who to believe.
Philosophers too often treat epistemology as a matter of the lone individual confronting the world and deciding what to believe: “I have to use logic and my own observations.”
But, as Huemer points out, we cannot operate that way. There is no way that I can personally verify every hypothesis about the world. Instead, I decide which people to trust about particular topics.
When people change their minds, I am skeptical that they give an accurate account of why they did so. In fact, sometimes they will just flat-out deny changing their mind. “Oh, I was always OK with gay marriage.” “I never thought it was a good idea to close schools.”
I believe that a fundamental problem in our society today is that the heuristic “Trust the person with the credential” has broken down. I cannot trust the person with the Ph.D. to follow the standards of rigor that I once associated with academia. I cannot trust mainstream journalists to follow the standards of objectivity that I once thought they adhered to. I cannot trust public officials to adhere to what was once my conception of the rule of law.
The discussion of these issues in the comments section of Huemer’s post is quite good. One commenter writes,
1. Experts don’t mind reasonable questions.
2. Experts anticipate and respond to reasonable objections.
3. Experts can show their work.
4. Experts can explain part of their work to a reasonably intelligent layman. A theoretical physicist couldn’t get someone to truly understand relativity in a day, for example, but could explain how all observers measure light as traveling at the same speed. They could then explain the implications and what evidence we have for the truth of relativity.
Huemer disagrees with (4)—some experts have a hard time talking to people outside of their field. Venture capitalist Michael Gibson strongly values (4), but perhaps that is because he is in the business world.
(2) and (3) are points that you have seen me make before, multiple times.
Note that I often read the comments on the Substacks that I follow. The ratio of substance to trolling is very high! That might not last as more people join Substack, but if a writer’s comment section starts to become a sewer, he or she can always limit comment privileges to paid subscribers.
What will replace the credential heuristic? One guess is that a network heuristic will emerge. I certainly trust Razib Khan’s expertise on population genetics. And I can probably add to my trust network many of the people that Razib trusts. And perhaps the people that those people recommend, and so on, although my trust level probably decays as the network distance from Razib goes up.
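To make the decay idea concrete, here is a minimal sketch of how trust might propagate through an endorsement network, with each hop away from a trusted seed discounting trust by a fixed factor. This is purely my own illustration: the graph, the names, and the decay factor of 0.5 are all hypothetical.

```python
from collections import deque

def propagate_trust(graph, seed, decay=0.5):
    """Breadth-first walk outward from a trusted seed; each hop
    multiplies trust by `decay`, so trust fades with network distance."""
    trust = {seed: 1.0}
    queue = deque([seed])
    while queue:
        person = queue.popleft()
        for endorsed in graph.get(person, []):
            if endorsed not in trust:  # first (shortest) path sets the trust level
                trust[endorsed] = trust[person] * decay
                queue.append(endorsed)
    return trust

# Hypothetical endorsement graph: each person lists whom they vouch for.
graph = {
    "razib": ["geneticist_a", "geneticist_b"],
    "geneticist_a": ["postdoc_c"],
    "postdoc_c": ["blogger_d"],
}

print(propagate_trust(graph, "razib"))
# {'razib': 1.0, 'geneticist_a': 0.5, 'geneticist_b': 0.5,
#  'postdoc_c': 0.25, 'blogger_d': 0.125}
```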
My idea for replacing higher education incorporates a network heuristic.
Another commenter writes,

We need a better way to evaluate the experts over the long term. Some of the most convincing liars on the planet are high-functioning psychopaths. Everybody trusts them, even the psychologists who study them and know full well that this is precisely what they should not do. The psychopaths know how to make the trust happen. And anybody who thinks they can 'just tell' when they are being lied to is likely to fall for the psychopaths and con men every time. You have to go with the evidence, which is a toughie when there isn't a lot to be found (yet) and you have to make policy before you have enough.
One thing we would benefit from is if experts -- anybody, really -- were required to state, when they publish an opinion, how certain they are that what they are stating is true and correct. Or, in some cases, 'how big are the error bars?' This gives the non-expert a fair shot at determining which things are settled, which are mostly settled, and which are still very much a matter of figuring out the unknown. Also, at that point we have a better chance of knowing which experts are honest and possessed of the basic humility we all want to find in those we trust. A certain level of 'got it wrong' is not only fine, but essential when you are thinking your way through the unknown. But you cannot improve by getting things less wrong, by being what Nassim Taleb calls 'antifragile' and learning from your mistakes, if you aren't calling out the mistakes in the first place. A consistent pattern of getting things very wrong, and not learning from one's errors whether out of malice or incompetence, is something we need to be able to recognize. And then we need to act on it, and strip those people of influence and trust.
The Spectator magazine in the UK ran a piece on 16 April 2020 entitled "Six questions that Neil Ferguson should be asked" (https://www.spectator.co.uk/article/six-questions-that-neil-ferguson-should-be-asked). They are good questions. And they point to a pattern of being spectacularly wrong, so wrong that this was the 'expert opinion' that everybody should have known never to trust. Unless our institutions do a much better job of judging which so-called experts are worth trusting, which are rubbish, and which fit somewhere in between, there is no hope for the rest of us. And unless they do this by deferring to the evidence, and not to some form of 'Oxford physicist. Big Credential. Must be True!', we will continue to trust the wrong people.
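The error-bars suggestion above lends itself to a simple scoring rule. Here is a minimal sketch, entirely my own illustration with made-up numbers, of how stated confidence could be checked against outcomes using a Brier score, which rewards honest calibration over confident bluster.

```python
def brier_score(forecasts):
    """Mean squared gap between stated confidence and outcome (0 is perfect)."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Hypothetical records. Each tuple pairs an expert's stated probability
# that a claim was true with what actually happened (1 = true, 0 = false).
records = {
    "hedged_expert": [(0.7, 1), (0.6, 1), (0.4, 0), (0.8, 1), (0.3, 0)],
    "always_certain": [(0.99, 1), (0.95, 0), (0.99, 0), (0.90, 1), (0.95, 0)],
}

for name, forecasts in records.items():
    print(f"{name}: Brier score = {brier_score(forecasts):.3f}")
# The hedged expert scores far better (0.108 vs. 0.559), even though the
# certain-sounding one appears more authoritative claim by claim.
```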
Another commenter writes,

I also read Huemer's blog, but I did not bother to comment on this post.
What Huemer misses is the expertise on... experts. There is a pretty substantial literature on expert opinion and it is not very supportive of the heuristic.
I'm thinking of people like Kahneman and Tversky, Hastie and Dawes, and Tetlock.
Sure, if what you want from experts is recitations of noncontroversial facts or calculations using the tools of their field, they're great. Also, if you have an expert who repeatedly makes predictions about the same kind of situation and then gets feedback on the outcomes, they're pretty good (the classic example is pathologists and cancer biopsies).
But this is not in fact what most arguments about deferring to experts are about. Rather, they are about claims by experts concerning interventions (or implied interventions) into complicated situations, beyond any straightforward application of tools with well-established accuracy, and often extending across domains. Moreover, these claims are often framed in terms of certainty or near-certainty.
Just think about statements from well-credentialed economists about the economy. Do you think following their (implied) prescriptions, based on their "well-established" models, is advisable?
If not, why would you think other fields are any different?
How did those predictions from epidemiologists and doctors on covid work out? The standard reading of the literature from people like Kahneman and Tversky, Hastie and Dawes, and Tetlock would have said this was exactly the type of situation where "experts" would be unreliable. And they were.
Ironically, Huemer's epistemological model of expertise is oversimplified, so it gets the answer wrong. A common problem with experts in many fields.