We need a better way to evaluate the experts over the long term. Some of the most convincing liars on the planet are high-functioning psychopaths. Everybody trusts them, even the psychologists who study them and know full well that this is precisely what they should not do. The psychopaths know how to make the trust happen. And anybody who thinks they can 'just tell' when they are being lied to is likely to fall for the psychopaths and con-men every time. You have to go with the evidence, which is a toughie when there isn't a lot to be found (yet) and you have to make policy before you have enough.
One thing we would benefit from would be a requirement that experts -- anybody, really -- state, when they publish an opinion, how certain they are that what they are stating is true and correct. Or, in some cases, 'how big are the error bars?' This gives the non-expert a fair shot at determining which things are settled, which are mostly settled, and which are still very much an exploration of the unknown. It also gives us a better chance of knowing which experts are honest, and possessed of the basic humility we all want to find in those we trust. A certain amount of 'got it wrong' is not only fine but essential when you are thinking your way through the unknown. But you cannot improve by getting things less wrong -- by being what Nassim Taleb calls 'antifragile' and learning from your mistakes -- if you aren't calling out the mistakes in the first place. A consistent pattern of getting things very wrong, and of not learning from one's errors, whether out of malice or incompetence, is something we need to be able to recognize. And then we need to act on it, and strip these people of influence and trust.
The Spectator magazine in the UK ran a piece on 16 April 2020 entitled "Six questions that Neil Ferguson should be asked": https://www.spectator.co.uk/article/six-questions-that-neil-ferguson-should-be-asked They are good questions. And they demonstrate a pattern of being spectacularly wrong -- so wrong that this is the 'expert opinion' everybody should have known never to trust. Unless our institutions do a much better job of judging which so-called experts are worth trusting, which are rubbish, and which fit somewhere in between, there is no hope for the rest of us. And unless they do this by deferring to the evidence, and not to some form of 'Oxford physicist. Big Credential. Must be True!', we will continue to trust the wrong people.