If you think biological neural networks are doing something special — that they contain some secret sauce that computers lack — you’ll tend to have strong priors against AGI being around the corner. In contrast, if you think artificial neural networks capture certain universal properties of biological neurons, if only in a lower-dimensional form, then you’ll tend to have shorter AI timelines, as the differences between the brain and a deep reinforcement-learning network become more a matter of degree than kind.
I am in the “something special” camp. What I read about the brain suggests it has specialized modules, not just an undifferentiated mass of neurons.
A lot of my hope is that the scenarios in question simply do not come to pass because systems with the necessary capabilities are harder to create than we might think, and they are not soon built. And I am not so worried about imminently crossing the relevant capability thresholds. Given the uncertainty, I would much prefer if the large data centers and training runs were soon shut down, but there are more limits on what I would be willing to sacrifice for that to happen.
He is a less radical AI doomer than Eliezer Yudkowsky, but still a doomer.
I think machine intelligence and biological intelligence will end up being analogous to machine flight and biological flight. The artificial kind effectively mimics the core uses of the biological kind, but does so in a lower-dimensional way that never fully assembles the details that make the living entity unique.
We'll get AIs that replicate the human mind across many different domains, but the lack of whole-body, intra-cellular systems will prevent them from ever presenting themselves as fully human.
This is mostly good news, since it means AI doomers are way off and their most fearful prognostications will end up looking silly. It does mean society will face a whole host of institutional dilemmas as a result of AI, but they are of the more prosaic kind: widening gaps in worker and firm productivity, incentive structures that can be regularly gamed with the use of intelligent programs, and the acceleration of the Baumol effect.
I wonder whether the “something special” argument makes religious people less likely to get excited by the hype or doomerism of AI. It does for me, but I don’t know if that’s a general trend.