12 Comments
Yancey Ward's avatar

Arnold posting from the future- I like it.

Handle's avatar

We really are living in the future!

gas station sushi's avatar

The more likely scenario: Arnold is probably still using a mediocre Windows trackpad on some budget PC and he inadvertently hit the post button. #MacbookNeo

Tom Grey's avatar

Maybe Arnold has an OpenClaw aigent doing spell, grammar, and link checking on his article, as well as the scheduling, and his claw AI messed up.

Though simply hitting the post button instead of schedule remains most likely.

Ken in MIA's avatar

“What do you think would emerge on the computer screens if scientists used the vast computing power of AI to copy the human mind? Virtual zombies? Entities that are neurologically human in every material sense of the word, but lack a soul?”

The brain is more than electrical signals. How, for example, would those scientists apply the effects of cortisol or oxytocin or adrenaline to computer chips?

gas station sushi's avatar

GABA seems like the most likely neurotransmitter to mimic.

Ken in MIA's avatar

I guess they'll just vibe code the whole thing.

BenK's avatar

2nd Amendment Right to AI.

Chartertopia's avatar

That was my first reaction to claiming AI is a weapon, and the implication that it is too dangerous to let the general public use.

"God created man, and Sam Colt made them equal."

Before Sam Colt's revolver, and before practical single-shot firearms, self-defense with swords and knives required too much training to be feasible for peasants, and swords were expensive too. Democratization of weaponry is not a bad thing. AI might not be thought of as a personal defense weapon the way firearms are, but it is one when defending against government.

BenK's avatar

AI should be thought of as an individual's first line of defense against paperwork. Any use of AI for gov't paperwork should be considered privileged as well.

Tom Grey's avatar

Noah is sad about being so clear about AI as a weapon, and about a private company vs. the government when it comes to controlling weapons.

Ryan’s work with his OpenClaw, named Morpheus, is inspiring and frightening: imagining hundreds of terrorists with thousands of aigents deliberately used to disrupt various processes seems like an inevitable train wreck coming. (Day 11 of the current Iran thing/war, and I’m so glad the Iranians didn’t get aigents by the thousands. I fear China and Russia more.)

If effective security measures aren’t taken, which seems likely, then a big aigent attack will probably succeed, to be followed by much stronger security measures: huge AI-based mass surveillance of everyone who uses AI, and worse.

If some drone swarm is used against the US, causing deaths among US military personnel through successful autonomous AI drone/bot actions, I’d strongly bet on the DoW (DoWar) getting cutting-edge AI war powers.

The push by Boris to clearly document the plan before the coding is excellent advice in general, and looks like a great specific recipe for getting good results.

Great links, thanks. Glad to get it a few days early.