8 Comments

"a minimum of tenfold the annual growth rate observed over the past century, sustained for at least a decade"

That statement is pretty ridiculous.


"a minimum of tenfold the annual growth rate observed over the past century, sustained for at least a decade"

How does he intend to measure that growth? GDP? I understand AI optimists may think we can produce twice as many cars, devices, and pieces of furniture, but then where are you going to put all those things? When are consumers going to use them? Are days going to get longer? Where is the energy needed for all those consumer goods coming from?

And about services, when are we going to consume all those services? Are days going to get longer? Will AI produce a new drug to allow us to sleep only 4 hours a day?

"how AI will impact education"

Probably in the same way the Singapore method impacted it. Another fad.


"How does he intend to measure that growth? GDP? I understand AI optimists may think we can produce twice as many cars, devices, furniture, but then where are you going to put all those things?"

The real value is not in material things. That's why so many of the world's wealthiest people are software people. Imagine the value of living 200 years with a body that's far better than your body ever was at any time in your life. Would that be valuable to you? Now imagine having an autonomous vehicle drive you 10 minutes to a place where you could immediately board a vehicle that travels at 500+ miles per hour, using *less* energy per passenger mile than any form of transportation that now exists. Would that be valuable to you? Now imagine living in a home that has robots to clean and repair and modify it at your every whim, charging a fraction of what humans would charge to do that work. Would that be valuable to you?

Etc. etc. etc.


"John Luttig writes,

the AI frontier is being driven by a much smaller fraction of the ecosystem than former technological shifts like the internet and software. … OpenAI and Anthropic combined employ fewer than 800 people.

...

During the early internet era of the mid-90s — and crypto in the early 2010s — very few people showed up in the first few years, but almost all of them made money.

AI seems like the inverse."

So does AI have more people involved, or fewer?

"Zvi Mowshowitz writes,

…If it is safe to use LLM-generated summaries of your work, your work does not need to be read."

First, not everyone knows everything "conventional." Second, I'd argue that most of what is new or non-conventional takes the conventional and tweaks it or adds to it a bit. I'd hope an LLM can find those differences. No?


Will LLMs eliminate jobs and tasks that produce written communication, or will the net effect look more like an arms race?

By analogy, did VisiCalc put thousands of bookkeepers and financial analysts out of work, or are there now 10 trillion times more spreadsheet calculations every day? 1,000x more people building spreadsheets?

For example, my friend writes emails to current and potential donors requesting donations to a music education charity. Does my friend use ChatGPT to write to 100x as many potential donors? Does my friend write unique letters to 100x as many donors?

For example, she recently generated a donation request email for a potential donor who loved T. S. Eliot with the prompt "You are T. S. Eliot. Write a 4-paragraph email to potentialDonorX requesting an $XX,XXX donation to charityX in the style of 'The Love Song of J. Alfred Prufrock.'"

Does that elicit 100x the value of donations?

What would a bank credit card offer sound like in the voice of Thomas Pynchon in the style of Gravity's Rainbow? Life insurance come-on by Marcel Proust?
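If the workflow really is "100x as many donors," it amounts to a loop over a donor list with a persona prompt per donor. Here's a minimal, hypothetical sketch using the OpenAI Python client; the donor fields, charity name, and model choice are placeholders I've assumed for illustration, not details from this thread.

```python
# Hypothetical sketch: batch-generating persona-styled donation asks.
# Assumes the OpenAI Python client (pip install openai) and an API key in the
# OPENAI_API_KEY environment variable; the donor data below is made up.
from openai import OpenAI

client = OpenAI()

donors = [
    {"name": "potentialDonorX", "favorite_author": "T. S. Eliot",
     "style_reference": "The Love Song of J. Alfred Prufrock", "ask": "$XX,XXX"},
    # ...hundreds more rows could go here
]

for donor in donors:
    prompt = (
        f"You are {donor['favorite_author']}. Write a 4-paragraph email to "
        f"{donor['name']} requesting a {donor['ask']} donation to charityX, "
        f"in the style of '{donor['style_reference']}'."
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    print(response.choices[0].message.content)
```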


"Zvi Mowshowitz writes,

"'The interesting claim is that if the LLM can understand your core thesis well enough to summarize it correctly, that means your idea was conventional, because the LLM would otherwise substitute the nearest conventional thing, ignoring what you actually wrote or didn’t write.

"'…If it is safe to use LLM-generated summaries of your work, your work does not need to be read.'"

If I were Zvi, I'd be very careful to see how LLMs summarize my work.


After more than two decades saying economic growth in the 21st century will be spectacular--reaching a per-capita GWP in the year 2100 of $10,000,000, in year 2000 dollars--I've just discovered that energy requirements may be a huge constraint. (This is a tremendously embarrassing finding for me, since energy analyses were a big part of what I did in my career.)

From my comment minutes ago on Marginal Revolution:

Mark Bahner 2024-04-08 08:36:28

2. Explosive growth from AI automation? (This paper is economically literate, and uses some simple models)

I hope to comment on this paper further, but I want to re-introduce an analysis that I presented in comments just a while ago. It seems like a very powerful argument *against* explosive growth. And that's shocking to me, because I've been saying that explosive growth will begin to occur in the 2020-2030 time frame for about two decades.

My analysis is this. The human brain takes about 15 watts. So the entire 8 billion human brains on earth take 120 gigawatts (15 watts times 8 billion people).

But computers are presently roughly two orders of magnitude *less* efficient than a human brain. So creating 8 billion human-brain equivalents in computing power would take 100 times 120 gigawatts = 12,000 gigawatts. That's approximately *4 times* current global electricity generation.

And that does not even include the fact that, from the way I look at it, real value in computers needs to include some ability to interact with the physical world. For example, the Tesla Optimus robot apparently uses 100 watts just sitting, and 500 watts when walking briskly. Let's call it 400 watts on average. So 8 billion of those is another 3200 gigawatts.
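Here is a minimal sketch of that arithmetic in Python, just to make the numbers easy to check. It uses the figures quoted above (15 watts per brain, a ~100x computer-efficiency gap, 400 watts average per robot); the ~3,400 GW figure for average world electricity generation is an approximate reference value added for comparison, not something from the original comment.

```python
# Back-of-the-envelope check of the power figures in the comment above.
BRAIN_WATTS = 15                  # approximate power draw of a human brain
PEOPLE = 8e9                      # 8 billion brain equivalents
COMPUTER_EFFICIENCY_GAP = 100     # computers ~2 orders of magnitude less efficient
ROBOT_AVG_WATTS = 400             # assumed average for a humanoid robot

brains_gw = BRAIN_WATTS * PEOPLE / 1e9               # 120 GW
compute_gw = brains_gw * COMPUTER_EFFICIENCY_GAP     # 12,000 GW
robots_gw = ROBOT_AVG_WATTS * PEOPLE / 1e9           # 3,200 GW

# Added reference point (approximate): world electricity generation averages
# roughly 3,400 GW (~30,000 TWh per year).
WORLD_ELECTRICITY_GW = 3_400

print(f"All human brains:            {brains_gw:,.0f} GW")
print(f"8B brain-equivalent compute: {compute_gw:,.0f} GW "
      f"(~{compute_gw / WORLD_ELECTRICITY_GW:.1f}x world electricity)")
print(f"8B humanoid robots:          {robots_gw:,.0f} GW")
```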

I'd be very interested if someone can point to how my calculations are off, or how my analysis is not sound. Right now, I think that the energy required for AI, and AI plus robotic bodies, is a fundamental constraint on explosive economic growth caused by AI.

P.S. The paper seems to blow off the energy required by AI, but right at this minute, I think blowing off the energy required by AI is clearly wrong.


P.S. I forgot to include this link to my prediction for economic growth in the 21st century on the Long Bets website:

https://longbets.org/194/

BET 194 DURATION 96 years (02005-02100)

“The world per-capita GDP in the year 2000 was approximately $7,200. The world per-capita GDP (in year 2000 dollars) will exceed $13,000 in the year 2020, $31,000 in 2040, $130,000 in 2060, $1,000,000 in 2080, and $10,000,000 in 2100.”
