LLM links
Ethan Mollick favors the paid version of ChatGPT; the Zvi reacts to Google Gemini; Chris Mims on AI and electricity demand; Tim B. Lee reacts to Google Gemini
GPT-4 is the most full-featured model. It has access to web browsing (so it is no longer stuck with just the knowledge it had at training), it is multimodal (which means it can “see” and listen to speech), it can create images, it can code and do data analysis, and it can talk back to you (at least in the phone app). There are lots of other nice features, like the option for a privacy mode that doesn’t share your data. It also has enough memory (its “context window”) that it can do pretty complex tasks.
…When I speak in front of groups and ask them to raise their hands if they used the free version of ChatGPT, almost every hand goes up. When I ask the same group how many use GPT-4, almost no one raises their hand. I increasingly think the decision of OpenAI to make the “bad” AI free is causing people to miss why AI seems like such a huge deal to the minority of people who use advanced systems and elicits a shrug from everyone else.
Use GPT-4.
I would add that GPT-4 allows you to experiment with creating GPTs, like my op-ed essay grader. I think that GPTs are to AI what websites are to the Internet or what apps are to smartphones. That is, they will soon be ubiquitous and we will forget how we ever got along without them. People who develop outstanding GPTs will become rich and famous. If you are a late adopter, good luck to you.
GPT-4 costs $20 a month for a subscription, which is why people don’t use it. It’s a penny-wise, pound-foolish decision. Meanwhile, Mollick reports,
ChatGPT Plus subscriptions are temporarily not being sold, ChatGPT seems to be suffering from performance issues and frequent outages, and a bunch of new ways of using the system have become available. I would still recommend getting Plus, but you can’t.
Zvi Mowshowitz collects people’s reactions to the latest LLM from Google. The Zvi himself seems to be underwhelmed.
To me this is a strange combination of the impressive parts already having been ‘priced into’ my world model, and the new parts not seeming impressive.
So I’m probably selling it short somewhat to be bored by it. If this was representative of a smooth general multimodal experience, there is a lot to explore.
For the WSJ, Chris Mims writes,
…which has already agreed to sell Microsoft nuclear power for its data centers, projects that AI’s demand for power in the U.S. could be five to six times the total amount needed in the future to charge America’s electric vehicles.
Alex de Vries, a researcher at the School of Business and Economics at the Vrije Universiteit Amsterdam, projected in October that, based on current and future sales of microchips built by Nvidia specifically for AI, global power usage for AI systems could ratchet up to 15 gigawatts of continuous demand. Nvidia currently has more than 90% of the market share for AI-specific chips in data centers, making its output a proxy for power use by the industry as a whole.
That’s about the power consumption of the Netherlands, and would require the entire output of about 15 average-size nuclear power plants.
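The quoted comparisons can be sanity-checked with a quick back-of-envelope calculation. The inputs below are assumptions drawn from the quote itself (15 GW of continuous demand, roughly 1 GW for an average-size nuclear plant), not precise data:

```python
# Back-of-envelope check of the quoted AI power figures.
# All inputs are rough assumptions taken from the excerpt above.

ai_demand_gw = 15        # de Vries's projected continuous demand for AI systems
hours_per_year = 8760    # 365 days * 24 hours
plant_size_gw = 1.0      # assumed output of an "average-size" nuclear plant

# Continuous demand in GW, run for a year, gives annual energy in TWh.
annual_twh = ai_demand_gw * hours_per_year / 1000

# Number of average-size plants needed to cover the continuous demand.
plants_needed = ai_demand_gw / plant_size_gw

print(f"{annual_twh:.0f} TWh/year, about {plants_needed:.0f} one-GW plants")
```

That works out to roughly 130 TWh per year, which is indeed in the neighborhood of the Netherlands’ annual electricity consumption, and the 15-plant figure follows directly from dividing 15 GW of demand by ~1 GW per plant.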
Back to Google Gemini. Tim B. Lee is also “meh.” He goes even further.
My guess—and at this point it’s only a guess—is that we’re starting to see diminishing returns to scaling up conventional LLMs. Further progress may require significant changes to enable them to better handle long, complex inputs and to reason more effectively about abstract concepts.
I use ChatGPT-4; you get 40 questions every 3 hours, which is more than I need. It really is significantly better than 3.5.
You're underestimating the learning curve for non-technical people. I still don't get image generation AI at all - all my graphics use Canva templates. And even for writing, it took a lot of tries before it became useful to me.
"AI’s demand for power in the U.S. could be five to six times the total amount needed in the future to charge America’s electric vehicles."
That's good to know. Seems like a "selling shovels in a gold rush" piece of info for potential AI investors.