15 Comments
Alex F:

Why not just ask AI in an existing app? I asked Perplexity:

To leave a WhatsApp group, follow these steps:

1. Open WhatsApp and go to the group you want to leave.

2. Tap the group name at the top of the screen.

3. Scroll down to the bottom of the menu.

4. Tap “Exit Group” and confirm that you want to leave.

The group admin will be notified, but other members won’t receive a notification unless they check the member list. Your chat history will remain on your device, but you won’t be able to send messages to the group anymore.

Andy G:

AK has previously been spot on that AI has huge potential to improve UIs.

Here, however, he misses the mark (I say as a former software engineer). With the *possible* exceptions of Apple, Google and Microsoft themselves - by embedding within their respective operating systems - it will not be easy to develop that “I want to…” app.

At least if you want the app to *perform* the function as opposed to simply telling you how to perform the function. To paraphrase Jerry Seinfeld, it’s the *performing* of the function that’s the hard part.

That's because it gets at the UI of each of these individual applications and websites, and navigating interfaces designed for human users is not generally something an external, third-party app can easily be programmed to do.

When app developers change their own apps to take advantage of AI for their own user interfaces - AK’s earlier observation - that’s when we’ll get what he wants.

Now one can certainly imagine a new - or existing improved - operating system that made interfacing with applications for a centralized AI/app easy to do. But that again requires new/rewritten applications.

Kurtis Hingl:

“The purpose of a system is what it does”

I imagine many of these frictions are at least partly strategic. Does WSJ want letters to the editor via one-click?

Bill Byrd:

Most of my Grok queries start with “explain xyz”. Then it tells me exactly what I want to know about xyz. Finding a user manual from a bankrupt company. Fixing an obscure AirPods issue. Etc. Like magic.

Brettbaker:

I think we all know the winning system will be low-cost... and not much of an improvement over the current system.

Helen Dale:

Please make this app happen. I am rendered incompetent by something like the WhatsApp situation you describe at least twice a week, and often more frequently.

Candide III:

"I want to" is not, properly speaking, an app like WhatsApp. WhatsApp is a tool for communication: you type a message and it sends it, no questions needed or asked. "I want to", on the other hand, is not a tool. It's a secretary. AI or no, a secretary can misunderstand you, do not quite the thing you wanted, neglect to clarify, misjudge what's important in what you told it to do, and so on. A tool, on the other hand, cannot possibly misunderstand you. You might not know how to handle the tool properly, or the tool itself may be badly designed and/or end up with a bad design even if the initial design was good (as is the case with many software user interfaces) such that it's awkward to handle or to learn. However, if you do know how to handle the tool properly, it does exactly what you predict, because you can form a mental model of it that's close enough to perfect as to make no difference, and _that_ is because tools begin their life as mental models. You can't do that with a person (living or AI). And that's part of what is fun about interacting with people!

MikeDC:

Many menu systems are clearly set up to gatekeep. They funnel everyone into where the company wants you to go (e.g. "Pay your bill") and don't offer much else.

We shouldn't expect AI help to be any different because the intent will be the same.

stu:

I'd be surprised if AI has enough data to figure out some of these actions. Also, doesn't AI have a tendency to give varied answers?

Ian Fillmore:

Humans, LLMs, and traditional computers each have comparative advantages. As a trivial example, if I ask ChatGPT to run a regression, it simply generates Python code that runs the regression and executes it. I can imagine a similar solution for database queries, etc.

Remember, LLMs are language experts. They excel at producing and interpreting language (both human and computer languages). I think the path forward is to let them specialize in that and not try to make them be something they're not.
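[A sketch of the kind of code an LLM might generate for a "run a regression" request. This is an illustration, not the actual output of any model; it uses only NumPy, with synthetic data whose true slope is 2 and intercept is 1.]

```python
import numpy as np

# Synthetic data: y = 1 + 2x + small noise.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=100)

# Ordinary least squares: design matrix with an intercept column.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"intercept ≈ {beta[0]:.2f}, slope ≈ {beta[1]:.2f}")  # near 1 and 2
```

The point is that the LLM never does the arithmetic itself; it writes language (here, Python) and hands execution to a conventional interpreter.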

Lupis42:

Many modern software interfaces are mostly about 1) aesthetics, 2) driving internal metrics in directions that the company currently pays attention to, and 3) putting friction in the way of requests which impose marginal costs on the provider.

Unfortunately, this implies most companies will not provide helpful AI, though external tools may obviate more of the interface for you.

Tom Grey:

I thought you would want to tie the LLM/AI problems and the fertility crisis together with a link to this MIT story:

https://archive.is/Mwuee

“My sex doll is mad at me”.

Coming soon, in an unmarked brown package.

MrP:

Do not too quickly attribute to user error that which can be adequately explained by poor design. Some version of this adage will likely follow us into the AI age. I am not paying Sam Altman to stop the ensh-tification of AI, just to slow it down. I accept that it is the natural progression of all platforms and services.

Lee Bressler:

I’ve started using Perplexity for almost everything I used to use Google for. It is far superior.

Also I’m glad you read the Benedict Evans interview!

David L. Kendall:

Have you tried asking AI to do economic analysis using the market model of supply and demand? I have; LLMs pretty much can't do it.
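[For context on what the textbook exercise involves: the computation itself is trivial to express in code; the commenter's point is that LLMs stumble on the economic reasoning, not the arithmetic. A minimal sketch with hypothetical linear demand and supply curves:]

```python
# Linear market model: demand Qd = a - b*P, supply Qs = c + d*P.
# Equilibrium is where Qd == Qs, so P* = (a - c) / (b + d).
def equilibrium(a, b, c, d):
    p_star = (a - c) / (b + d)
    q_star = a - b * p_star  # quantity demanded at the equilibrium price
    return p_star, q_star

# Example: Qd = 100 - 2P and Qs = 10 + 4P clear at P* = 15, Q* = 70.
p, q = equilibrium(100, 2, 10, 4)
print(p, q)  # 15.0 70.0
```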
