Of course they “want” it to, just like they want Siri and everything else to be computed locally, since that's best for privacy and offline performance. Whether they can compete with a cloud AI, time will tell. It will presumably be far more limited than GPT-4, but hopefully light years better than Siri, which is what matters most.
Also light years ahead when cloud access isn't available or is spotty: in an elevator, in the middle of a lake, in a parkade, or when travelling with cellular turned off and only Wi-Fi on.
> Whether they compete with a cloud AI time will tell.
But they're not competing with cloud AI. Why would a person need to go to the cloud to set a reminder or download an app?
They're competing against the current local assistant, Siri.
Large models are great, but they can't fit in 8 or 16 GB of RAM. And that's a very big deal.
They don't need to put all of the world's information on-device, just the relevant bits. It doesn't need to know every celebrity's full history, for example.
You can have the basic stuff on-device with the "smarts" of an LLM that can hold a conversation with the user and keep context from previous questions.
The other stuff can be fetched from the cloud (with the user's permission, of course) and optionally saved locally.
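The hybrid flow described above can be sketched roughly like this: try the on-device store first, fall back to the cloud only if the user has opted in, and optionally cache the result locally. Everything here is illustrative (the class, the `cloud_lookup` stub, and the permission flag are all made up for the sketch), not a real Apple or Siri API.

```python
def cloud_lookup(query: str) -> str:
    # Stand-in for a network call to a cloud model/service.
    return f"cloud answer for {query!r}"

class HybridAssistant:
    """Hypothetical assistant that prefers on-device answers."""

    def __init__(self, allow_cloud: bool):
        self.allow_cloud = allow_cloud
        self.local_facts: dict[str, str] = {}  # small on-device knowledge store

    def ask(self, query: str) -> str:
        # 1. Try on-device first (private, works offline).
        if query in self.local_facts:
            return self.local_facts[query]
        # 2. Fall back to the cloud only if the user opted in.
        if self.allow_cloud:
            answer = cloud_lookup(query)
            self.local_facts[query] = answer  # optionally save locally
            return answer
        # 3. Otherwise admit the limits of the local model.
        return "Sorry, I can't answer that offline."

offline = HybridAssistant(allow_cloud=False)
print(offline.ask("capital of France"))  # prints "Sorry, I can't answer that offline."
```

The point of the design is that the permission check sits in front of every network call, so "works in an elevator" and "nothing leaves the device without consent" fall out of the same branch.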