Why Apple Intelligence won’t run on older iPhones, or Vision Pro [Update: Next year]
See the Vision Pro update at the end of the piece.
When asked why Apple Intelligence won’t be available on older iPhones, Apple has so far said that the chips simply aren’t powerful enough to provide a good experience: responses would take too long.
But many have asked: if their phones aren’t up to the task, why not just offload the work to Apple’s servers, aka Private Cloud Compute? After all, that’s what already happens with a lot of Siri requests today. We now have an answer to this …
John Gruber spoke with Apple about this.
One question I’ve been asked repeatedly is why devices that don’t qualify for Apple Intelligence can’t just do everything via Private Cloud Compute. Everyone understands that if a device isn’t fast or powerful enough for on-device processing, that’s that. But why can’t older iPhones (or in the case of the non-pro iPhones 15, new iPhones with two-year-old chips) simply use Private Cloud Compute for everything?
From what I gather, that just isn’t how Apple Intelligence is designed to work. The models that run on-device are entirely different models than the ones that run in the cloud, and one of those on-device models is the heuristic that determines which tasks can execute with on-device processing and which require Private Cloud Compute or ChatGPT.
Gruber acknowledges this likely isn’t the only reason; Apple is already having to provide a lot of server-side computing to handle requests that can’t be processed on-device, and the server demand would be massively higher if it had to handle all requests from older phones. But the reason given does sound plausible.
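For illustration only, here’s a minimal Swift sketch of the design Gruber describes: an on-device component that decides where each request runs. Every type, name, and threshold below is invented for this example; Apple’s actual heuristic is a trained on-device model, not public API, and this is just a way to picture the architecture.

```swift
// Purely hypothetical: none of these types exist in Apple's SDKs.
// They mirror the idea that an *on-device* model routes each request.

enum ExecutionTarget {
    case onDevice             // handled by a local model
    case privateCloudCompute  // escalated to Apple's servers
    case chatGPT              // handed off (with user permission)
}

struct RequestRouter {
    /// Stand-in for the on-device heuristic. In the real system this
    /// decision is made by a trained model, not a hand-written rule.
    func route(_ request: String, estimatedComplexity: Double) -> ExecutionTarget {
        if estimatedComplexity < 0.3 {
            return .onDevice
        } else if estimatedComplexity < 0.8 {
            return .privateCloudCompute
        } else {
            return .chatGPT
        }
    }
}

// The key point: the router itself runs on-device. A device that can't
// run the local models can't even make this decision, which is (per
// Gruber) why older iPhones can't simply send everything to the cloud.
let router = RequestRouter()
let target = router.route("Summarize this email", estimatedComplexity: 0.5)
print(target)  // prints "privateCloudCompute"
```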
One other surprising revelation is that Apple Intelligence won’t be supported on Vision Pro, even though its M2 chip is powerful enough to run it. The reason, says Gruber, is that the chip is already running close to capacity, leaving it without enough headroom for Apple Intelligence.
According to well-informed little birdies, Vision Pro is already making significant use of the M2’s Neural Engine to supplement the R1 chip for real-time processing purposes — occlusion and object detection, things like that. With M-series-equipped Macs and iPads, the Neural Engine is basically sitting there, fully available for Apple Intelligence features. With the Vision Pro, it’s already being used.
Again, the explanation makes sense, though it’s a great shame, since the platform seems almost tailor-made for AI, and is the precursor to an eventual Apple Glasses product that will surely include Apple Intelligence.
Another snippet from Gruber’s round-up: Prepare to be consistently annoyed by the permission request for handoff to ChatGPT. At least as things stand in Apple’s internal versions, there’s no Always Allow option.
Some people are going to want an “Always allow” option for handing requests to ChatGPT, but according to Apple reps I’ve spoken with, such an option does not yet exist.
I suspect that will change, as it’s quickly going to become a constant irritation, but Apple likely wants to play it safe on the privacy front in the early days of the service.
Update: Mark Gurman says that Apple is aiming to bring Apple Intelligence to Vision Pro next year, though there’s no word on how the company will overcome the challenge referenced by Gruber.
Apple Intelligence is coming to the Vision Pro. When Apple Intelligence was unveiled earlier this month, it was only promised for the Mac, iPhone and iPad. But there’s another device primed to get it: the Vision Pro headset. I’m told that Apple is actively working on bringing the features to the device, but it won’t happen this year.