Apple Intelligence is cool, useful, and perhaps most importantly, charmingly unfinished. Unlike some other AI projects, this upgrade to the software that powers iPhones, iPads, and Macs does not appear to threaten our very existence. Its standout features are supposed to include privacy and a Siri that actually works. But none of it works quite yet, and despite its imminent launch, it probably won’t for many months.
At this year’s annual iPhone event, Apple revealed that Apple Intelligence, announced months earlier, will finally arrive in October. It will only work on the latest iPhone models, including the iPhone 15 Pro, as well as Macs and iPads with M1 chips or better. It also won’t ship as a finished product. When you update your devices with Apple Intelligence in a few weeks, you’ll get a beta version that may or may not work better than the beta version I’ve been testing on my phone and laptop for the past couple of weeks.
Even through the glitches of unfinished software, however, I can see how Apple Intelligence will change the way I use my iPhone and my Mac. It’s a subtle but meaningful shift, though I doubt any of these new AI-powered habits will change my life, at least not in the next year or so.
Before I dig into what it’s like to use Apple Intelligence, let’s review Apple’s surprisingly sober promises about the product. Apple Intelligence is not designed to blow you away. Apple is calling it “AI for the rest of us,” which translates as “AI that won’t frighten you” when you consider the competition. Google’s Gemini got in trouble earlier this year after its image generator produced historically inaccurate pictures in an overzealous push for diversity. ChatGPT has been scaring people pretty much since it launched in the fall of 2022. Meanwhile, experts have said that the steady march of ultra-capable AI technology, which we don’t fully understand and which devours vast amounts of resources, might kill us all.
So yes, I personally would like the diet version of that. I am the rest of us. Apple Intelligence is for me. It’s just too bad I won’t really get to use it all that soon.
Apple Intelligence is slow and steady and definitely not scary
The new Apple Intelligence technology is baked into the latest versions of iOS, iPadOS, and macOS. When you update your software, you’ll get the option to turn it on, and at that point, you might not even notice it because some of the features are buried in menus. When you find them, though, they sort of work!
In its iPhone 16 announcement on Monday, Apple described the four pillars of Apple Intelligence: language, images, action, and personal context. The first two refer to features that exist in a lot of generative AI software. Language refers to Apple Intelligence’s ability to read and summarize things like your email inbox and notifications as well as rewrite stuff you write in apps like Notes and Pages. Images refers to its image-editing features, like Clean Up, which lets you remove objects from photos.
All of these features were available in the developer beta I tested. (Betas are pre-release versions of software, distributed to outside testers, in this case app developers, so the company can find bugs before the public launch.) The new Apple Intelligence features are fine. The summaries are adequate if occasionally error-prone. The email summaries were the first Apple Intelligence feature I noticed, although you won’t see them if you use the Gmail app. The Writing Tools feature can do a lot more, provided you think to highlight the text you want to fool with and then tap to bring up its menu. There you can do some pretty simple rewriting of the text to make it sound friendlier or more professional. You can also record meetings or phone calls and then get Apple Intelligence to transcribe and summarize what was said. None of this is revolutionary, but even basic writing tools can save you time.
On the image front, the Clean Up feature is neat, although very similar to what you’ve been able to do with the Magic Eraser in Google Photos for months. Once you activate the Clean Up tool, you literally use your finger or mouse to erase the part of the photo you want gone and it magically disappears. It did, however, leave my photos looking obviously manipulated. Removing a swan from a picture of my wife and daughter by a pond made the water look doctored. But at least that gets around the problem of AI creating fake images you can’t tell are fake.
There are two more image-generation features — one called Image Playground that lets you create illustrations and one called Genmoji that lets you make your own custom emoji — that will roll out “later this year and in the months following,” according to Apple.
That disclaimer applies to the other two pillars of Apple Intelligence: action and personal context. Generally speaking, these vague terms just refer to the new Siri and how it can do things and know more about you. The example Apple continues to offer for this later phase of Apple Intelligence is that you’ll be able to ask Siri to send photos of a specific group of people at a specific event to a specific person (e.g., “Text Grandma the pictures of our family from last weekend’s barbecue”) and Siri will just do it. You can also now type your Siri request into a new menu that you pull up by double-tapping the bottom edge of the screen. This is a game changer for people like me who don’t like talking to computers in public.
I have no idea if the photo trick works because these more advanced Siri features were not available in the version of Apple Intelligence I tested, and it’s not clear when they will be. But the new Siri, or at least the initial features Apple has revealed so far, is better for sure. It can keep track of context if you ask follow-up questions after an initial query, and it will understand what you’re saying if you stutter or change your mind. That’s not a revolution in natural language processing, but it is a step in the right direction for the famously limited and clumsy Siri.
Once Apple Intelligence is released to the public, however, we’ll also start to see what third-party apps do with the new Siri functionality. “Siri will gain screen awareness,” Apple senior vice president Craig Federighi said at Monday’s event. “It will be able to take hundreds of new actions in your apps.”
In other words, Siri will know what you’re doing on your phone when you make requests and act on them — hence the buzzwords “action” and “personal context.” But again, these more advanced features, cool as they sound, are not ready yet.
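How will apps plug into this? Apple says these in-app actions build on its existing App Intents framework, which lets developers describe what their app can do in a form Siri can discover and invoke. Here’s a minimal sketch of what that looks like; the coffee-ordering intent and its parameter are hypothetical, not examples from Apple:

```swift
import AppIntents

// Hypothetical example: a coffee-ordering app exposing one action to Siri.
// App Intents is Apple's real framework; the specifics below are invented.
struct OrderCoffeeIntent: AppIntent {
    static var title: LocalizedStringResource = "Order Coffee"

    // Siri can fill this in from what you say (or type).
    @Parameter(title: "Drink")
    var drink: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's actual ordering logic would run here.
        return .result(dialog: "Ordered a \(drink) for pickup.")
    }
}
```

With hundreds of apps declaring actions like this one, the idea is that Siri can eventually chain them together based on what’s on your screen and what it knows about you.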
We have to wait for the best features. That’s a good thing.
It would be a gross overstatement to say that Apple Intelligence has transformed the way I use my iPhone and MacBook. Because the AI-powered features are so far limited and out of view, I actually forget they’re there. It’s also worth emphasizing that I’m testing an unreleased version of the software. Although it probably looks very close to what Apple will release next month, the version of Apple Intelligence I’ve been using is buggy and unfinished. Apple will work out many of the bugs in the weeks to come. However, it will not release a finished product in October.
Apple hasn’t said when Apple Intelligence will graduate from beta. It could be years. After all, Google left Gmail in beta for five years before shedding the label. And as long as Apple wants to distance itself from any mistakes Apple Intelligence makes — and generative AI makes mistakes, namely hallucinations, essentially by design — we should expect the label to stick around.
One potential limitation Apple faces in its quest to bring more advanced AI features to its users is its commitment to privacy. I mentioned privacy earlier as a key feature, although I haven’t described what it looks like because it’s invisible. While AI services like Google Gemini and OpenAI’s ChatGPT require sending vast amounts of your data to servers in the cloud, Apple, which is famously serious about privacy and security, promises its AI models will do as much of the work on your device as possible, as Apple already does with much of your data. And when Apple does need to send your data to a server, it will do so through a new system called Private Cloud Compute, which Apple says processes your request without storing your data or making it accessible to anyone, including Apple. As more advanced features demand more computing power, it remains to be seen whether this approach can keep up with the competition.
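To make that split concrete, here’s a toy sketch, in no way Apple’s actual code, of the routing Apple describes: small jobs run on the device’s own model, and only heavier requests go out to Private Cloud Compute. The types and the complexity threshold are invented for illustration:

```swift
import Foundation

// Illustrative only: none of these types are Apple APIs.
enum AIRoute {
    case onDevice            // handled by the local model
    case privateCloudCompute // sent to Apple's attested servers
}

struct AIRequest {
    let prompt: String
    let estimatedComplexity: Int // hypothetical stand-in for a real cost model
}

// Summarizing a notification stays local; bigger jobs fall back to the cloud.
func route(_ request: AIRequest, onDeviceLimit: Int = 10) -> AIRoute {
    request.estimatedComplexity <= onDeviceLimit ? .onDevice : .privateCloudCompute
}
```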
Then there’s the matter of cost. It was free for me to test Apple Intelligence through the beta version of iOS 18, but it’s not clear that all of the features will be free for everyone. First of all, you’ll need a device that supports Apple Intelligence, which, again, means one of the latest models; for most people, that will mean buying a new iPhone. And for the most advanced features, Apple will reportedly charge a monthly fee at some point in the future. So while a super-smart Siri sounds useful, your life probably won’t be totally transformed unless you’re willing to pay for the privilege.
So here I am, having almost stumbled into Apple Intelligence. I upgraded to an iPhone 15 Pro ahead of the Apple Intelligence announcement, and if I hadn’t, I would not buy a new phone just to get the new AI technology. But since Apple Intelligence works on my phone, I’ve been enjoying the extent to which it makes a lot of things I do a little easier. When I think to use it, Apple Intelligence saves me a few moments of work that I would have otherwise spent reading every notification or editing a photo. At this point, I have to check the AI’s work, but that doesn’t bother me.
It’s the imminent future in which the AI’s work is indistinguishable from our own that I dread. For now, I’m happy with the simplicity of Apple’s diet AI, and I don’t mind that it screws up sometimes. So do I.