Was I imagining things or did the iPhone 15 Pro’s screen seem to distort every time someone called up the new Siri? No. It’s a clever bit of screen animation that’s part of Apple’s upcoming iOS 18 Apple Intelligence integration, which includes everything from the Notes and Messages apps to Siri and powerful new Photos tools.
Apple unveiled its new set of Apple Intelligence capabilities at WWDC 2024 on Monday in a breakneck keynote that made it difficult to keep up with all the new features, artificial intelligence, platforms and app updates.
Now, though, I’ve seen some of these new features up close and noticed some surprises, interesting choices, and a few limitations that might frustrate consumers. Granted, Apple’s work on Apple Intelligence is just beginning, and what I’ve seen will likely look slightly different when it hits iOS 18, iPadOS 18, and macOS Sequoia. The public betas are not expected until next month.
New Siri does things differently
But back to that new Siri interaction. The update comes with iOS 18, but only iPhones with the A17 Pro chip inside (specifically the iPhone 15 Pro and Pro Max), along with all M-series Macs and iPads, will be able to experience it. Now that I’ve seen it in action up close, that seems like a real shame. As Apple’s Craig Federighi explained, older Apple mobile chips just don’t have enough power.
There’s no change to how you activate Siri. You can call it by name (with or without “Hey”) or press the power/sleep button, a motion that often looks like squeezing the phone, and the new Siri leans into that. One press of the button and the black edge of the screen distorts, curving inward so it feels like you’re really squeezing the whole phone. At the same time, the edge lights up with iridescent colors that don’t pulse randomly; instead, the glow responds to your voice. Those animations won’t appear on iPhones that don’t support Apple Intelligence, though.
Apple Intelligence lets the new Siri hold a conversation with at least some contextual runway. Ask how your favorite team is doing and it can tell you (though in the demo I saw, it didn’t speak the answer aloud). It can, for example, show you the Mets’ current MLB standing (not good). A follow-up question about the next game doesn’t need to name the team, and neither does asking for more about the Mets. If you want to attend a game, you can ask Siri to add it to your calendar, again with no mention of the Mets or their schedule. I’d call this task-based context. It’s still unclear, however, whether Siri can hold a free-flowing conversation in the style of OpenAI’s GPT-4o.
The new Siri can also be discreet when you want it to be, accepting text input (Type to Siri) that you invoke with a new double-tap gesture at the bottom of the screen. I’m not sure if Apple is going overboard with all these glowing boxes and borders, but yes, the Type to Siri box glows, too. It can be accessed virtually anywhere on the iPhone, including inside apps.
Photos
Apple is undeniably catching up in the generative image space, especially in photo editing, where it’s finally taking the powerful lift-a-subject feature introduced two iOS generations ago and expanding it with Apple Intelligence to remove distractions from a photo’s background and then fill in the blanks. Like other Apple Intelligence features, Photo Cleanup is only available on iPhones with the A17 Pro chip. Still, it’s an impressive piece of AI programming.
Cleanup lives in Photos under Edit. Apple chose what looks like an eraser icon to represent the feature; yes, it’s a kind of magic eraser. When you select it, a message appears: “Tap, swipe, or circle what you want to delete.”
In practice, the feature turns out to be smart, user-friendly, and quite powerful. I saw how you could circle an unwanted person in the background of a photo; there’s no need to carefully trace only the distraction and avoid the subject, because Apple Intelligence purposely won’t let you delete subjects. Once a distraction is circled and identified, tap the checkmark and it disappears. When Cleanup vaporized a few people from a pleasant landscape, it was cool and a little unsettling.
To make sure no one confuses straight photography with Apple Intelligence-assisted content, Apple adds a note to the photo’s metadata: “Apple Photos Cleanup.”
Instead of Apple automatically generating Memory movies, Apple Intelligence lets you write a prompt describing, for example, a series of trips with a special person. You can even tell the system to include a certain type of photo, such as landscapes or selfies.
When Apple introduced this feature on stage, I noticed all the cool animations and assumed they were stagecraft. I was wrong: watching Create a Memory Movie is a visual treat in itself, full of glowing photo squares with images fading in and out, and text underneath that actually shows the work Apple Intelligence is doing.
The only bad news is that if you don’t like what Create a Memory Movie came up with, you can’t ask the system to tweak the movie with a follow-up prompt. You have to start over.
Writing
Apple Intelligence and its various on-device generative models are nearly as ubiquitous as the platforms themselves. AI-powered writing-assistance tools can be found in macOS Sequoia apps like Mail and Notes.
I was somewhat surprised by how it works. If you’re typing something in Notes or Mail, you need to select some of the text to enable it. From there, you get a small icon, color-coded to the app, that gives you access to proofreading and rewriting tools.
Apple Intelligence provides pop-ups to show its work and explain its choices. It seems as adept at nudging your writing to be more professional or conversational as it is at putting together short summaries of selected text.
Call a ChatGPT friend
Apple Intelligence works both locally and with Private Cloud Compute. In most cases, Apple won’t say when, or whether, it’s using its cloud, because Apple considers that cloud as private and secure as on-device AI. It only turns to the cloud when a request exceeds what the on-device system can handle.
Things are a little different when Apple Intelligence decides that a third-party model might be better suited to handle your query.
When it’s time to use OpenAI’s ChatGPT to figure out what to cook with the broad beans you just photographed, Apple Intelligence will ask permission to send the prompt to ChatGPT. At least you don’t have to log into ChatGPT and, better yet, you can use the latest GPT-4o model.
The (magical) Image Wand
One of the coolest features I’ve seen is the Image Wand, a tool that works especially well on an iPad with the Apple Pencil.
Apple didn’t call it a “magic wand,” but what it can do is somewhat magical.
In Notes, you can circle any content, for example a rough sketch of a dog and some words describing the dog or what you want it to do (“play with a ball”). Apple Intelligence then sets to work generating a professional-looking image that combines the two, so the scribble and the words become an illustration of a dog playing with a ball.
Even more impressive, you can circle the empty space next to your notes and the system will fill in an image that fits the surrounding information. It’s also pretty fast.
New Genmoji
I was also able to see how the new Genmoji work. There are only so many official emojis you can use in Messages, but Genmoji pushes those boundaries aside and replaces them with your imagination.
Genmoji lives in iMessage: you just tap an icon, enter your prompt, and it creates a cute image for you to share in Messages. I did learn that if you try to send one of these Genmoji to someone who isn’t on the latest iOS, they might see only that someone replied to your message, but they’ll also get the Genmoji delivered as a separate image message.
Of course, this is all just the beginning of what will be possible with Apple Intelligence. It’s expected to permeate iOS, macOS, and iPadOS, making its way into your data, apps, home screens, and more, and it could change your Apple ecosystem experience like never before.