“It’s still very early days here,” says Apple’s Craig Federighi about Apple Intelligence’s generative AI journey

Apple gets to have its generative AI cake and eat it too. Unveiled today at WWDC 2024, Apple Intelligence is largely built on local generative models of different sizes, but even the power of the A17 Pro chip isn’t always enough to answer your more substantive questions.

Sometimes Apple will have to go to the cloud. Not just any cloud, mind you, but its own Private Cloud Compute, where your data is protected in a way that Apple says you may not find on other cloud-based generative AI systems.

In a deep-dive session after the WWDC 2024 keynote, Apple Senior Vice President of Software Engineering Craig Federighi and Apple Senior Vice President of Machine Learning and AI Strategy John Giannandrea explained how the new Siri, backed by Apple Intelligence, will decide when to handle your requests on the device, when to contact Apple’s Private Cloud Compute, and what Apple Intelligence will share with that cloud.

“It’s still very early days here,” Federighi said as he explained the AI journey, the challenges Apple faced, how it solved them, and the road ahead.

What Apple is doing here is no small thing, and you could say Apple dug this hole for itself. Apple Intelligence is essentially a collection of generative AI models of varying sizes that look deep into your iPhone to get to know you. Because they know you, they can help you in ways that other LLMs and generative AI systems probably can’t. It’s like how a partner or parent can comfort you because they know everything about you, while a stranger can only guess at what might help and is just as likely to get it wrong. Knowing you and all the data on your phone is Apple Intelligence’s superpower and its potential weakness, especially when it comes to privacy.

Federighi explained that Apple has come up with a two-part solution to mitigate this problem and prevent disaster.


First, the on-device intelligence decides which pieces of your data are crucial for producing the right answer. It then sends only that data, encrypted and anonymized, to Private Cloud Compute.

The second part of the solution is how the cloud itself is built and how it handles that data. It runs on efficient Apple silicon and has no persistent storage. Security researchers have access to the server software, but not to your data, so they can conduct privacy audits. Your iPhone will not send these bits of data to any server whose software has not been publicly verified. Federighi compared it to the keys and tokens found on cryptocurrency servers.

“No one, not Apple or anyone else, would have access to your data,” Federighi added.
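
To make that two-part flow concrete, here is a minimal Swift sketch, offered purely as an illustration: the transparency log, the request type, and the encryption step below are hypothetical stand-ins and do not correspond to Apple’s actual Private Cloud Compute interfaces.

```swift
import Foundation
import CryptoKit

// Hypothetical illustration only — these types and names are not Apple's APIs.
// It mirrors the two-part idea above: send only the relevant, encrypted slice of
// on-device data, and only to a server whose software image is publicly verified.

enum CloudError: Error {
    case unverifiedServer
}

struct CloudRequest {
    let payload: Data          // encrypted, anonymized slice of on-device context
    let serverImageHash: String
}

// Stand-in for a published log of audited server software images.
let verifiedImageHashes: Set<String> = ["a1b2c3", "d4e5f6"]

func prepareRequest(relevantContext: Data,
                    serverImageHash: String,
                    key: SymmetricKey) throws -> CloudRequest {
    // Refuse to talk to any server build that researchers haven't been able to inspect.
    guard verifiedImageHashes.contains(serverImageHash) else {
        throw CloudError.unverifiedServer
    }
    // Encrypt just the data the model needs; nothing else leaves the device.
    let sealedBox = try AES.GCM.seal(relevantContext, using: key)
    return CloudRequest(payload: sealedBox.combined!, serverImageHash: serverImageHash)
}

// Example: only the calendar snippet relevant to the request is packaged and sent;
// the server, as described above, keeps no persistent copy of it.
let key = SymmetricKey(size: .bits256)
let context = Data("Dinner with Mom, Thursday 7pm".utf8)
let request = try prepareRequest(relevantContext: context,
                                 serverImageHash: "a1b2c3",
                                 key: key)
```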

To be clear, your on-device data is at the heart of what Apple does with Apple Intelligence and the new Siri. It’s a “rich understanding of what’s on your device,” and that knowledge base is one “that will only get richer over time,” Giannandrea said.

We also got some insight into how Siri’s semantic index, which can draw on data from across the phone, including metadata in photos and videos, gets a boost when combined with the Apple Intelligence models. All of this contributes to a better understanding of what you’re talking about, Federighi said.

Apple has been working on the semantic index for years, he added. “So it’s really a story of us working for many, many years to create a really powerful capability on devices.”

The pair also clarified whose models you’ll be using and when. It turns out that unless you specifically ask for a third-party model, ChatGPT for example, the models are all Apple’s own.

“It’s important to reemphasize that Apple Intelligence and the experiences we’re talking about are built on top of the models built by Apple,” Federighi said.

As everyone does, Apple trained these models on data. Some of it comes from the public web (gathered as part of Apple’s ongoing web-search work), although Giannandrea said publishers can opt out of having their data included. Apple has also licensed news archive data and even applied some in-house data to its diffusion model.

The duo also confirmed that Apple Intelligence only works on iPhones with the A17 Pro chip. By way of explanation, Giannandrea said, “The basic fundamental models require an enormous amount of computing.” Federighi added that the A17 Pro’s latest neural engine is “twice as powerful as the previous generation” and has an advanced architecture to support Apple’s AI. All this is probably cold comfort for iPhone 15 (A16 Bionic) and iPhone 14 (Pro and standard) owners.

As for how Apple Intelligence will work with third-party models, Federighi pointed out that some of them offer expertise you won’t find in Apple’s own models, such as answering the question, “What can I make with these ingredients?” Then Federighi added something that could inadvertently cast OpenAI’s platform in an unflattering light: “Even hallucinations are useful; you end up with a bizarre meal.”
