Apple Intelligence: How Apple used Google’s help to train its AI models

On stage Monday, Apple CEO Tim Cook announced a splashy deal with OpenAI to build its powerful artificial intelligence model into the company’s voice assistant, Siri.

But in the fine print of a technical document Apple published after the event, the company makes clear that Alphabet’s Google has emerged as another winner in the Cupertino, California-based company’s quest to catch up in AI.

To build Apple’s core AI models, the company’s engineers used its own framework software with a range of hardware, including on-premise graphics processing units (GPUs) and chips available only on Google’s cloud, called tensor processing units (TPUs).

Google has been building TPUs for about a decade and has publicly discussed two flavors of its fifth-generation chips that can be used for AI training; the performance version of the fifth generation offers performance competitive with Nvidia’s H100 chips, Google has said.

Google announced at its annual developer conference that a sixth generation will be released this year.

The processors are specifically designed to run AI applications and train models, and Google has built a cloud computing hardware and software platform around them.

Apple and Google did not immediately return requests for comment.

Apple did not elaborate on how dependent it was on Google’s chips and software compared with hardware from Nvidia or other AI vendors.

But using Google’s chips typically requires a customer to buy access to them through its cloud division, just as customers buy computing time from Amazon.com’s AWS or Microsoft’s Azure.

First print: June 12, 2024 | 9:34 am IST