Apple may be working on a way to let LLMs run on-device and change your iPhones forever

Apple researchers have apparently discovered a method that allows iPhones to host and run their own large language models (LLMs).

With this technology, future iPhone models could finally get the generative AI features that people have been eagerly waiting for. This information comes from a few papers published on arXiv, a research-sharing platform owned by Cornell University. The documents are quite dense and can be difficult to read, so we're going to break things down for you. But if you want to read them yourself, the papers are free for everyone to view.

One of the biggest problems with putting an LLM on a mobile device is the limited amount of memory on the hardware. As VentureBeat explains in its reporting, recent AI models such as GPT-4 contain “hundreds of billions of parameters,” a quantity that smartphones have difficulty processing. To address this problem, Apple researchers propose two techniques. The first is called windowing, a method in which the on-device AI reuses data it has already processed instead of loading new information, with the goal of taking some of the load off the hardware.
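As a rough illustration only (the class and method names below are our own invention, not anything from Apple's paper), the windowing idea can be sketched as a cache that keeps the data needed by the last few tokens resident in fast memory, reloading from slow storage only what is genuinely new and evicting what the recent window no longer references:

```python
from collections import deque

class WindowedLoader:
    """Hypothetical sketch of 'windowing': keep the data needed by the
    last `window` tokens in memory, and count how often we must fall
    back to slow storage ('flash') for data not already resident."""

    def __init__(self, window):
        self.window = window
        self.recent = deque()   # per-token sets of needed data IDs
        self.resident = set()   # IDs currently held in fast memory
        self.flash_reads = 0    # number of slow loads performed

    def step(self, needed):
        needed = set(needed)
        # Load from flash only the IDs not already in memory.
        new = needed - self.resident
        self.flash_reads += len(new)
        self.resident |= new
        self.recent.append(needed)
        # Evict anything no token in the sliding window still uses.
        if len(self.recent) > self.window:
            self.recent.popleft()
            live = set().union(*self.recent)
            self.resident &= live

loader = WindowedLoader(window=2)
loader.step({1, 2, 3})  # 3 slow loads (cold start)
loader.step({2, 3, 4})  # only 1 slow load: 2 and 3 are reused
loader.step({3, 4, 5})  # only 1 slow load; 1 and 2 are evicted
```

Reusing the overlap between consecutive steps is what reduces the pressure on the hardware: in this toy run, nine requests cost only five slow reads.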

The second is called row-column bundling. This groups data into larger chunks that the AI can read in one go, a method said to increase the LLM's ability to “understand and generate language,” according to MacRumors. The article goes on to say that these two techniques allow an iPhone to run AI models “up to twice the size of the available memory.” It is a technology Apple must master if it wants to deploy advanced models “in resource-constrained environments.” Without it, the researchers' plans cannot get off the ground.
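To make the bundling idea concrete, here is a minimal sketch (our own simplification, not Apple's implementation) of storing two pieces of data that are always needed together back-to-back, so that one larger sequential read replaces two scattered ones:

```python
def bundle(w_up, w_down):
    """Hypothetical sketch of row-column bundling.
    w_up:   list of d_hidden rows, each of length d_model
    w_down: list of d_model rows, each of length d_hidden
    Unit i needs row i of w_up and column i of w_down; storing them
    contiguously lets one chunked read fetch both at once."""
    d_hidden = len(w_up)
    return [w_up[i] + [row[i] for row in w_down] for i in range(d_hidden)]

def read_unit(bundles, i, d_model):
    """One contiguous read, then split back into the two parts."""
    chunk = bundles[i]
    return chunk[:d_model], chunk[d_model:]

# Toy data: 3 hidden units, model width 2.
w_up = [[1, 2], [3, 4], [5, 6]]
w_down = [[7, 8, 9], [10, 11, 12]]
bundles = bundle(w_up, w_down)
row, col = read_unit(bundles, 0, d_model=2)  # row=[1, 2], col=[7, 10]
```

Fewer, larger reads are generally much friendlier to flash storage than many small scattered ones, which is the intuition behind grouping the data into big chunks.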

On-device avatars

The second article is about iPhones that may get the ability to create animated 3D avatars. The content is created using videos captured by the rear cameras through a process called HUGS (Human Gaussian Splats). This technology has existed in some form before. However, Apple's version is said to be able to render the avatars 100 times faster than older generations and also capture the finer details such as clothing and hair.

It is not known exactly what Apple plans to do with HUGS or any of the previously mentioned technologies. However, this research could open the door to a variety of possibilities, including a more powerful version of Siri, “real-time language translation,” new photography features, and chatbots.

A smarter Siri

These upgrades may be closer to reality than some might think.

In October, rumors surfaced that Apple is working on a smarter version of Siri, enhanced by artificial intelligence and some generative capabilities. A possible use case would be integration with the Messages app, allowing users to ask tougher questions or have sentences finished “more effectively.” As for chatbots, there have been other rumors that the tech giant is developing a conversational AI called Ajax. Some people have also floated “Apple GPT” as a potential name.

There is no word yet on when Apple's AI projects will see the light of day. There has been speculation that something could roll out in late 2024 alongside the launch of iOS 18 and iPadOS 18, although the exact timing remains unknown.
