Apple to Bring On-Device AI Features With iOS 18: What This Means

Highlights
  • Apple is said to prefer an on-device LLM for privacy and speed reasons.
  • The iOS 18 AI features are expected to be detailed at WWDC 2024.

Apple is reportedly set to introduce on-device AI features with iOS 18, according to information shared by Bloomberg’s Mark Gurman in the latest edition of his weekly newsletter, Power On. The details are expected to be confirmed at Apple’s upcoming WWDC event, which will likely take place in June. Here is what we know so far.

On-device LLM for iOS 18 AI Features

On-device LLMs run entirely locally, relying on the device’s chipset for processing power. Many modern flagship processors include dedicated NPUs (neural processing units), which lets them perform AI tasks quickly and efficiently.
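To give a rough sense of what running a model on-device looks like in practice, here is a minimal Swift sketch using Apple’s existing Core ML framework. The model file name is hypothetical, and the snippet only illustrates local inference with the Neural Engine preferred; it is not how Apple’s iOS 18 features are built.

```swift
import CoreML

// A minimal sketch: ask Core ML to prefer the CPU and Neural Engine so a
// bundled model runs entirely on the device, without touching the network.
// "MyLanguageModel.mlmodelc" is a hypothetical compiled model shipped with the app.
let config = MLModelConfiguration()
config.computeUnits = .cpuAndNeuralEngine  // route work to the NPU where available

do {
    let modelURL = Bundle.main.url(forResource: "MyLanguageModel",
                                   withExtension: "mlmodelc")!
    let model = try MLModel(contentsOf: modelURL, configuration: config)
    // Inference happens locally; inputs and outputs never leave the device.
    print(model.modelDescription)
} catch {
    print("Failed to load model: \(error)")
}
```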

Since on-device LLMs do not communicate with cloud servers and data never leaves the device, they are considered more private and secure for users. They can also respond faster, as there is no need to send data to the cloud and wait for a response. Another advantage is that they work with little or no internet connectivity.

This is likely why Apple is interested in offering on-device LLM-powered AI features with iOS 18. The company has recently published research papers on its on-device LLM work. One paper describes Reference Resolution As Language Modeling (ReALM), a conversational AI system expected to improve Siri’s conversational abilities by remembering context, detecting background activities, and processing on-screen content.
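To make the idea behind ReALM a little more concrete, here is a heavily simplified, hypothetical Swift sketch: on-screen items are flattened into plain text so a language model can resolve a reference like “call that number”. The types, fields, and prompt format are invented for illustration and are not Apple’s implementation.

```swift
// Hypothetical illustration of reference resolution as language modeling:
// on-screen entities are serialized into text, and the model is asked which
// entity the user's request refers to.
struct ScreenEntity {
    let id: Int
    let kind: String   // e.g. "phone_number", "address", "business_name"
    let text: String
}

func buildPrompt(userRequest: String, entities: [ScreenEntity]) -> String {
    let entityList = entities
        .map { "\($0.id). [\($0.kind)] \($0.text)" }
        .joined(separator: "\n")
    return """
    On-screen entities:
    \(entityList)

    User request: \(userRequest)
    Which entity does the user mean? Answer with its number.
    """
}

// Example: the model would be expected to map "Call that number" to entity 2.
let prompt = buildPrompt(
    userRequest: "Call that number",
    entities: [
        ScreenEntity(id: 1, kind: "business_name", text: "Joe's Pizza"),
        ScreenEntity(id: 2, kind: "phone_number", text: "+1 555 010 0199")
    ]
)
print(prompt)
```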

Apple is also rumoured to be developing its own large language model, internally known as Ajax, which will likely carry a different name once it launches publicly. The company is expected to offer deeper AI integration with apps like Health, Messages, Numbers, Pages, and Shortcuts, making these features more useful in everyday usage.

Apple iPhone Could Get Google Gemini Too

While an on-device LLM has its benefits, it also has limitations. The biggest is that it cannot match cloud-based LLMs (such as ChatGPT and Gemini) in capability: it struggles to provide the latest information and is limited in the amount and type of data it can process.

There is a chance Apple will collaborate with third parties for cloud-based AI models. One such partnership could be with Google, as Apple is reportedly interested in Gemini. If everything goes as planned, future iPhones could ship with Gemini integration, similar to many Android smartphones.