

Here are Apple’s secret plans for adding AI to your iPhone

Apple iPhone 15 Pro Max (left) and Apple iPhone 15 Plus. Andy Boxall / Digital Trends

After the AI gala that was Samsung’s Galaxy S24 launch, Apple could be the next to tap into the deep learning and large language models (LLMs) that power tools such as ChatGPT and Google Bard. According to an industry analysis report from the Financial Times, Apple has been on a streak of acquisitions, team reorganizations, and fresh hiring to build AI capabilities for the iPhone.

At the center of this AI push could be Siri, the on-device virtual assistant that has lately lost a lot of competitive ground to the notably smarter Google Assistant. It looks like Apple will follow in Google’s footsteps by supercharging its own digital assistant.


Google has already baked the generative AI smarts of Bard into Google Assistant, and the revamped experience will soon roll out to both Android and iOS devices. So, what exactly is Bard going to change about Google Assistant?


Google Assistant predicting Siri’s path?

Hey Siri
Nadeem Sarwar / Digital Trends

Thanks to the multimodal capabilities of the underlying PaLM 2 large language model, Assistant with Bard will soon accept text, audio, and image-based inputs. Think of it as the multisearch facility that comes to life courtesy of Google Lens, which is also now part of the Circle to Search feature coming to the Pixel 8 and Galaxy S24 series phones.

Assistant with Bard will also be integrated into popular Google services such as Gmail and Docs. The upgraded assistant will be aware of the on-screen content at all times and will handle contextually aware tasks, such as writing a fitting social media post based on the photo currently on the screen.

The Financial Times report also mentions that Siri will soon be powered by an LLM developed in-house by Apple rather than a licensed product like Meta’s Llama, OpenAI’s GPT, or Google’s PaLM. Notably, and rather quietly, Apple already released a large language model called Ferret late last year in partnership with experts at Columbia University.

On-device is the flavor of this AI season

Apple iPhone 15 Plus. Andy Boxall / Digital Trends

Another focus for Apple seems to be running LLM-based tasks on-device, similar to how the Pixel 8 Pro and Galaxy S24 run Google’s Gemini Nano model. The benefit is that AI operations no longer need an internet connection to reach the cloud, which dramatically speeds them up and adds a layer of privacy, since user data never leaves the device.
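To make that concrete, here is a minimal sketch of what fully local LLM inference looks like in practice, written in Python against the open-source llama-cpp-python library running on an ordinary computer. The model file name is a placeholder, and this is only an illustration of the local, no-network pattern described in the report, not Apple’s actual implementation.

```python
# Minimal sketch of fully local LLM inference: no network call is made,
# so the prompt and the generated text never leave the machine.
# Assumes llama-cpp-python is installed and a quantized GGUF model file
# (the file name below is a placeholder) has already been downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="small-model-q4.gguf",  # placeholder local model file
    n_ctx=2048,                        # context window size
    n_gpu_layers=-1,                   # offload all layers to local GPU if available
    verbose=False,
)

result = llm(
    "Summarize this text message in one sentence: "
    "'Running 20 minutes late, start without me.'",
    max_tokens=48,
    temperature=0.2,
)
print(result["choices"][0]["text"].strip())
```

Because every step happens on the local processor, nothing in the prompt or the response ever touches a server, which is exactly the latency and privacy argument for on-device models.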

A Bloomberg report last year talked about Apple working on something called “Apple GPT” based on the company’s own language model, though it was limited to internal testing. Apple’s AI efforts could finally bear fruit in 2024. Another report from the same outlet notes that an AI-powered version of Siri could arrive this year, likely alongside iOS 18.

Aside from making Siri smarter and more responsive, Apple also aims to bring generative AI chops to more apps, such as Messages. Samsung and Google have already given us a glimpse of how this can be implemented, thanks to snazzy features such as Magic Compose, style suggestions, real-time offline language translation for chats, and more.

Siri in action on an iPhone.
Digital Trends

So far, Apple hasn’t revealed when and how exactly it aims to implement AI across its products — especially the iPhone. But if the competition is any indication, it won’t be surprising to see Apple giving us a glimpse at its next WWDC developer conference later this year.

Interestingly, Apple has talked in glowing terms about the AI chops of its latest silicon, including the A17 Pro powering the iPhone 15 Pro duo. Apple just might lay the foundations of on-device AI and a smarter Siri — finally — starting with its current-gen flagship phones.

Nadeem Sarwar
HuggingSnap app serves Apple’s best AI tool, with a convenient twist
HuggingSnap recognizing contents on a table.

Machine learning platform Hugging Face has released an iOS app that will make sense of the world around you as seen by your iPhone’s camera. Just point it at a scene, or snap a picture, and it will deploy an AI model to describe it, identify objects, perform translation, or pull out text-based details.
Named HuggingSnap, the app takes a multimodal approach to understanding the scene around you, and it’s now available for free on the App Store. It is powered by SmolVLM2, an open AI model that can handle text, images, and video as input formats.
The overarching goal of the app is to let people learn about the objects and scenery around them, including plant and animal recognition. The idea is not too different from Visual Intelligence on iPhones, but HuggingSnap has a crucial leg-up over its Apple rival.

It doesn’t require the internet to work
SmolVLM2 running on an iPhone
All it needs is an iPhone running iOS 18 and you’re good to go. The UI of HuggingSnap is not too different from what you get with Visual Intelligence. But there’s a fundamental difference here.
Apple relies on ChatGPT for Visual Intelligence to work. That’s because Siri is currently not capable of acting like a generative AI tool, such as ChatGPT or Google’s Gemini, both of which have their own knowledge bank. Instead, it offloads all such user requests and queries to ChatGPT.
That requires an internet connection, since ChatGPT can’t work in offline mode. HuggingSnap, on the other hand, works just fine offline. Moreover, an on-device approach means no user data ever leaves your phone, which is always a welcome change from a privacy perspective.
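For the curious, here is a rough sketch of what running a SmolVLM-family model locally looks like with the Hugging Face transformers library on a regular computer. The model ID and chat-template details follow Hugging Face’s published conventions for the SmolVLM family and may differ from the exact checkpoint HuggingSnap ships, so treat it as an illustration rather than a peek at the app’s internals.

```python
# Sketch: local image description with a SmolVLM-family model via transformers.
# The model ID and prompt format below are illustrative assumptions.
from PIL import Image
from transformers import AutoProcessor, AutoModelForImageTextToText

model_id = "HuggingFaceTB/SmolVLM2-2.2B-Instruct"  # assumed Hub ID
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForImageTextToText.from_pretrained(model_id)  # CPU, default precision

image = Image.open("table_scene.jpg")  # any local photo
messages = [{
    "role": "user",
    "content": [
        {"type": "image"},
        {"type": "text", "text": "Describe what is on the table."},
    ],
}]

# Build the chat prompt, bundle it with the image, and generate a description.
prompt = processor.apply_chat_template(messages, add_generation_prompt=True)
inputs = processor(text=prompt, images=[image], return_tensors="pt")
generated = model.generate(**inputs, max_new_tokens=128)
print(processor.batch_decode(generated, skip_special_tokens=True)[0])
```

Everything here runs against weights stored on the local disk, which mirrors the offline, data-stays-on-device behavior the app advertises.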

Read more
Apple’s portless iPhone could be more than a concept
The Apple iPhone 16 Pro Max's charging port.

A portless iPhone may no longer be outside the realm of possibility for Apple. The European Union has confirmed that the Silicon Valley giant can create a portless iPhone without running afoul of its USB-C rules.

We reported over the weekend that Apple wanted to make the iPhone 17 Air the first portless phone, but shelved the idea because of regulations in the EU, per a report from Bloomberg. One of those regulations was the Common Charger Directive, an environmental law that forced Apple to switch from the Lightning port to USB-C to reduce the amount of electronic waste from Lightning cables. Now, according to 9to5Mac, European Commission press officer Federica Miccoli said a portless iPhone would also comply with the directive.

Read more
Apple could be forced to make major changes to how your iPhone works
The back of the Apple iPhone 16 Pro Max.

Apple is facing yet another landmark push in Europe that could open up some of the signature features of its ecosystem. The European Commission has today detailed a couple of broad interoperability measures that Apple must follow in order to comply with the Digital Markets Act (DMA) guidelines.
These measures cover a total of nine connectivity features available on iPhones, spanning everything from smartwatches to headphones. The idea is to give developers access to the same set of advanced features, such as immersive notifications on watches and quick pairing for peripherals, that are currently locked to Apple’s own devices.
“The specification decisions are legally binding,” says the regulatory body, adding that interoperability is “key to opening up new possibilities for third parties to develop innovative products and services on Apple's gatekeeper platforms.”

Hello, AirDrop alternatives!

Read more