

Apple’s plans for a Siri evolution keep getting pushed into the future

Summoning Siri on an iPhone.
Nadeem Sarwar / Digital Trends

The biggest takeaway from Apple’s splashy WWDC event earlier this year was the next evolution of Siri in the age of AI. Unfortunately, many of those promising upgrades have yet to arrive for the masses, and what’s already available isn’t exactly groundbreaking.

The road ahead doesn’t look entirely gloomy, though, even if salvation still seems far away.


According to Bloomberg, Apple is internally working on LLM Siri, built on an advanced AI stack that allows the assistant to carry on back-and-forth conversations and handle more complex queries.

LLM, short for large language model, is the secret sauce that powers conversational products like OpenAI’s ChatGPT and Google’s Gemini. Apple’s intention with LLM Siri doesn’t stray far from that template, as the company wants it to behave in roughly the same fashion as Gemini.

Pulling up Siri on lock screen of iPhone.
Nadeem Sarwar / Digital Trends

“The revamped Siri will rely on new Apple AI models to interact more like a human,” claims the report, adding that an announcement will happen at some point in 2025, followed by a spring 2026 release.

That timeline hardly counts as catching up with the competition, because you can already experience those perks on iPhones to a large extent. The Siri-ChatGPT integration, which is now live in iOS test builds, can pull that off.

Google recently released the standalone Gemini app for iPhones, which also brings the Gemini Live conversation mode to Apple smartphones. That raises an important question: Why wait for over a year when rival products already offer the convenience?

It is also worth noting that Apple will officially add support for more third-party language models, like ChatGPT, as part of the Apple Intelligence bundle. As per Bloomberg, Google’s Gemini integration is already next in line.

ChatGPT and Siri integration on iPhone.
Nadeem Sarwar / Digital Trends

At this point in time, Apple’s attempts to catch up with the virtual assistant competition are alarmingly sluggish. Google has already offloaded a lot of Google Assistant responsibilities to Gemini, and its integration with tools like Gmail and Docs is already quite rewarding.

OpenAI has also launched ChatGPT Search, making it easier for users to find information on the web in a far more conversational fashion than Google Search. But that’s not all. The next move for the Microsoft-backed company is a web browser. Perplexity has also launched its own search and shopping products.

The most notable upgrade for LLM Siri will reportedly be its ability to interact with apps. “It also will make expanded use of App Intents, which allow for more precise control of third-party apps,” says the Bloomberg report.
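App Intents, the framework Bloomberg refers to, already lets third-party apps expose actions that Siri and Shortcuts can invoke. As a rough illustration of what that looks like today (the app, intent name, and note-saving logic below are hypothetical, not taken from the report), a note-taking app might declare an action like this:

```swift
import AppIntents

// Hypothetical example: a note-taking app exposing a "Create Note"
// action that Siri or Shortcuts can invoke with a text parameter.
struct CreateNoteIntent: AppIntent {
    static var title: LocalizedStringResource = "Create Note"

    @Parameter(title: "Note Text")
    var text: String

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // In a real app, this would hand the text to the app's storage layer.
        return .result(dialog: "Saved a note: \(text)")
    }
}
```

A smarter Siri could, in principle, chain intents like this one across multiple apps to complete a multi-step request, which is what "more precise control of third-party apps" suggests.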

Gemini Live on an iPhone 16 Pro. Nadeem Sarwar / Digital Trends

Having Siri execute tasks across different apps has long been the moonshot from a user perspective. That future finally seems within reach, although it’s still over a year away. But once again, Apple won’t be the lone warrior in this quest.

Android Authority reports that in Android 16 (which already has a Developer Preview out in the wild), Gemini could gain the ability to execute tasks across third-party apps. So far, Gemini’s activity has been limited to Workspace tools like Gmail, Docs, and Calendar, among others.

Apple certainly seems to be making the right strides with its plans for LLM Siri. But by the time all those promises materialize, the competition will have raced far ahead, with a proven track record of solid conversational AI chops.

Nadeem Sarwar
Nadeem is a tech and science journalist who started reading about cool smartphone tech out of curiosity and soon started…
Apple hit with lawsuit over Apple Intelligence delay
Invoking Siri on iPhone.

Apple has been hit with a lawsuit over allegations of false advertising and unfair competition regarding the delayed launch of some of its Apple Intelligence features.

The tech company made much of its AI-infused Apple Intelligence tools when they were first unveiled at its developer event in June 2024. While some of the features have made their way to its various devices since then, the company recently revealed that some of the more advanced AI-powered tools -- including for its Siri virtual assistant -- won't be ready until 2026.

HuggingSnap app serves Apple’s best AI tool, with a convenient twist
HuggingSnap recognizing contents on a table.

Machine learning platform Hugging Face has released an iOS app that will make sense of the world around you as seen by your iPhone’s camera. Just point it at a scene, or snap a picture, and it will deploy an AI model to describe it, identify objects, perform translation, or pull text-based details.
Named HuggingSnap, the app takes a multimodal approach to understanding the scene around you, and it’s now available for free on the App Store. It is powered by SmolVLM2, an open AI model that can handle text, images, and video as input formats.
The overarching goal of the app is to let people learn about the objects and scenery around them, including plant and animal recognition. The idea is not too different from Visual Intelligence on iPhones, but HuggingSnap has a crucial leg-up over its Apple rival.

It doesn’t require internet to work
SmolVLM2 running on an iPhone.
All it needs is an iPhone running iOS 18 and you’re good to go. The UI of HuggingSnap is not too different from what you get with Visual Intelligence. But there’s a fundamental difference here.
Apple relies on ChatGPT for Visual Intelligence to work. That’s because Siri is currently not capable of acting like a generative AI tool, such as ChatGPT or Google’s Gemini, both of which have their own knowledge bank. Instead, it offloads all such user requests and queries to ChatGPT.
That requires an internet connection, since ChatGPT can’t work in offline mode. HuggingSnap, on the other hand, works just fine without one. Moreover, an offline approach means no user data ever leaves your phone, which is always a welcome change from a privacy perspective.

Apple CEO should do a Steve Jobs on Siri delay, analyst says
Invoking Siri on iPhone.

Apple CEO Tim Cook should go public to explain the delay in integrating advanced Siri capabilities across its ecosystem, rather than Apple releasing the news quietly via a tech site last week, according to prominent Apple analyst Ming-Chi Kuo.

The tech giant showcased an AI-powered Siri at its WWDC event in 2024 as part of its Apple Intelligence initiative. While the virtual assistant does now have some AI smarts, the more advanced features -- including personalized responses, task completion across multiple apps, and on-screen awareness -- have been delayed until next year at the earliest.
