
Apple Intelligence acts as a personal AI agent across all your apps

[Image: Craig Federighi in front of a screen reading "Apple Intelligence." Credit: Apple]

During last year’s Worldwide Developers Conference (WWDC) keynote address, Apple executives mentioned the phrase “AI” exactly zero times. Oh, what a difference a year makes. At WWDC 2024 on Monday, Senior Vice President of Software Engineering Craig Federighi revealed Apple Intelligence, a new AI system “comprised of highly capable large language and diffusion models specialized for your everyday tasks” that will impact and empower apps across the company’s lineup of devices.

“This is a moment we’ve been working towards for a long time,” Federighi said. “Apple Intelligence is the personal intelligence system that puts powerful generative models right at the core of your iPhone, iPad, and Mac. It draws on your personal context to give you intelligence.” The machine learning system promises to enable your mobile and laptop devices to “understand and create language, as well as images, and take action for you to simplify interactions across your apps.”

For example, Apple Intelligence will allow your iPhone to prioritize specific system notifications to minimize distractions while you focus on a task. New AI writing aides can proofread your writing, rewrite it on command, and summarize text for you. Those tools will be available across a variety of system apps including Mail, Notes, Safari, and Pages — as well as third-party apps — Federighi explained.

What’s more, Apple Intelligence can leverage its computer vision capabilities to create entirely new images from photos already on the camera roll. For example, “when you wish a friend a happy birthday, you can create an image of them surrounded by cake, balloons, and flowers to make it extra festive,” Federighi said. “And the next time you tell Mom that she’s your hero, you can send an image of her in a superhero cape to really land your point.” The user can even choose between a trio of artistic genres in which to display their generated works: sketch, illustration, and animation. 

Apple Intelligence’s biggest feature, however, will be its ability to interact with the various apps across a device on the user’s behalf, leveraging the user’s personal data to streamline everyday actions. For example, users will be able to find photos of specific groups and individuals within their camera roll simply by describing the shot and the people in it. Or, rather than digging through email or messages to find a file a co-worker previously shared, users can simply say, “pull up the files that Joz shared with me last week.”

The system is “grounded in your personal information and context, with the ability to retrieve and analyze the most relevant data from across your apps, as well as to reference the content on your screen,” Federighi said. That grounding is what allows the system to accurately predict whether a rescheduled business meeting might make the user late to their child’s dance recital. As Federighi illustrated, Apple Intelligence will “understand who my daughter is, the play details she sent several days ago, the time and location for my meeting, and predicted traffic between my office and the theater.”

While some users might blanch at the prospect of providing Apple Intelligence with that degree of access to their (and their children’s) personal data, Apple has taken extraordinary steps to ensure that information stays private. Most of Apple Intelligence’s operations happen on-device, powered by the company’s latest generations of A17 and M-family processors, Federighi said. “It’s aware of your personal data, without collecting your personal data,” he added.

Any operations that do need to be performed in the cloud will be done on Apple’s cloud compute data centers running Apple silicon. So rather than using the public clouds of hyperscalers like Google Cloud, Microsoft Azure, or Amazon’s AWS, Apple went out and built its own private data silo to handle just these machine learning compute requests.    

“When you make a request, Apple Intelligence analyzes whether it can be processed on-device,” Federighi explained. “If it needs greater computational capacity, it can draw on Private Cloud Compute and send only the data that’s relevant to your task to be processed on Apple silicon servers.”

“Your data is never stored or made accessible to Apple,” he continued. “It’s used exclusively to fulfill your request and, just like your iPhone, independent experts can inspect the code that runs on these servers to verify this privacy promise.” In fact, he explained, a user’s mobile or laptop device won’t even connect to a server unless its software has been publicly logged for expert inspection.

While other companies in the burgeoning AI space have been scrambling to incorporate machine learning operations into their existing products and release them to the public as quickly as possible (with occasionally disastrous results), Apple has taken a far more measured approach toward developing and distributing its own AI capabilities.

“We continue to feel very bullish about our opportunity in generative AI and we’re making significant investments,” Apple CEO Tim Cook told Reuters in a May interview. He was also quick to point out that Apple has spent $100 billion on AI research and development over the past five years.

Although AI wasn’t mentioned directly in last year’s keynote, the company did roll out a number of machine learning-enhanced features during WWDC 2023. Those include the Lock Screen’s live video, “ducking autocorrect,” the Journal app’s personalized writing prompts, the Health app’s myopia test, and the AirPods’ ability to tune playback settings based on prevailing environmental conditions, among others. Though it’s gone by a different name in the past, this is clearly not Apple’s first rodeo.

Apple Intelligence will be available to try later this summer.

Andrew Tarantola
Andrew has spent more than a decade reporting on emerging technologies ranging from robotics and machine learning to space…