
What is Deep Fusion? How it works, and what photos look like without it

Apple’s Deep Fusion camera feature generated plenty of buzz ahead of its official release in iOS 13.2. Now that it’s live, we’re taking a deeper look at what’s being fused, and how.

It’s not that con-Fusing

Much like Apple’s Smart HDR, Deep Fusion relies on object and scene recognition, as well as a series of eight images captured before you click the shutter button.


Of the eight images, four are taken with standard exposure and four with short exposure. A ninth picture is then taken with a long exposure when the shutter button is triggered. The short-exposure shots are meant to freeze motion and preserve high-frequency details like blades of grass or stubble on a person’s face. The sharpest image of this series is then chosen to move on to the next step.
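To make that selection step concrete, here’s a minimal Swift sketch of how a sharpest-frame picker might work. The `Frame` type, the Laplacian-variance `sharpness()` score, and `selectSharpest()` are all illustrative assumptions on our part; Apple hasn’t published the metric it actually uses.

```swift
// Hypothetical sketch of Deep Fusion's frame-selection step.
// Frame, sharpness(), and selectSharpest() are illustrative
// assumptions -- Apple's real pipeline is proprietary.

struct Frame {
    let pixels: [Float]   // grayscale intensities, row-major
    let width: Int
    let height: Int
}

// A crude sharpness score: variance of a Laplacian-style
// high-pass response. Higher means more high-frequency detail.
func sharpness(_ f: Frame) -> Float {
    var responses: [Float] = []
    for y in 1..<(f.height - 1) {
        for x in 1..<(f.width - 1) {
            let i = y * f.width + x
            let lap = 4 * f.pixels[i]
                - f.pixels[i - 1] - f.pixels[i + 1]
                - f.pixels[i - f.width] - f.pixels[i + f.width]
            responses.append(lap)
        }
    }
    let mean = responses.reduce(0, +) / Float(responses.count)
    return responses.map { ($0 - mean) * ($0 - mean) }
        .reduce(0, +) / Float(responses.count)
}

// Pick the sharpest of the four short-exposure frames.
func selectSharpest(shortExposures: [Frame]) -> Frame {
    precondition(!shortExposures.isEmpty)
    return shortExposures.max { sharpness($0) < sharpness($1) }!
}
```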

Three of the standard-exposure shots with the best color, tones, and other low-frequency data are then fused with the long-exposure frame to compose a single reference image. This image and the sharpest short-exposure frame are then sent through neural networks, which choose, pixel by pixel, whichever of the two sources best represents the final photo. This granular analysis minimizes noise, sharpens detail, and renders color more accurately.
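Here’s a hand-wavy Swift illustration of that per-pixel choice, using a simple weighted blend in place of Apple’s neural network. The `fuse()` function and its `detailWeight` map are hypothetical stand-ins for the real, proprietary model.

```swift
// Illustrative sketch of the pixel-by-pixel merge. In place of
// Apple's neural network, a per-pixel weight favors whichever
// source shows more local detail. All names here are hypothetical.

// Blend two same-sized grayscale buffers pixel by pixel.
// detailWeight[i] in 0...1 says how strongly to trust the sharp
// short-exposure frame at pixel i (1 = pure detail frame).
func fuse(reference: [Float],
          detailFrame: [Float],
          detailWeight: [Float]) -> [Float] {
    precondition(reference.count == detailFrame.count &&
                 reference.count == detailWeight.count)
    return (0..<reference.count).map { i in
        let w = detailWeight[i]
        // Low-frequency color and tone from the fused reference,
        // high-frequency texture from the short exposure.
        return (1 - w) * reference[i] + w * detailFrame[i]
    }
}
```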

[Image: How Deep Fusion works. Genevieve Poblano/Digital Trends]

All of the post-shutter processing is done behind the scenes, so it won’t impact your photo capture time. In other words, you can still snap back-to-back photos just as quickly as you ever could on the iPhone, and if they’re all using Deep Fusion, they’ll simply be queued up in the camera roll to be processed in order. Apple says you could go into your camera roll and potentially see an image still processing for a half-second, but I’ve yet to encounter this. Deep Fusion won’t work on burst mode, however, and it will only be available on the iPhone 11 and iPhone 11 Pro models.

It just works

Apple’s iconic mantra is the guiding principle for Deep Fusion’s nonintrusiveness. There’s no toggle to flip it on; Deep Fusion enables itself automatically under lighting conditions that vary by the lens you’re shooting with.

The ultrawide-angle lens, for instance, cannot take advantage of Deep Fusion at all. On the main lens, Deep Fusion kicks in for what Apple describes as “indoor lighting” or anything below twilight in outdoor settings — that is, if the iPhone doesn’t explicitly offer night mode. The telephoto lens uses Deep Fusion for anything that isn’t very dark or exceedingly bright, but keep in mind that darker situations usually disable the telephoto camera and kick over the responsibilities to the main sensor, which will then determine what to do — be it Deep Fusion, night mode, or Smart HDR.
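That per-lens behavior boils down to a decision table. The Swift sketch below reconstructs it from the description above; the `Lens` and `CaptureMode` names and the lux thresholds are invented placeholders, not Apple’s actual values.

```swift
// A speculative decision table for when Deep Fusion engages,
// reconstructed from observed behavior. The lux thresholds are
// invented placeholders, not Apple's numbers.

enum Lens { case ultraWide, wide, telephoto }
enum CaptureMode { case smartHDR, deepFusion, nightMode }

func captureMode(for lens: Lens, sceneLux: Double) -> CaptureMode {
    switch lens {
    case .ultraWide:
        // The ultrawide never uses Deep Fusion or night mode.
        return .smartHDR
    case .wide:
        if sceneLux < 10 { return .nightMode }    // very dark scenes
        if sceneLux < 600 { return .deepFusion }  // indoor / below twilight
        return .smartHDR                          // bright scenes
    case .telephoto:
        // Very dark scenes hand off to the main (wide) camera,
        // which then decides for itself; model that as a fallback.
        if sceneLux < 10 { return captureMode(for: .wide, sceneLux: sceneLux) }
        if sceneLux < 2000 { return .deepFusion } // most mid-range light
        return .smartHDR                          // exceedingly bright
    }
}
```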

The results

So far, Deep Fusion’s impact has been mostly subtle in our testing. We tested an iPhone 11 Pro Max running iOS 13.2 against an iPhone 11 Pro running iOS 13.1, which is not equipped with Deep Fusion, and at first it’s hard to see Deep Fusion’s influence. Zooming in on a picture did reveal areas where finer details were more defined and less smoothed-over, though this isn’t something that will jump out at you, especially on a phone screen. That said, Deep Fusion never produced a worse image than we got without it.

We did have a couple of instances where, if you zoomed in a little, you could appreciate the difference Deep Fusion makes, particularly in an image’s finer details. So, while it may not be as magical as night mode, it’s still better to have than not.

Corey Gaskin
Former Associate Editor, Mobile