
What is semantic rendering, and how does it improve your iPhone 11’s camera?

Image: iPhone 11 Pro Max rear triple camera (Julian Chokkattu / Digital Trends)

The biggest improvements to Apple’s new iPhones are in the cameras, and not just because of the new ultra-wide-angle lenses on the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max. The software powering the cameras is responsible for a significant leap forward in image quality thanks to improvements in computational photography techniques. One of the most interesting is semantic rendering, an intelligent approach to automatically adjusting highlights, shadows, and sharpness in specific areas of a photo.

What is semantic rendering?

In artificial intelligence, “semantics” refers to a machine’s ability to smartly segment information similar to how a human would. Different branches of machine learning may have different uses for semantic segmentation, but for photography, it starts with subject recognition.

In Apple’s case, the camera is specifically looking for any people within the frame, but it goes a level deeper than that. When the iPhone detects a human subject, Apple told Digital Trends, it further differentiates between skin, hair, and even eyebrows. It can then render these segments differently to achieve the best results, creating a portrait that is properly exposed against the background.
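Apple doesn’t expose the camera’s internal segmentation, but its public Vision framework offers a taste of the same idea. Here’s a minimal sketch, assuming iOS 15 or later, that asks Vision for a person-segmentation mask; the finer skin, hair, and eyebrow separation described above happens deeper in Apple’s pipeline and isn’t part of this API.

```swift
import Vision

// Sketch only: request a person-segmentation mask from the public Vision
// framework (iOS 15+ / macOS 12+). Apple's camera pipeline goes further,
// separating skin, hair, and eyebrows, but that isn't exposed here.
func personMask(for image: CGImage) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .accurate   // favor mask quality over speed
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // The result is a single-channel confidence mask: brighter pixels
    // are more likely to belong to a person.
    return request.results?.first?.pixelBuffer
}
```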


To understand why this is so important, it helps to also understand how a standard camera works. Whether an older iPhone or a professional DSLR, a camera usually doesn’t know what it’s shooting. It knows the color and brightness of any given pixel, but it can’t glean any meaning about what’s actually in the frame. When you select the “portrait” color profile on a Nikon or Canon, for example, the camera is merely applying settings to specific color ranges of pixels commonly found in human subjects; it doesn’t really know if a person is present or not.

Such an effect is called a global adjustment, meaning it is applied to the entire photo equally. This is also how standard high dynamic range, or HDR, photos work: Highlights are lowered, shadows are raised, and midrange contrast might be enhanced — but without regard to what’s in the picture. This approach works well for subjects like landscapes, but it doesn’t always work for portraits.
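To make the distinction concrete, here’s a minimal sketch of a global adjustment, with hypothetical values chosen for illustration: one tone curve, applied identically to every pixel, with no idea what any pixel depicts.

```swift
// Hypothetical global tone curve: the same math lands on every pixel,
// whether it belongs to a face, the sky, or a brick wall.
func globalToneMap(_ v: Float) -> Float {
    let lifted = (v + v.squareRoot()) / 2   // lift shadows and midtones
    return lifted < 0.75
        ? lifted
        : 0.75 + (lifted - 0.75) * 0.6      // compress highlights
}

// A "photo" here is just luminance values in 0...1.
let frame: [Float] = [0.05, 0.40, 0.95]
let mapped = frame.map(globalToneMap)       // identical treatment everywhere
```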

Image: iPhone 11 portrait mode (Julian Chokkattu / Digital Trends)

With semantic rendering, an iPhone 11 can apply local, rather than global, adjustments. This means a bright sky can have its brightness reduced to maintain color and detail, while the highlights on a person’s face won’t be reduced as much, preserving depth in the subject. Sharpness can also be applied to the skin and hair in different strengths.
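A local, mask-driven version of the same idea might look like the following sketch. The mask and blend weights are hypothetical, not Apple’s actual parameters: where the person mask is high, highlights are only gently reduced; elsewhere, the reduction is applied at full strength.

```swift
// Hypothetical local adjustment: a per-pixel "person" mask (1.0 = subject,
// 0.0 = background) controls how strongly the highlight cut applies.
func localToneMap(luminance: [Float], personMask: [Float]) -> [Float] {
    zip(luminance, personMask).map { v, m in
        let strong = v * 0.70      // aggressive pull-down for a bright sky
        let gentle = v * 0.95      // mild cut that preserves facial depth
        return m * gentle + (1 - m) * strong   // blend by segment
    }
}

let sky: Float = 0.95, cheek: Float = 0.90
let out = localToneMap(luminance: [sky, cheek], personMask: [0.0, 1.0])
// sky -> 0.665 (heavily darkened), cheek -> 0.855 (depth preserved)
```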

Photographers have been doing this kind of retouching by hand in programs like Adobe Photoshop for years, but the enhancements are applied instantly on an iPhone 11.

How do you use it? Just take a picture, and the phone will do the work in fractions of a second. Note that semantic rendering only affects photos of people; other types of photos receive the standard HDR treatment. It is not limited to portrait mode, either — any photo with a human subject is automatically a candidate for semantic rendering.

Computational photography — which incorporates everything from HDR to depth-sensing portrait modes — enables phone cameras to surpass the physical limitations of their small lenses and sensors. Apple’s semantic rendering is part of the next evolution of these technologies; Google has been using similar machine learning to power the camera in its Pixel smartphones.

While the tech powering it is complex, its goal is simple. By knowing when it’s looking at a person, the iPhone sees the world a little more like we do, producing pictures that look more natural and true to life.
