What is semantic rendering, and how does it improve your iPhone 11’s camera?

iPhone 11 Pro Max rear triple camera
Julian Chokkattu/Digital Trends

The biggest improvements to Apple’s new iPhones are in the cameras, and not just because of the new ultra-wide-angle lenses on the iPhone 11, iPhone 11 Pro, and iPhone 11 Pro Max. The software powering the cameras is responsible for a significant leap forward in image quality thanks to improvements in computational photography techniques. One of the most interesting is semantic rendering, an intelligent approach to automatically adjusting highlights, shadows, and sharpness in specific areas of a photo.

What is semantic rendering?

In artificial intelligence, “semantics” refers to a machine’s ability to smartly segment information similar to how a human would. Different branches of machine learning may have different uses for semantic segmentation, but for photography, it starts with subject recognition.

In Apple’s case, the camera is specifically looking for any people within the frame, but it goes a level deeper than that. When the iPhone detects a human subject, Apple told Digital Trends that it further differentiates between skin, hair, and even eyebrows. It can then render these segments differently to achieve the best results, creating a portrait that is properly exposed against the background.
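
Apple hasn’t published the internals of this pipeline, but its public Vision framework (added to iOS after the iPhone 11 shipped) exposes a similar building block. The Swift sketch below is only an illustration of the general idea of generating a per-pixel person mask; the finer skin, hair, and eyebrow separation Apple describes is not part of this public API, and the function name here is our own.

```swift
import Vision
import CoreVideo
import CoreGraphics

// A minimal sketch of person segmentation using Apple's public Vision framework.
// This is not Apple's internal camera pipeline (which also separates skin, hair,
// and eyebrows); it only illustrates the core idea: producing a per-pixel mask
// that says where the subject is, so later adjustments can treat those pixels
// differently. Requires iOS 15 / macOS 12 or newer.
func personMask(for image: CGImage) throws -> CVPixelBuffer? {
    let request = VNGeneratePersonSegmentationRequest()
    request.qualityLevel = .balanced                           // .fast, .balanced, or .accurate
    request.outputPixelFormat = kCVPixelFormatType_OneComponent8

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    // The result is a grayscale buffer: high values where Vision believes a
    // person is, low values elsewhere, with soft edges around hair.
    return request.results?.first?.pixelBuffer
}
```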


To understand why this is so important, it helps to also understand how a standard camera works. Whether an older iPhone or a professional DSLR, a camera usually doesn’t know what it’s shooting. It knows the color and brightness of any given pixel, but it can’t glean any meaning about what’s actually in the frame. When you select the “portrait” color profile on a Nikon or Canon, for example, the camera is merely applying settings to specific color ranges of pixels commonly found in human subjects; it doesn’t really know if a person is present or not.

Such an effect is called a global adjustment, meaning it is applied to the entire photo equally. This is also how standard high dynamic range, or HDR, photos work: Highlights are lowered, shadows are raised, and midrange contrast might be enhanced — but without regard to what’s in the picture. This approach works well for subjects like landscapes, but it doesn’t always work for portraits.
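
To make “global adjustment” concrete, here is a toy Swift sketch — not Apple’s code, and with arbitrary made-up coefficients — that applies the same highlight-compression and shadow-lift curve to every pixel, with no idea of what any pixel actually depicts.

```swift
// A toy illustration of a global tone adjustment: one curve, applied to every
// pixel equally, regardless of content. Values are assumed to be normalized
// luminance in 0...1; the coefficients are arbitrary and purely illustrative.
func globalToneMap(_ luma: [Float]) -> [Float] {
    luma.map { v -> Float in
        let highlightsCompressed = v - 0.3 * max(v - 0.7, 0)                                 // pull down bright values
        let shadowsLifted = highlightsCompressed + 0.2 * max(0.3 - highlightsCompressed, 0)  // push up dark values
        return min(max(shadowsLifted, 0), 1)
    }
}
```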

iPhone 11 portrait mode Julian Chokkattu/Digital Trends

With semantic rendering, an iPhone 11 can apply local, rather than global, adjustments. This means a bright sky can have its brightness reduced to maintain color and detail, while the highlights on a person’s face won’t be reduced as much, preserving depth in the subject. Sharpness can also be applied to the skin and hair in different strengths.
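
Here is the same toy example turned into a local, mask-weighted adjustment: the background rendering compresses highlights aggressively (think bright sky), the subject rendering barely touches them, and the segmentation mask decides how the two are blended at each pixel. Again, the function and numbers are our own illustration, not Apple’s pipeline.

```swift
// A toy illustration of a local (mask-weighted) adjustment. Each pixel gets a
// blend of two renderings: a heavily tone-mapped "background" version and a
// gently tone-mapped "subject" version, weighted by the person mask (0...1).
// Arrays are assumed to be the same length; coefficients are arbitrary.
func localToneMap(luma: [Float], personMask: [Float]) -> [Float] {
    zip(luma, personMask).map { pair -> Float in
        let (v, mask) = pair
        let background = v - 0.5 * max(v - 0.6, 0)     // strong highlight compression (e.g. a bright sky)
        let subject    = v - 0.15 * max(v - 0.8, 0)    // gentle compression preserves depth in the face
        let blended = mask * subject + (1 - mask) * background
        return min(max(blended, 0), 1)
    }
}
```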

Photographers have been doing this kind of retouching by hand in programs like Adobe Photoshop for years, but the enhancements are applied instantly on an iPhone 11.

How do you use it? Just take a picture, and the phone will do the work in fractions of a second. Know that semantic rendering only affects human portraits; other types of photos receive the standard HDR treatment. It is not limited to portrait mode — any photo with a human subject is automatically a candidate for semantic rendering.

Computational photography — which incorporates everything from HDR to depth-sensing portrait modes — enables phone cameras to surpass the physical limitations of their small lenses and sensors. Apple’s semantic rendering is part of the next evolution of these technologies; Google has been using similar machine learning to power the camera in its Pixel smartphones.

While the tech powering it is complex, its goal is simple. By knowing when it’s looking at a person, the iPhone sees the world a little more like we do, leading to pictures that look more natural and true to life.
