5 features that make awful smartphone cameras a thing of the past

Smartphone cameras no longer produce the low-megapixel blurs of the past, but how exactly did they cross the line from simply being the most convenient cameras to being good enough to shoot magazine covers? The camera-testing wizards at DxOMark have now been testing smartphone cameras for five years, and with that milestone comes five years' worth of data on the tech inside them. So what makes the smartphone camera of today so capable? DxOMark recently shared five technologies that have driven the dramatic improvement in smartphone image quality over the last five years.

Better processors

An image sensor is nothing without the processor connected to it. This is the mini computer that turns the signal from the sensor into actual recorded data. Processors can do all sorts of things, but in general, the faster they are, the less noise (visual distortion) they will add to the image. The difference between the iPhone 5s and iPhone 6, hardware-wise, was only a change in the image signal processor — the sensor remained exactly the same — but that was enough for the iPhone 6 to capture images with less noise.

Noise is most apparent in low-light settings, but less noise also means more detail, particularly once digital noise reduction comes into play. Noise reduction is another job for the processor, but blurring away noise has the unfortunate side effect of blurring away detail as well. If a phone camera produces less noise to begin with, noise reduction can be dialed back, leaving more detail intact. Not everyone agrees on where that balance should sit: DxO says the Google Pixel 2 errs on the side of more detail with more grain, for example, while the Samsung Galaxy Note 8 favors less grain but loses more detail in the process.
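
For the technically curious, here is a minimal sketch of that trade-off using OpenCV's non-local means denoiser. The file name and strength values are arbitrary stand-ins for illustration, not what any phone actually uses.

```python
import cv2

# Hypothetical noisy low-light photo.
img = cv2.imread("low_light_photo.jpg")

# A gentle pass removes some grain while keeping fine texture.
gentle = cv2.fastNlMeansDenoisingColored(img, None, 3, 3)

# A heavy pass scrubs away nearly all the noise, but it smears fine
# detail too: the same balance a phone's image processor must strike.
heavy = cv2.fastNlMeansDenoisingColored(img, None, 15, 15)

cv2.imwrite("gentle_denoise.jpg", gentle)
cv2.imwrite("heavy_denoise.jpg", heavy)
```

Run both passes on the same low-light shot and the trade-off is obvious: the heavy version looks cleaner at a glance, but textures like hair and fabric turn to mush.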

Multi-shot HDR images

Phone cameras simply can’t fit the large sensors that DSLRs and mirrorless cameras use. Instead, they have to rely on software tricks to produce higher-quality images.

High dynamic range (HDR) imaging is a prime example of this. HDR requires multiple images to be shot at different exposure values and combined into one. For example, a camera may take three photos — one exposed properly for the shadows, one for the midtones, and one for the highlights — and then merge them into one photo that now holds detail across a wider range from dark to light. A process once limited to heavy-hitting desktop image-editing programs, many smartphone cameras today can now create HDR images automatically in the blink of an eye.
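
As a rough illustration, here is a minimal sketch of one common merging approach, Mertens exposure fusion, using OpenCV. The three bracketed file names are hypothetical.

```python
import cv2
import numpy as np

# Three hypothetical shots of the same scene at different exposures.
shots = [cv2.imread(name) for name in
         ("under_exposed.jpg", "normal.jpg", "over_exposed.jpg")]

# Mertens fusion weights each pixel by contrast, saturation, and how
# well-exposed it is, then blends the stack into a single image that
# keeps both shadow and highlight detail.
fused = cv2.createMergeMertens().process(shots)

# The result is floating point in roughly [0, 1]; scale to 8-bit to save.
cv2.imwrite("hdr_fused.jpg",
            np.clip(fused * 255, 0, 255).astype(np.uint8))
```

Phones do something far more sophisticated in practice (aligning handheld frames, rejecting ghosts from moving subjects), but the core idea of merging a bracketed stack is the same.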

While HDR has been around in smartphones since 2010, DxOMark says the technology has accelerated over the last five years, leading to dramatic improvements. Facial detection is another feature that helps with exposure, as the camera now knows which part of the image to expose for. This feature was responsible for a big perceived jump in quality from the iPhone 5s to newer models.
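
To see how face detection can guide exposure, here is a minimal sketch using OpenCV's bundled Haar cascade; the preview frame file name is hypothetical, and real phone pipelines use far more advanced detectors.

```python
import cv2

# OpenCV ships Haar cascade files; cv2.data points at their install path.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("preview_frame.jpg")  # hypothetical preview frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
if len(faces) > 0:
    x, y, w, h = faces[0]
    # Meter on the face: its mean brightness tells the camera whether
    # to raise or lower exposure for the part of the frame that matters.
    face_brightness = gray[y:y + h, x:x + w].mean()
    print(f"Face region mean brightness: {face_brightness:.0f} / 255")
```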

Improved stabilization

Stabilization in a smartphone isn't exactly new, but it has changed drastically over the last five years by integrating the smartphone's gyroscope data into the feature. With that information, the stabilization algorithms require less processing and guesswork than visual motion analysis alone. Another advancement, DxOMark says, uses an extra second of video as a buffer to anticipate the type of motion that will come next.
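
A minimal sketch of the gyroscope idea, with made-up numbers for the focal length and frame rate, might look like this:

```python
import numpy as np

FOCAL_PX = 1500.0    # lens focal length in pixels (assumed value)
FRAME_DT = 1 / 30.0  # one frame of 30-fps video

def compensating_shift(gyro_rad_s: np.ndarray) -> np.ndarray:
    """Turn gyro angular velocity (pitch, yaw) into a crop shift.

    For small angles, rotating the phone by theta radians slides the
    image by roughly FOCAL_PX * theta pixels, so the stabilizer crops
    each frame by the opposite amount; no motion analysis is needed.
    """
    angle = gyro_rad_s * FRAME_DT   # radians of rotation this frame
    return -FOCAL_PX * angle        # pixels to shift the crop window

# Example: a slight hand tremor of 0.02 rad/s of yaw.
print(compensating_shift(np.array([0.0, 0.02])))  # about [0, -1] pixels
```

Because the shift comes straight from the sensor, the algorithm doesn't have to guess motion by comparing frames, which is exactly the processing savings DxOMark describes.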

More recent phones also employ optical image stabilization, in which the lens or the sensor actually moves counter to the movement of the phone. This helps reduce shake from holding the phone, resulting in smoother video and sharper stills, particularly in low light where slow shutter speeds can otherwise lead to blur.

Faster autofocus

When DxOMark first started testing smartphones, the iPhone 5s wouldn’t adjust focus at all after a video started. Now, thanks to on-chip phase-detection autofocus — a more advanced focusing method that works without hunting back and forth — phone cameras can keep up with moving subjects much better and focus continuously.

The Samsung Galaxy S7 uses what's known as a dual-pixel autofocus system, a form of phase detection that performs better in low light. (Most phones revert to the older contrast-detection autofocus when there isn't sufficient light for phase detection.)
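
That older contrast-detection method is simple enough to sketch in a few lines: step the lens through positions and keep the one where the image measures sharpest. The `capture_at` function here is a hypothetical stand-in for moving the lens and grabbing a frame.

```python
import cv2
import numpy as np

def sharpness(gray: np.ndarray) -> float:
    # Variance of the Laplacian is a common focus metric: a sharp
    # image has strong edges, so its second derivative varies widely.
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def contrast_autofocus(capture_at, lens_positions):
    # This stepping back and forth is the "hunting" that phase
    # detection avoids: score every position, return the sharpest.
    scores = {pos: sharpness(capture_at(pos)) for pos in lens_positions}
    return max(scores, key=scores.get)

# Hypothetical usage with a camera driver:
# best_position = contrast_autofocus(camera.capture_at, range(0, 100, 5))
```

Phase detection skips this search entirely by reading the focus error directly from specialized sensor pixels, which is why it can track moving subjects without the visible back-and-forth.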

Google tried something different again in the first Pixel smartphone. That phone shines a beam of light on the subject and measures how long the light takes to return. That round trip tells the camera how far away the subject is, and focus is set accordingly. However, a common complaint about this time-of-flight autofocus is that it struggles in bright light, so Google added phase detection as a second autofocus system in the Pixel 2.
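
The math behind time-of-flight focusing is straightforward: light travels out and back, so the subject sits at half the round-trip distance. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance_m(round_trip_seconds: float) -> float:
    # The beam covers the camera-to-subject distance twice,
    # so divide the round trip by two.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A return time of about 6.7 nanoseconds puts the subject
# roughly one meter away.
print(tof_distance_m(6.7e-9))  # ~1.0
```

The catch, as noted above, is that bright ambient light can drown out the reflected beam, which is why a second autofocus system makes a sensible backup.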

Dual lenses and computational photography

Many phones in recent years use not one, but two cameras, that is, two separate lens-and-sensor pairs placed side by side. Because the two lenses are offset, software can compare their views to estimate distance and fake an effect known as shallow depth of field, in which the background is blurred while the subject stays tack-sharp. Early attempts at this were decent, but DxOMark says current-generation cameras do even better because they produce more accurate depth maps, reducing the number of errors.
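
A rough sketch of the underlying geometry, using OpenCV's basic block matcher and made-up camera numbers, shows how two offset views become a depth map:

```python
import cv2
import numpy as np

# Hypothetical grayscale frames from the two side-by-side cameras.
left = cv2.imread("left_lens.jpg", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_lens.jpg", cv2.IMREAD_GRAYSCALE)

# Find how far each point shifts between the two views (disparity).
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

FOCAL_PX = 1500.0   # focal length in pixels (assumed)
BASELINE_M = 0.012  # 12 mm between the two lenses (assumed)

# Nearby subjects shift more than distant ones, so
# depth = focal length * baseline / disparity.
with np.errstate(divide="ignore"):
    depth_m = FOCAL_PX * BASELINE_M / disparity
```

With a depth map like `depth_m` in hand, software can blur only the pixels it judges to be behind the subject, which is exactly the fake shallow depth of field described above.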

While the pace of mobile imaging advancement is impressive, DxO says manufacturers are far from done improving the cameras in their phones. As phones grow faster and more capable, computational photography will likely improve, making phone cameras more powerful and giving users more control. We're not there yet, but maybe one day, a smartphone really will be able to replace your DSLR or mirrorless camera.
