
5 features that make awful smartphone cameras a thing of the past


Smartphone cameras no longer produce the low-megapixel blurs of the past, but how exactly did they cross the line from simply being the most convenient to actually being good enough to shoot magazine covers? The camera-testing wizards at DxOMark have now been testing smartphone cameras for five years, and with that milestone comes five years' worth of data on the tech inside our smartphone cameras. So what makes the smartphone camera of today so capable? DxOMark recently shared five technologies that have caused smartphone camera capabilities to grow exponentially in the last five years.


Better processors

An image sensor is nothing without the processor connected to it. This is the mini computer that turns the signal from the sensor into actual recorded data. Processors can do all sorts of things, but in general, the faster they are, the less noise (visual distortion) they will add to the image. The difference between the iPhone 5s and iPhone 6, hardware-wise, was only a change in the image signal processor — the sensor remained exactly the same — but that was enough for the iPhone 6 to capture images with less noise.

Noise is most apparent in low-light settings, but less noise also means more detail, particularly when digital noise reduction comes into play. Noise reduction is another thing the processor can do, but blurring away noise has the unfortunate side effect of blurring away detail as well. If a phone camera produces less noise, noise reduction can be dialed back, thus leaving more details intact. Not everyone agrees on whether less noise or more detail is better — for example, DxO says the Google Pixel 2 errs on the side of more detail with more grain, while the Samsung Galaxy Note 8 favors less grain but loses more details in the process.
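To make that trade-off concrete, here is a minimal Python sketch using OpenCV's non-local means denoiser. The input file name and the two strength settings are illustrative, not any phone maker's actual tuning; the point is simply that turning the strength up scrubs more grain but also smears fine detail, which is why a cleaner sensor lets a phone run gentler settings.

import cv2

img = cv2.imread("low_light_shot.jpg")  # hypothetical example frame

# Gentle denoising: some grain survives, but so does fine texture
gentle = cv2.fastNlMeansDenoisingColored(img, None, 3, 3, 7, 21)

# Aggressive denoising: cleaner output, but visibly softer detail
aggressive = cv2.fastNlMeansDenoisingColored(img, None, 15, 15, 7, 21)

cv2.imwrite("gentle.jpg", gentle)
cv2.imwrite("aggressive.jpg", aggressive)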


Multi-shot HDR images

Phone cameras simply can’t fit the large sensors that DSLRs and mirrorless cameras use. Instead, they have to rely on software tricks to produce higher-quality images.

High dynamic range (HDR) imaging is a prime example of this. HDR requires multiple images to be shot at different exposure values and combined into one. For example, a camera may take three photos — one exposed properly for the shadows, one for the midtones, and one for the highlights — and then merge them into one photo that holds detail across a wider range from dark to light. Once a process limited to heavy-hitting desktop image-editing programs, HDR merging now happens automatically on many smartphone cameras in the blink of an eye.
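For readers who want to see the mechanics, the merge step can be approximated in a few lines of Python with OpenCV's exposure-fusion (Mertens) algorithm, which blends bracketed shots without needing exposure metadata. This is a sketch, not any phone's actual pipeline, and the three file names are hypothetical stand-ins for the bracketed frames described above.

import cv2
import numpy as np

# Three bracketed exposures of the same scene
frames = [cv2.imread(name) for name in ("under.jpg", "mid.jpg", "over.jpg")]

merge = cv2.createMergeMertens()
fused = merge.process(frames)  # float32 output, roughly in [0, 1]

# Convert back to 8-bit for saving
out = np.clip(fused * 255, 0, 255).astype("uint8")
cv2.imwrite("hdr_fused.jpg", out)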

While HDR has been around in smartphones since 2010, DxOMark says the technology has accelerated over the last five years, leading to dramatic improvements. Facial detection is another feature that helps with exposure, as the camera now knows which part of the image to expose for. This feature was responsible for a big perceived jump in quality from the iPhone 5s to newer models.

Improved stabilization

Stabilization in a smartphone isn’t exactly new — but it has changed drastically over the last five years as manufacturers began feeding the phone’s gyroscope data into the feature. With that information, the stabilization algorithms require less processing and guesswork than visual motion analysis alone. Another advancement, DxOMark says, buffers an extra second of video so the system can anticipate the type of motion that will come next.
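A rough sketch of the idea in Python, under heavily simplified assumptions: a single rotation axis, a fixed pixels-per-radian conversion, and a one-second buffer of future frames that lets the smoother see motion before compensating for it. Every name and constant here is illustrative, not any manufacturer's implementation.

import numpy as np

FPS = 30
PIXELS_PER_RADIAN = 1500.0  # assumed lens/sensor geometry
LOOKAHEAD = FPS             # one buffered second of video

def stabilization_offsets(gyro_rates):
    # gyro_rates: per-frame angular velocity (rad/s) from the gyroscope
    angles = np.cumsum(gyro_rates) / FPS  # integrate into a camera path
    # Smooth the path using buffered future samples; the difference
    # between raw and smoothed paths is the hand shake to cancel out
    kernel = np.ones(LOOKAHEAD) / LOOKAHEAD
    smoothed = np.convolve(angles, kernel, mode="same")
    shake = angles - smoothed
    return -shake * PIXELS_PER_RADIAN  # per-frame crop offset in pixels

Because the shake estimate comes straight from the gyroscope rather than from analyzing the frames themselves, it stays reliable even when the image is dark or motion-blurred.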

More recent phones also employ optical image stabilization, in which the lens or the sensor actually moves counter to the movement of the phone. This helps reduce shake from holding the phone, resulting in smoother video and sharper stills, particularly in low light where slow shutter speeds can otherwise lead to blur.

Faster autofocus

When DxOMark first started testing smartphones, the iPhone 5s wouldn’t adjust focus at all after a video started. Now, thanks to on-chip phase-detection autofocus — a more advanced focusing method that works without hunting back and forth — phone cameras can keep up with moving subjects much better and focus continuously.

The Samsung Galaxy S7 uses what’s known as a dual-pixel autofocus system, which is a form of phase detection that’s better suited to low light. (Most phones revert to the older contrast-detection autofocus when there isn’t sufficient light for phase detection.)
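To illustrate the difference, here is a minimal sketch of that older contrast-detection fallback: step the lens through focus positions, score sharpness at each one, and stop once sharpness starts to fall. The capture_at function stands in for a hypothetical lens-driver API, and the Laplacian-variance score is one common focus metric; phase detection avoids this stepping entirely by measuring focus error directly.

import cv2

def sharpness(gray_frame):
    # Variance of the Laplacian: higher means more edge contrast
    return cv2.Laplacian(gray_frame, cv2.CV_64F).var()

def contrast_detect_af(capture_at, positions):
    best_pos, best_score = positions[0], -1.0
    for pos in positions:
        score = sharpness(capture_at(pos))  # capture_at is hypothetical
        if score < best_score:
            break  # past the peak: this is the "hunting" users see
        best_pos, best_score = pos, score
    return best_pos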

Google tried something more novel in the first Pixel smartphone. That phone shines a beam of light on the subject and measures how long it takes for the light to return. This tells the camera how far away the subject is, and autofocus is set accordingly. However, a common complaint about this time-of-flight autofocus is that it doesn’t work well in bright light, so Google added phase detection as a second autofocus system in the Pixel 2.
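The underlying arithmetic is simple enough to show in a couple of lines: the round-trip time of the emitted light gives the subject distance, which then maps to a lens focus position. The ten-nanosecond figure below is purely illustrative.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(seconds):
    # The light travels out and back, so halve the round-trip distance
    return SPEED_OF_LIGHT * seconds / 2

print(distance_from_round_trip(10e-9))  # a 10 ns round trip = ~1.5 meters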


Dual lenses and computational photography

Many phones in recent years use not one, but two cameras — that is, two different lens and sensor pairs placed side by side. Using data from the offset lenses allows software to fake an effect known as shallow depth of field, whereby the background is blurred while the subject stays tack-sharp. While early attempts at this were decent, DxOMark says current-generation cameras do even better because they produce more accurate depth maps, thus reducing the number of errors.
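A toy version of that pipeline can be sketched in Python with OpenCV: estimate a disparity (depth) map from the two offset views with block matching, then composite a blurred background behind the sharp foreground. Real phone pipelines use far more sophisticated depth estimation, and the file names, matcher settings, and disparity threshold below are all illustrative.

import cv2
import numpy as np

left = cv2.imread("left_lens.jpg", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_lens.jpg", cv2.IMREAD_GRAYSCALE)
color = cv2.imread("left_lens.jpg")

# Block matching across the two views yields a rough depth map
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Near subjects shift more between the lenses; treat the rest as background
foreground = (disparity > 20).astype(np.float32)[..., None]

blurred = cv2.GaussianBlur(color, (31, 31), 0)
bokeh = (color * foreground + blurred * (1 - foreground)).astype("uint8")
cv2.imwrite("portrait.jpg", bokeh)

The errors DxOMark mentions show up exactly at the mask boundary: wherever the depth map is wrong, sharp pixels bleed into the blur or vice versa, which is why better depth maps translate directly into more convincing portraits.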

While the pace of mobile imaging advancement is impressive, DxO says manufacturers are far from done adding better cameras to their phones. As phones grow faster and more capable, computational photography will likely improve, making phone cameras more powerful and giving users more control. We’re not there yet, but maybe one day, a smartphone really will be able to replace your DSLR or mirrorless camera.
