
This tiny sensor could be the future of smartphone photography

The all-in-one stack on the PxE imaging camera
Nirave Gondhia / Digital Trends

If you’ve been taking portrait photos on your phone for several years, you’ll know that most phones have gotten progressively better. As smartphone camera technology has evolved, so has phone makers’ ability to capture depth information. As a result, we’re able to take portrait photos with better bokeh and finer detail.

However, there’s still a long way to go, especially as each smartphone takes wildly different portrait photos. Last week, during CES 2025, I saw a new holographic technology that holds a lot of promise and could usher in a new era of smartphone portrait photography. Here’s how it works and everything you need to know about it.


How PxE’s holographic technology works

Depth perception on the PxE holographic camera
Nirave Gondhia / Digital Trends

The key issue at the heart of smartphone portrait photography is depth information. When a phone maker can accurately identify depth information, it can feed that data into the image signal processor (ISP) and ensure the camera accurately picks up every strand of hair and other minute details.

PxE’s technology works by treating light as waves rather than the light rays used in classical imaging. This allows it to pick up far more information, especially since the Bayer filter used in traditional imaging has been replaced by the PxE HoloCoder. The result is a much better understanding of depth.

The PxE holographic imaging sensor
Nirave Gondhia / Digital Trends

The actual sensor is extremely small, but to demonstrate its capability, the company built it into a camera to showcase just how accurate its depth information is. The result is depth accuracy to the thousandth decimal place, even at longer distances.

During the demonstration, the company showed off how the sensor picks up objects at different focal lengths and applies different colors to each subject based on how far away it is from the camera. A red outline means it’s closest to the camera, followed by yellow, green, and blue to denote objects further away.
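
To make that concrete, here is a minimal Python sketch of this kind of color-coded depth visualization. The band edges and colors are illustrative assumptions rather than PxE’s actual values: each pixel’s measured depth is bucketed into a band and painted red, yellow, green, or blue from nearest to farthest.

```python
# Illustrative sketch of a color-coded depth overlay; the band edges
# (in meters) and colors are made-up values, not PxE's.
import numpy as np

BANDS = [  # (max depth in meters, RGB color)
    (1.0, (255, 0, 0)),           # red: closest to the camera
    (2.0, (255, 255, 0)),         # yellow
    (4.0, (0, 255, 0)),           # green
    (float("inf"), (0, 0, 255)),  # blue: farthest away
]

def colorize_depth(depth_m: np.ndarray) -> np.ndarray:
    """Turn an HxW depth map (meters) into an HxWx3 color overlay."""
    out = np.zeros((*depth_m.shape, 3), dtype=np.uint8)
    assigned = np.zeros(depth_m.shape, dtype=bool)
    for max_d, color in BANDS:
        mask = (depth_m <= max_d) & ~assigned  # pixels falling in this band
        out[mask] = color
        assigned |= mask
    return out

# Synthetic 4x4 depth map: one column per band
demo = np.array([[0.5, 1.5, 3.0, 6.0]] * 4)
print(colorize_depth(demo)[0])  # first row: red, yellow, green, blue
```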

How accurate is it? The best example is how it picks up micro-movements when someone leans against a wall. Even when we assume we’re standing perfectly still, the camera picks up slight movements that are barely perceptible to the human eye. It’s this sensitivity that gives the technology its vast potential to improve smartphone portrait photography.

How could this improve smartphone photography?

Depth perception on the PxE holographic stack
Nirave Gondhia / Digital Trends

The key benefit of this holographic technology is its ability to pick up depth information in a way that hasn’t been possible before. For example, we’ve all seen portrait photos where strands of hair have varying degrees of focus, but this technology captures enough micro-detail to identify individual strands of hair, even at a distance.

While this has fantastic potential to improve portrait photography, it could also improve every aspect of taking photos with your phone. For example, the real-time information collected by the holographic sensor, and its accuracy to the thousandth decimal place even while moving, give it huge potential to improve photos of moving subjects.

We’ve all taken photos of someone or something that’s moving slightly — or when we’ve been moving — resulting in a slightly out of focus photo. PxE’s technology could help improve this — especially for moving subjects — as it can capture all these details. Applying this directly to the camera sensor ensures that the information is fed into the raw data captured by the ISP. While it’ll rely on phone makers’ ability to use this data, it has huge potential to improve smartphone photography.
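
As a rough illustration of why per-pixel depth matters here, the sketch below (not PxE’s pipeline or any phone’s ISP) applies depth-driven bokeh: pixels near an assumed focus distance stay sharp, while pixels farther from it are blended toward a blurred copy of the image. The focus distance, falloff, and blur strength are made-up values.

```python
# Minimal depth-driven bokeh sketch; requires NumPy and SciPy.
# Focus distance, falloff, and blur sigma are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter

def depth_bokeh(image: np.ndarray, depth_m: np.ndarray,
                focus_m: float = 1.2, falloff_m: float = 1.0) -> np.ndarray:
    """Blend a sharp image with a blurred copy, weighted by distance from focus."""
    blurred = gaussian_filter(image, sigma=(5, 5, 0))  # blur spatially, not across channels
    # Weight is 0 at the focus plane and approaches 1 as depth error grows
    weight = np.clip(np.abs(depth_m - focus_m) / falloff_m, 0.0, 1.0)
    weight = weight[..., None]  # broadcast over the color channels
    return (1 - weight) * image + weight * blurred

# Synthetic example: 64x64 RGB image, subject at 1.2 m, background at 4 m
rng = np.random.default_rng(0)
img = rng.random((64, 64, 3))
depth = np.full((64, 64), 4.0)
depth[16:48, 16:48] = 1.2  # the "subject" region stays sharp
result = depth_bokeh(img, depth)
```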

When could we see this on phones?

The PxE holographic imaging sensor
Nirave Gondhia / Digital Trends

I asked the founders when we can expect to see this on smartphones and what that process looks like. The answer is somewhat nuanced, but the company hopes to bring this to smartphones in the next few years.

Considering that the smartphone development cycle is around two years on average, it makes sense that we’re unlikely to see this for at least that long, and the actual timing could be further away. In particular, the company is aiming to work with Sony and Samsung, which make the majority of camera sensors used in phones, as well as with phone makers and chipset makers like MediaTek and Qualcomm, to ensure the technology is implemented at every stage of the smartphone experience.

Building a smartphone is complex, and just applying this technology to the sensor wouldn’t be enough to fully realize its potential; using this depth information requires collaborating with each stakeholder in the smartphone process. That said, there are also far more applications beyond smartphone photography.

Other applications beyond smartphone photography

The PxE imaging camera all-in-one stack
Nirave Gondhia / Digital Trends

For many years, we’ve heard about the smart cities of the future, where autonomous vehicles talk to the environment around them, and this technology has huge potential to help usher in this era. The key issue so far has been accuracy, and PxE technology could help vastly improve the information gleaned.

Then there’s automotive, and as this video shows, this technology could help usher in much better autonomous driving. Applying this to the camera sensors used in cars could ensure that they pick up accurate depth information about the distance to the car ahead, meaning it’ll know exactly when to start slowing down or when it’s safe to speed up again. It also has the potential to make object and hazard detection far more accurate, which is a key problem seen in most current approaches to autonomous driving.
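
As a toy illustration of that idea (not any vendor’s actual driver-assistance logic), the sketch below estimates a closing speed from two successive depth readings to the vehicle ahead and flags when the estimated time-to-collision drops below an assumed threshold.

```python
# Toy sketch of following-distance logic based on depth readings.
# The 2-second time-to-collision threshold is an illustrative assumption.
def should_slow_down(prev_gap_m: float, gap_m: float, dt_s: float,
                     ttc_threshold_s: float = 2.0) -> bool:
    closing_speed = (prev_gap_m - gap_m) / dt_s  # m/s, positive when closing in
    if closing_speed <= 0:
        return False  # gap is steady or growing, no need to slow down
    time_to_collision_s = gap_m / closing_speed
    return time_to_collision_s < ttc_threshold_s

# Gap shrank from 30 m to 28 m over 0.1 s: closing at 20 m/s, TTC of 1.4 s
print(should_slow_down(prev_gap_m=30.0, gap_m=28.0, dt_s=0.1))  # True
```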

Lastly, imagine this applied to the world of science. The PxE website discusses its potential applications in precision imaging, i.e., scientific instruments such as microscopes. The camera used in the demonstration at CES could easily be applied to a vast array of applications, especially as it’s an all-in-one solution.

Why is this so exciting?

The all-in-one camera featuring the PxE holographic camera stack
Nirave Gondhia / Digital Trends

The answer is fairly obvious: much like smartphone photography has improved from one generation to the next, this technology could usher in the next generation of smartphone photography. An era of holographic information has a wide array of potential applications, and crucially, it could vastly improve smartphone photography.

Natural bokeh, portrait photos, and even a better understanding of object data all mean it has the potential to improve all areas of smartphone photography. It might take a few years, but I can’t wait for this next era in imaging to become a reality.

Nirave Gondhia