
New A.I. system could upgrade smartphone cameras

Your smartphone camera might soon be getting a big upgrade thanks to the power of artificial intelligence.

A startup is using new technology to pack the power of a DSLR into phones. Glass Imaging wants to boost smartphone camera quality using deep neural networks and a new kind of sensor.


“By rethinking the optical system from the ground up to be tailor-made for smartphones, we managed to fit huge CMOS sensors that collect around 9x more light than traditional designs,” Ziv Attar, the company’s CEO, said in a news release. “Our state-of-the-art A.I. algorithms seamlessly correct all the distortions and aberrations, and as a result, smartphone image quality is radically increased, up 10x.”

Smarter lenses

A hand holds the DISH Celero 5G smartphone, with a close-up of the camera module.

Attar explained in an interview with TechCrunch that until recently, smartphone companies tried to improve image quality by using larger sensors and wider lenses. However, even with noise-reduction algorithms, images produced this way end up looking “weird and fake.”

To solve the image quality problem, Glass plans to put a larger lens inside a smartphone, but today’s ultraslim phones don’t have enough room for the bigger optics. So Glass instead intends to change the aspect ratio of the smartphone sensor, leveraging the concepts behind anamorphic lenses. Anamorphic shooting is a filmmaking technique for capturing widescreen images on standard 35mm film or sensors: the lens optically squeezes a wider field of view onto the sensor, and the footage is then de-squeezed in postproduction to restore the wider aspect ratio.
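To make the squeeze-and-de-squeeze idea concrete, here is a minimal sketch of the postproduction step, not Glass’s actual pipeline: it simply stretches a frame horizontally by an assumed squeeze factor (1.33x is used purely for illustration; real anamorphic lenses range from roughly 1.33x to 2x), using the Pillow imaging library and hypothetical file names.

```python
from PIL import Image

SQUEEZE_FACTOR = 1.33  # assumed squeeze factor for illustration; varies by lens

def desqueeze(path_in: str, path_out: str, factor: float = SQUEEZE_FACTOR) -> None:
    """Undo an anamorphic squeeze by stretching the frame horizontally."""
    frame = Image.open(path_in)
    width, height = frame.size
    # The lens compressed a wide field of view onto the sensor, so restoring
    # the intended aspect ratio means scaling the width back up by the factor.
    restored = frame.resize((round(width * factor), height), resample=Image.LANCZOS)
    restored.save(path_out)

if __name__ == "__main__":
    # Hypothetical input/output paths.
    desqueeze("anamorphic_frame.png", "desqueezed_frame.png")
```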

Mario Pérez, a professional photographer, said in an interview that anamorphic lenses had been used for years mainly for cinematic productions. But today, photography and videography enthusiasts have access to a wide array of anamorphic lenses.

“The main benefit these lenses bring is the ability to fit a wider angle of view than regular lenses, all within a small, average camera sensor, without any visible distortion (provided the video is duly processed later on),” he added.

Pérez said that it’s become relatively easy to get a phone with a regular lens and attach a third-party anamorphic lens to it, which provides many of the typical features a cinematic anamorphic lens offers: a wider angle of view, an intense “bokeh” effect, and flares on light sources, among other benefits.

“Smartphone industry is evolving at an incredibly rapid pace,” Pérez said. “Videographers and photographers are shifting towards working with smartphone cameras as these grow in specs and performance. I wouldn’t be surprised to see smartphone brands equipping smartphone cameras with anamorphic lenses at some point soon, very much in accordance with how Apple added Cinematic Mode (Focus Shift) to iPhone 13 Pro, for instance.”

When it comes to smartphone camera innovations, Pérez said that variable focal length is another bold feature that will make a huge difference when it becomes a reality. Currently, the only way smartphone cameras can offer different focal lengths is by combining two or more lenses, each with a different fixed focal length. The only exceptions are the Sony Xperia 1 III and Xperia 5 III, both of which have moving lenses in the periscope telephoto module that provide different focal lengths. But it’s certainly a rarity.

“The day a smartphone brand will offer us, photographers and videographers, the ability to shoot with our smartphone camera at different focal lengths, all from one single lens, that day will mark the beginning of a new era in the industry of visual content creation,” Pérez said.

New takes on the camera lens

An image of a city street with depth information displayed as colors.

Glass isn’t the only company trying to use new techniques to make better smartphone cameras. Researchers at Stanford University have created a new approach that allows standard image sensors to see light in three dimensions. These common cameras could soon be used to measure the distance to objects.

Measuring distance between objects with light is now possible only with specialized and expensive lidar – short for “light detection and ranging” – systems. But the Stanford scientists wrote in a recent paper that they came up with a solution that relies on a phenomenon known as acoustic resonance. The team built a simple acoustic modulator using a thin wafer of lithium niobate – a transparent crystal that is highly desirable for its electrical, acoustic, and optical properties – coated with two transparent electrodes.
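As a point of reference for what lidar-style ranging computes, here is a toy sketch of the basic time-of-flight arithmetic, not the Stanford team’s acoustic-modulation method itself: a pulse of light travels to the target and back, so the distance is the round-trip time multiplied by the speed of light, divided by two.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a target given the round-trip travel time of a light pulse."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2

# Example: a pulse that returns after about 66.7 nanoseconds hit a target ~10 m away.
print(f"{distance_from_round_trip(66.7e-9):.2f} m")  # prints roughly 10.00 m
```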

“Existing lidar systems are big and bulky, but someday, if you want lidar capabilities in millions of autonomous drones or in lightweight robotic vehicles, you’re going to want them to be very small, very energy efficient, and offering high performance,” Okan Atalar, the first author on the new paper, said in a news release.
