The new Google Pixel 3 and Pixel 3 XL smartphones take low-light images, high-resolution photos, and well-timed shots, but these major photo features aren’t realized solely by the cameras packed inside. Instead, Google is tackling tasks typically left to larger cameras with computing power (specifically, machine learning) rather than with lenses and high-resolution sensors.
As with the Pixel 2, Google integrated a special chip designed just for photos, the Pixel Visual Core, along with a dual-pixel sensor that enables dual-lens effects from a single lens. Like the original Pixel phone, the Pixel 3 shoots and merges multiple images without a delay using HDR+. And as with the first two generations, Google isn’t done leveraging artificial intelligence and computational photography to take better photos.
Good-bye, crappy smartphone zoom?
Smartphone cameras offer either a slight optical zoom using two lenses or digital zoom, and digital zoom produces poor results because it simply crops the photo. You just can’t fit a big zoom lens inside a small smartphone. With Super Res Zoom, Google is promising better zoom from a fixed, single lens (on the rear, anyway).
Super Res Zoom reworks an existing idea to solve a new problem: that crappy smartphone zoom. Digital zoom doesn’t work well because the resolution is drastically reduced, but what if the image you started with had a higher resolution?
Super Res Zoom takes a burst of photos. Small movements in your hands mean each photo is captured from a slightly different position. By stitching those slightly offset photos together, the Pixel 3 creates a higher-resolution image, and with a higher-resolution starting point, digital zoom produces results that aren’t so cringe-worthy.
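The merging idea can be shown in toy form: if each frame in the burst samples the scene at a slightly different sub-pixel offset, interleaving the frames fills in a finer grid than any single frame covers. This one-dimensional sketch is our own illustration of the principle, not Google’s actual pipeline, which also has to align the frames and reject subject motion:

```python
import numpy as np

# Toy 1-D illustration of the idea behind Super Res Zoom: each burst frame
# samples the scene at a slightly different sub-pixel offset, so interleaving
# the frames yields a denser sampling grid than one frame provides.

scene = np.arange(8, dtype=float)   # the "true" high-resolution signal

# Hand shake shifts the second low-res frame by half a (low-res) pixel:
frame_a = scene[0::2]               # samples at fine-grid positions 0, 2, 4, 6
frame_b = scene[1::2]               # samples at fine-grid positions 1, 3, 5, 7

# Merge: slot each frame's samples back into the fine grid.
merged = np.empty_like(scene)
merged[0::2] = frame_a
merged[1::2] = frame_b

# The merged result recovers the full resolution neither frame had alone.
assert np.allclose(merged, scene)
```

In the real feature the offsets come from natural hand tremor rather than a known half-pixel shift, which is why the burst must be aligned before merging.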
Perhaps what’s even more intriguing is that the feature doesn’t appear to require a tripod, since it actually needs those small movements in your hands. Panasonic, Olympus, and Pentax cameras have similar modes using pixel shift, but they are designed to create a higher resolution final file, not as an artificial zoom, and tripods are recommended.
A good low-light smartphone?
Speaking of cringe-worthy, Google’s Liza Ma says the Pixel 3’s new low-light mode, called Night Sight, is so good you’ll never use the flash. Like Super Res Zoom, the feature is powered by machine learning. Night Sight doesn’t rely on the usual hardware solutions for better low-light shots, such as a larger sensor or a brighter aperture; instead, machine learning re-colors the photo to create brighter, more vivid colors without the flash.
Google didn’t go into much detail about how that machine learning works, so we’ll have to wait to see just how well the recoloring holds up; the feature isn’t launching until next month via a software update.
Top Shot mixes HDR+ with A.I. that chooses your best photos for you
The Top Shot feature inside the Pixel 3 is essentially burst mode (a fast series of photos), a feature that DSLRs and even smartphones have long had. What Google is doing differently with Top Shot is automatically choosing which moment out of that burst is the best one.
Top Shot takes a fast burst of photos, and the Pixel 3 highlights both the shot matching your actual timing and a recommended photo. Machine learning, Google says, determines which image in that burst is the best option: by being fed a bunch of good photos and bad photos, the software essentially learned that, yes, photos are better with everyone’s eyes open and a smile in the frame. And if you don’t agree with the A.I.’s pick, you can dig through the burst and choose the image yourself.
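In rough terms, that selection step amounts to scoring every frame in the burst with a learned quality model and surfacing the highest scorer. The sketch below is hypothetical: the feature names and weights are invented stand-ins for whatever Google’s trained model actually evaluates:

```python
# Hypothetical sketch of Top Shot's selection step: score each burst frame
# for quality cues (eyes open, smiles, sharpness) and pick the best one.
# These features and weights are invented for illustration, not Google's.

def quality_score(frame):
    # Stand-in for a trained classifier's output.
    return (2.0 * frame["eyes_open_fraction"]
            + 1.5 * frame["smile_confidence"]
            - 1.0 * frame["blur"])

def pick_top_shot(burst):
    # Surface the highest-scoring frame from the burst.
    return max(burst, key=quality_score)

burst = [
    {"id": 0, "eyes_open_fraction": 0.5, "smile_confidence": 0.9, "blur": 0.1},
    {"id": 1, "eyes_open_fraction": 1.0, "smile_confidence": 0.8, "blur": 0.0},
    {"id": 2, "eyes_open_fraction": 1.0, "smile_confidence": 0.2, "blur": 0.3},
]
best = pick_top_shot(burst)   # frame 1: everyone's eyes open, a good smile
```

The real model runs on-device and scores frames as they’re captured; the point of the sketch is only that “best” is a learned ranking, not a rule someone hand-wrote.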
Google says the alternate shots are also captured in HDR+, so that burst mode is essentially taking smaller bursts to layer together for a more detailed image. HDR+ already impressed in earlier Pixel models, but managing both burst shots and multiple layered exposures at once suggests impressive computing power. (And yes, those photos will probably take up a lot of space, but Google is including unlimited Google Photos storage with the Pixel 3.)
The idea of using A.I. to choose your best shots is nothing new; Adobe announced a beta tool for Lightroom to do just that a year ago. But the Pixel 3 mixes that newer concept (automatically flagging your best shots so you don’t have to sift through the bad ones) with old-school burst mode. And it’s all done on one device.
So where does hardware fit in?
While the biggest new features are powered by A.I., the Pixel 3 doesn’t leave the camera hardware unchanged. The front of the phone now houses two cameras: an expected 8-megapixel camera, and a wide-angle lens with a 97-degree field of view so you can actually fit everyone into a group selfie. A Photobooth mode will also trigger the shot hands-free by looking for a smile or a funny face, Google says.
The camera keeps a single lens at the back, yet continues the impressive portrait mode from earlier models using dual-pixel technology instead of dual lenses. That portrait mode is getting a boost, Google says: the Pixel 3 can edit the result after the fact, including sharpening the background instead of the subject.
The camera’s dual pixel autofocus can also now track subjects — a feature that’s been around for some time on advanced cameras but is a nice addition to see integrated into a smartphone. The rear camera also includes optical and electronic image stabilization, a flicker sensor and a bright f/1.8 lens.
Video can be shot at up to 4K at 30 fps, or at 1080p at up to 120 fps.
Some of Google’s claims, such as tracking autofocus, are no big deal for DSLR fans, but pit the Pixel 3’s camera against other smartphones and those A.I. features could give the Pixel 3 an edge. Annie Leibovitz, at least, agrees: she’s entered into a partnership with Google, the first time the photographer has signed an agreement with a brand. She’s not saying anything about leaving her dedicated camera behind, but Leibovitz did use the Pixel 3.
Of course, we will be putting these features to the test when they become available, so stay tuned for our full reviews of both products.