
Google Assistant gains eyes with Google Lens, now rolling out to Pixel phones

Google Assistant is getting smarter. While the digital assistant has traditionally only used the microphone to hear, now it’ll also use the phone’s camera to see. That’s thanks to Google Lens, which, after some testing, is now rolling out to all users of Google Pixel phones.

Google announced the news in a blog post, and while the rollout was expected, it's exciting nonetheless. Google Lens promises to apply Google's machine learning expertise to whatever the phone can see through its camera. Lens was first announced at Google I/O in May.


“Looking at a landmark and not sure what it is? Interested in learning more about a movie as you stroll by the poster? With Google Lens and your Google Assistant, you now have a helpful sidekick to tell you more about what’s around you, right on your Pixel,” said Google in its blog post.

That will manifest in a number of different ways. Previously, Google Lens was available through Google Photos, but it required users to take a photo, then switch apps and tap the Lens button. Lens in Google Assistant promises to be not only more intuitive, but also smarter. According to Google, the feature will let users do things like save information from a photo of a business card, follow links, and recognize objects. You can also point Lens at a movie poster for information about the movie, or at a landmark like the Eiffel Tower to learn more about it and its history. Last but not least, Assistant can look up products through barcodes.


Of course, we’ll have to wait and see how it all works once it’s rolled out, but the good thing about Google Lens is that it doesn’t really rely on a great camera — it’s more dependent on software, so it can be updated and improved over time.

Google Lens is currently rolling out to Pixel phones in the U.S., U.K., Australia, Canada, India, and Singapore. Google says it will roll out “over coming weeks.” When it is finally available on your phone, you’ll see the Google Lens logo at the bottom right-hand corner of your screen after you activate Google Assistant.

Christian de Looper
Christian de Looper is a long-time freelance writer who has covered every facet of the consumer tech and electric vehicle…
Waze nixes Google Assistant on iPhones, but something better may be coming

If you're still using Waze as your favorite GPS app, you'll soon notice the loss of a certain feature: Waze is ending support for Google Assistant in the iOS version of the app.

A representative for the company took to the Waze public forum yesterday to announce that Google Assistant is being phased out. They said the iOS version of Waze has suffered numerous issues with Google's voice assistant feature, which the team tried to address for "over a year" to no avail, and as a result, it won't be able to patch it.

Android 16 brings a blind fingerprint unlock perk to Pixel phones

Google is currently moving full steam ahead with the development of Android 16. Following the release of a third beta update just over a week ago, Android 16 has reached the platform stability milestone. Though the latest test build is light on feature updates, it brings a cool new trick.
On Pixel smartphones, users can now unlock their phone even if the screen is completely dark. First spotted by the folks over at Android Authority, the new “Screen-off Fingerprint Unlock” feature has been integrated within the phone’s Security & Privacy dashboard.
Until now, Pixel users had to wake the screen first, either with a tap gesture or by pressing the power button, and then place their thumb atop the fingerprint sensor icon on the Lock Screen.
Thanks to the new screen-off unlock convenience, users can simply place their thumb atop the in-display fingerprint sensor and get past the Lock Screen, without the intermediate hassle of lighting up the screen.

I was able to enable this feature after installing the Android 16 Beta 3.1 build, which runs atop the March 2025 security update, on a Pixel 8 smartphone. The new feature is a thoughtful convenience and works flawlessly.
It does, however, take a bit of muscle memory to land your thumb right on the fingerprint sensor on an otherwise dark screen. Also worth noting is that Google won't be the first smartphone maker to offer this convenience.
I tried unlocking my OnePlus 13 and Samsung Galaxy S25 without waking up the screen, and it works just fine. Both devices are currently running Android 15, and notably, offer a faster fingerprint unlock experience compared to the Pixel 8, irrespective of whether the screen is on or off.
I'd like to point out that the screen-off fingerprint unlock system has arrived with a beta build, and Google might remove or delay it when the stable Android 16 update starts rolling out widely in the coming months.
For now, the only way to experience it is by enrolling in the Android 16 beta-testing program on a compatible Google Pixel smartphone. I would, however, recommend waiting a few more weeks for the stable update to land on your Pixel and saving yourself the buggy mess of test builds.

Your Google Assistant just lost a bunch of features ahead of the move to Gemini

We've only just learned that Google Assistant is being replaced with Gemini, and now, it turns out that some features are being quietly retired as a result. Some will be available as part of Gemini, but devices that don't yet have access to Google's latest AI companion may not have an immediate replacement. Here's what's going away.

As spotted by 9to5Google, Google Assistant will lose a total of seven features, and this will affect Android, Nest Hub, and Nest speaker users.
