
Here’s what Google Lens’ Style Match, Smart Text Selection features look like

Like a pair of sneakers someone’s wearing? Or maybe a dress? Quite a few apps and services, such as Amazon’s Firefly or Samsung’s Bixby Vision, let you point your smartphone camera at an object and search for it or for similar styles. Google is following suit with a comparable feature in Google Lens, and it has the potential to reach far more people.

Google Lens is currently built into the Google Assistant on Android phones, as well as Google Photos. It lets you point the smartphone camera at objects to identify them, learn more about landmarks, recognize QR codes, pull contact information from business cards, and more. At its annual Google I/O developer conference, the search giant announced four new improvements to Lens, and we got to try them out.

Built into camera apps


Google Lens is now built into the camera app on phones from 10 manufacturers: LG, Motorola, Xiaomi, Sony, Nokia, Transsion, TCL, OnePlus, BQ, and Asus. That’s not counting Google’s own Pixel 2. You can still access Lens through Google Assistant on all Android phones.

We got a chance to try it out on the recently announced LG G7 ThinQ, and the new option sits right next to the phone’s Portrait Mode.

Style Match


The biggest addition to Lens in this I/O announcement is Style Match. As with Bixby Vision or Amazon’s Firefly, you can point the smartphone camera at certain objects to find similar items. We pointed it at a few dresses and shoes and were able to find similar-looking items, if not the exact same ones. Once you find what you’re looking for, you can purchase it directly through Google Shopping, if it’s available there.

It’s relatively quick, and it’s an easy way to find things you can’t quite describe in the Google Search bar.
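Google hasn’t detailed how Style Match ranks its results. Visual search features like this are commonly built by converting each image into a feature embedding and ranking catalog items by cosine similarity, so purely as an illustrative sketch of that general idea (the embeddings, catalog, and function names below are hypothetical, not Google’s implementation), the matching step might look something like this in Kotlin:

```kotlin
import kotlin.math.sqrt

// Compare two image embeddings (e.g. vectors produced by a CNN) by cosine similarity.
fun cosineSimilarity(a: FloatArray, b: FloatArray): Float {
    require(a.size == b.size) { "Embeddings must have the same dimension" }
    var dot = 0f
    var normA = 0f
    var normB = 0f
    for (i in a.indices) {
        dot += a[i] * b[i]
        normA += a[i] * a[i]
        normB += b[i] * b[i]
    }
    return dot / (sqrt(normA) * sqrt(normB))
}

// Rank a hypothetical catalog of product embeddings against the query image's embedding
// and return the IDs of the closest matches.
fun rankByStyle(query: FloatArray, catalog: Map<String, FloatArray>, topK: Int = 5): List<String> =
    catalog.entries
        .sortedByDescending { cosineSimilarity(query, it.value) }
        .take(topK)
        .map { it.key }
```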

Smart Text Selection

Perhaps even more useful is Smart Text Selection. Point Google Lens at text, say from a book or a menu, and it can single out the text from everything else. You can then tap the text to copy or translate it. When we tried it, Lens managed to grab three full paragraphs of text, though we’d have to do more testing to see how well it picks up handwritten text.
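Lens’ own pipeline isn’t public, but Google offers a comparable on-device text recognizer to developers through its ML Kit library. As a rough sketch of the OCR step a feature like Smart Text Selection depends on (assuming an Android project with the ML Kit text-recognition dependency; this is not Lens’ actual code), the call might look like this:

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Recognize text in a captured frame and log each block so it could be
// copied or handed to a translation step. Illustrative only.
fun extractText(bitmap: Bitmap) {
    val image = InputImage.fromBitmap(bitmap, /* rotationDegrees = */ 0)
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)

    recognizer.process(image)
        .addOnSuccessListener { result ->
            // Each block roughly corresponds to a paragraph or column of text.
            result.textBlocks.forEach { block -> Log.d("TextDemo", block.text) }
        }
        .addOnFailureListener { e -> Log.e("TextDemo", "Recognition failed", e) }
}
```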

Real time

Google Lens now works in real time, so you don’t need to pause and take a photo for it to understand the subject. That means you can point it at several things at once and watch colored dots appear on the objects it’s pulling information about. Google said Lens can identify billions of words, phrases, and things in a split second, thanks to “state-of-the-art machine learning, on-device intelligence, and cloud TPUs.”
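For a sense of what live, multi-object recognition involves, Google’s ML Kit object-detection API exposes a similar streaming mode to developers. The sketch below is only an analogy to what Lens does internally, with hypothetical function names and logging standing in for the colored-dot overlay:

```kotlin
import android.graphics.Bitmap
import android.util.Log
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.objects.ObjectDetection
import com.google.mlkit.vision.objects.defaults.ObjectDetectorOptions

// STREAM_MODE keeps latency low and assigns tracking IDs across frames,
// which is what lets an app keep a marker pinned to the same object as the
// camera moves. Illustrative only; not Lens' internal pipeline.
private val detector = ObjectDetection.getClient(
    ObjectDetectorOptions.Builder()
        .setDetectorMode(ObjectDetectorOptions.STREAM_MODE)
        .enableMultipleObjects()
        .enableClassification()
        .build()
)

// Call this for every camera frame; an overlay could draw a dot at each box center.
fun detectObjects(frame: Bitmap, rotationDegrees: Int) {
    val image = InputImage.fromBitmap(frame, rotationDegrees)
    detector.process(image)
        .addOnSuccessListener { objects ->
            objects.forEach { obj ->
                val box = obj.boundingBox
                Log.d("LiveDemo", "object ${obj.trackingId} at ${box.exactCenterX()}, ${box.exactCenterY()}")
            }
        }
        .addOnFailureListener { e -> Log.e("LiveDemo", "Detection failed", e) }
}
```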

Google said it will be rolling out all of these features toward the end of May.
