
Here’s what Google Lens’ Style Match and Smart Text Selection features look like

Like a pair of sneakers someone’s wearing? Or maybe a dress? Quite a few apps and services, such as Amazon’s Firefly and Samsung’s Bixby Vision, let you point your smartphone camera at an object and search for it or for similar styles. Google is following suit with Google Lens, and its feature has the potential to reach far more people.

Google Lens is currently built into Google Assistant on Android phones, as well as Google Photos. It lets you point the smartphone camera at objects to identify them, learn more about landmarks, recognize QR codes, pull contact information from business cards, and more. At its annual Google I/O developer conference, the search giant announced four improvements to Lens, and we got to try them out.

Built into camera apps

Dan Baker/Digital Trends

Google Lens is now built into the camera app on phones from 10 manufacturers: LG, Motorola, Xiaomi, Sony, Nokia, Transsion, TCL, OnePlus, BQ, and Asus. That list doesn’t include Google’s own Pixel 2, but you can still access Lens through Google Assistant on all Android phones.

We got a chance to try it out on the recently announced LG G7 ThinQ, and the new option sits right next to the phone’s Portrait Mode.

Style Match

Dan Baker/Digital Trends

The biggest addition to Lens in this I/O announcement is Style Match. Like Bixby Vision or Amazon’s Firefly, it lets you point the smartphone camera at certain objects to find similar items. We pointed it at a few dresses and shoes and were able to find similar-looking items, if not the exact same ones. Once you find what you’re looking for, you can purchase it directly through Google Shopping, if it’s available there.

It’s relatively quick, and it’s an easy way to find things you can’t quite describe in the Google Search bar.

Smart text selection

Perhaps even more useful is Smart Text Selection. Point Google Lens at text, say from a book or a menu, and it can single out the text from everything around it. You can then tap the text to copy or translate it. When we tried it, Lens managed to grab three full paragraphs of text, though we’d have to do more testing to see how well it picks up handwritten text.

Real time

Google Lens now works in real time, so you don’t need to pause and take a photo for it to understand the subject. Point it at several things and you’ll see colored dots appear on the objects it pulls information from. Google said Lens identifies billions of words, phrases, and things in a split second, all thanks to “state-of-the-art machine learning, on-device intelligence, and cloud TPUs.”

Google said all of these features will roll out toward the end of May.

Julian Chokkattu
Former Digital Trends Contributor
Julian is the mobile and wearables editor at Digital Trends, covering smartphones, fitness trackers, smartwatches, and more…