Got a Google Pixel smartphone? The artificially intelligent Google Assistant on board now has eyes, and it can recognize objects and landmarks. The feature is called Google Lens, and it was first introduced back in May at Google I/O, the company’s developer conference. It’s similar to Samsung’s Bixby Vision on the Galaxy S8 — Google Lens visually analyzes what’s in front of you via the camera on your phone.
Google Lens is currently an exclusive Pixel feature, so you can only use it on a Pixel, Pixel XL, Pixel 2, or Pixel 2 XL. Originally, it was only available on photographed objects in the Google Photos app, but it’s now built directly into Google Assistant. That makes it much more useful, because you no longer have to take a picture of an object, open the Photos app, and tap the Lens logo to get information.
So how do you activate it? Open Google Assistant by pressing and holding down the home button. Tap the camera logo on the right, and a viewfinder window will open. Point the camera at the item you’re interested in, and tap on it.
Google Lens combines AI with deep machine learning to provide you with information about many things you interact with in daily life. Instead of simply identifying what an object is, Google Lens can understand the context of the subject. So if you take a picture of a flower, Google Lens will not just identify the flower, but also provide you with other helpful information, like florists in your area.
Once Google Lens identifies an item, you can continue to interact with Assistant to learn more. If you point it at a book, for example, you’ll be presented with options to read a New York Times review, purchase the book on the Google Play Store, or use one of the recommended subject bubbles that appear below the image.
If Google Lens accidentally focuses on the wrong item, you can tap the Lens icon and try again.
Google Lens isn’t perfect. The company admits the technology works best for identifying books, landmarks, movie posters, album art, and the like. Still, we were impressed when it offered up reviews, social media accounts, and business information after we pointed it at the awning of a small store. Point it at a business card and it will let you save the person as a contact, filling in all the details on the card for you.
While Google Lens is still in its infancy, it shows a lot of promise. Its deep-learning capabilities mean we should only expect it to get better in the future. Right now, Google Lens is only available on Pixel and Pixel 2 phones, though Google said it plans to bring the feature to other Android phones in the future.