As AR heads to Google search, Lens learns to translate, add tips, and more

Computer vision puts the camera to use when you're at a loss for words, and Google Lens will soon do more than reverse-search for similar items or surface details about what's in a photo. During I/O on Tuesday, May 7, Google demonstrated new search capabilities powered by the camera, along with expanded Lens skills for calculating tips, translating text, and more.

During the keynote, Aparna Chennapragada, Google's vice president of camera and augmented reality products, demonstrated how Google search results can use AR to bring 3D models into the room with you, without leaving the results page. A new "View in 3D" button pops up in the search results whenever 3D content is available.

Besides letting users examine the 3D object from every angle, the update also brings that item into AR, mixing the model with your camera feed so you can see the object in front of you. Chennapragada says the tool will be helpful for tasks such as research as well as shopping.

The camera feature for search is expected to arrive later in May. Partners like NASA, New Balance, Samsung, Target, Visible Body, Volvo, Wayfair, and others will be among the first to have their 3D content pop up in the search results.

As search becomes more camera-heavy, Google Lens is moving beyond simply searching with a camera. At a restaurant, Lens will soon be able to scan the menu, highlight the most popular dishes, and bring up photos and reviews from other diners via Google Maps. The camera first differentiates between the menu items, then matches that text with relevant results online. At the end of the meal, Lens can calculate the tip or split the bill with friends when you point the camera at the receipt.

Google Lens is also gaining the ability to translate text and read it aloud. While earlier versions could use Smart Text to highlight text to copy or translate, Lens will soon be able to read text out loud or overlay the translated text on the original image in more than 100 languages. Alternatively, Lens can use text-to-speech in the original language, a feature that could help people with vision or reading difficulties.

The text-to-speech feature is launching first inside Google Go, a lightweight app designed for new smartphone users. Chennapragada says the team managed to fit those languages into just over 100KB of space, allowing the app to run on budget phones.

“Seeing is often understanding,” Chennapragada said. “With computer vision and AR, the camera is turning into a powerful visual tool to understand the world around you.”

Lens will also gain a handful of new features through partnerships. Readers of Bon Appetit, for example, can scan a recipe page to see a video of the dish being made. In June, Lens will uncover hidden details about paintings at San Francisco's de Young Museum.

The updates join a growing list of features for Google Lens like the ability to look up the artist behind a piece of artwork, shop for similar styles, or find the name of that flower you spotted. Google Lens, which has now been used more than a billion times, is available inside Google Assistant, Photos, and directly in the native camera app on a number of Android devices.

Hillary K. Grigonis